Tuesday, November 16, 2010

[Virtuemart] Customize Order List - adding extra column

In this post, I am going to explain how to add an extra column to the Virtuemart administrator Order List page.

(Note that, in this example, I'm going to add a new address_1 column).

Note: Please refer to my previous post on "adding search keywords" for the SQL query code for the other fields that you want.

Getting Started.

In /administrator/components/com_virtuemart/html/order.order_list.php

locate this section, around line 27. Then add "address_1" into the line, as below.

//$list .= "first_name, last_name FROM #__{vm}_orders, #__{vm}_order_user_info WHERE ";

// This is the modified line.

$list .= "first_name, last_name, address_1 FROM #__{vm}_orders, #__{vm}_order_user_info WHERE ";


After that, locate the following section and note that the "Address_1" column has already been added in this example.

$columns = Array( "#" => "width=\"20\"",

"<input type=\"checkbox\" name=\"toggle\" value=\"\" onclick=\"checkAll(".$checklimit.")\" />" => "width=\"20\"",

$VM_LANG->_('PHPSHOP_ORDER_LIST_ID') => '',

$VM_LANG->_('PHPSHOP_ORDER_PRINT_NAME') => '',

// This is your "address_1" column header.

'Address_1' => '',

$VM_LANG->_('PHPSHOP_ORDER_LIST_PRINT_LABEL') => '',

$VM_LANG->_('PHPSHOP_ORDER_LIST_TRACK') => '',

$VM_LANG->_('PHPSHOP_ORDER_LIST_VOID_LABEL') => '',

$VM_LANG->_('PHPSHOP_CHECK_OUT_THANK_YOU_PRINT_VIEW') => '',

$VM_LANG->_('PHPSHOP_ORDER_LIST_CDATE') => '',

$VM_LANG->_('PHPSHOP_ORDER_LIST_MDATE') => '',

$VM_LANG->_('PHPSHOP_ORDER_LIST_STATUS') => '',

$VM_LANG->_('PHPSHOP_UPDATE') => '',

$VM_LANG->_('PHPSHOP_ORDER_LIST_TOTAL') => '',

$VM_LANG->_('E_REMOVE') => "width=\"5%\""

);

$listObj->writeTableHeader( $columns );


As you can see, those are the column headers for the table, and the last line is the function to print out all the column headers.

Then, you will need to locate this section of code, and add $db->f('address_1'); as below:

$tmp_cell = $db->f('first_name').' '.$db->f('last_name');

if( $perm->check('admin') && defined('_VM_IS_BACKEND')) {

$url = $_SERVER['PHP_SELF']."?page=admin.user_form&amp;user_id=". $db->f("user_id");

$tmp_cell = '<a href="'.$sess->url( $url ).'">'.$tmp_cell.'</a>';

}


$listObj->addCell( $tmp_cell );

// This will print the "address_1" column, which is placed after the "Name" column

$tmp_cell = $db->f('address_1');

$listObj->addCell( $tmp_cell );


Note: The ordering of the print cell calls (i.e. the $listObj->addCell( $tmp_cell ); lines) is important, as you wouldn't want to jumble up the different fields. That's all.

Hope this helps :)

You can find the original post here : http://blog.gbinghan.com/2010/10/virtuemart-customize-order-list-adding.html

How to fix xhtml and css validation errors in Virtuemart

There are essentially two issues causing the xhtml & css not to validate.

How to fix the css validation


The css validation error is due to the use of text-align:top in the “Featured Products” block on the shop landing page. The issue being that “top” is not a valid option for the text-align property. Valid options for text-align are:

  • left

  • center

  • right

  • justify

  • inherit


The correct usage to set the vertical alignment is… yup, you guessed it “vertical-align“.

(Read more about the differences between css vertical-align property and css text-align property)

The file that needs to be updated in Virtuemart is located in:
[site-root] > components > com_virtuemart > themes > default > templates > common > featuredProducts.tpl.php

Line 15, change from this:
<div style="float:left;width:<?php echo $cellwidth ?>%;
text-align:top; padding:0px;" >

To this:
<div style="float:left;width:<?php echo $cellwidth ?>%;
vertical-align:top; padding:0;" >

How to fix the xhtml validation


The second problem causing the xhtml validation to fail was also in the Featured Products file. This time caused by the product title h4, a block level element, being wrapped by a link, which is an inline element.

This can be fixed in two ways:

The first is to reverse the nesting, so the link is inside of the heading.
Line 16, change from this:

<a title="<?php echo $featured["product_name"] ?>"
href="<?php $sess->purl(URL."index.php?option=com_virtuemart&amp;page=shop.product_details&amp;flypage=".$featured["flypage"]."&amp;product_id=".$featured["product_id"]) ?>">

To this:

<h4><a title="<?php echo $featured["product_name"] ?>"
href="<?php $sess->purl(URL."index.php?option=com_virtuemart&amp;page=shop.product_details&amp;flypage=".$featured["flypage"]."&amp;product_id=".$featured["product_id"]) ?>">

The second option is to change the product title from an h4 to an inline element, such as a span. I personally prefer the first method, as using a heading brings more structure to the page.

How to fix the final xhtml validation errors


The last few xhtml validation errors were caused by the use of ampersands (i.e. &) in the store's categories, for example Laptops & Accessories. I tried using the full entity code (i.e. &amp;) but the error prevailed.

The quick and easy fix was to simply use the full word ‘and’ instead. This isn’t the most ideal fix, but it works. A better solution would be to add a simple string replace function to replace all instances of & with the full entity code, like Joomla! does with its menus.

Hope this comes in handy in your next Virtuemart project!



You can find the original post here: http://www.prothemer.com/blog/tips-and-tricks/virtuemart-xhtml-css-validation-errors/

WordPress MU for Joomla! from corePHP


Continuing with our movement to truly collaborate with as many Joomla developers as possible, we were sent a test copy of corePHP's WordPress MU for Joomla! For the unacquainted, this is a full version of WordPress INSIDE Joomla! for those who want the best blog software inside the most versatile CMS.


Installation
I tested on a local MAMP install and the install process was timing out, so I checked the filesize - 2.5mb :) That'll do it. This makes sense, since the component contains WordPress (duh). So I just unpacked in the tmp directory and installed from a directory and bam, the install worked in a snap.

At the end of the install, you're instructed to move a couple of key files for WordPress to work properly. I wish there was a link, but that's just because I'm lazy. I like that it wasn't automated, because I like to know when files are being moved into these particular directories.

I then clicked the link to Start Blogging, and pow, I'm in the WordPress admin. I fully expected this, but it's still crazy at first. As a side note and time for a plug, using our AdminPraise2 Joomla! admin template with the WordPress inspired theme would really provide a seamless admin experience.


First Impressions
I clicked the link to preview my blog, and was taken to my Joomla! site, with a fully functional, fully integrated WordPress blog, and I'm severely impressed. I created a Joomla! menu item, where you can enter the ID of which blog you'd like displayed (since you can have multiple in WordPress MU), and there's my site's blog!

You can quickly add a new blog for a new user, as the Joomla! users are synced with WordPress.


Additional Extensions
corePHP gives you an entire slew of modules including tags, latest, and several others. They also included every native WordPress plugin you could possibly need. Lastly, they've included JomSocial plugins that will give your community the ability to have their own WordPress powered blog, sweet! I didn't get to test the JomSocial plugins, but I'm sure it shows latest blog activity on the users' JomSocial profile, and adds blog activity into the JomSocial activity stream.


Issues
I can't give a perfect review of anything, it's just not genuine (or in my nature). The only conflict I found was that the frontend layout for WordPress MU for Joomla! shares some common IDs used in Joomla! templates, such as "content" and "navigation". These are commonly used in any layout from any CMS, so I'm sure they're included as they would be in any WordPress layout, but to work well inside Joomla! they'd need names like "wp-content" and "wp-navigation". This would take 2 seconds for the end user to fix, so it's certainly not a show-stopper, but I could see a customer getting stumped and possibly confused for a moment.

SuperBlog
With about 5 minutes of CSS love, I was able to get WordPress MU by corePHP working with our SuperBlog template. Not bad if I do say so myself.


Pricing
$79.95 for Single Blog, $99.95 for MU (as tested), $420.95 for 6 Site License


Rating

  • Best blog software inside the best CMS
  • Superb integration (frontend and backend)
  • All the extensions you'll need
  • Price point may be high for some
  • Possible (tiny) template conflicts

Update - New WordPress MU Quick Icons Admin Module!
We enjoyed our testing so much, we've gone ahead and created a new WP Icons admin module, with quicklinks to all the pages in the WordPress admin. Just like the AP Icons in AdminPraise2, it contains all the links with the ability to hide, or display to Manager & above, Administrator & above, or only Super Admins. Look for lots more of these types of modules from us soon!

Here's a couple of screenshots of WP Icons in AdminPraise2 (WordPress theme) and the default admin template:





You can find original post here : http://www.joomlapraise.com/blog/item/41-wordpress-mu-for-joomla-from-corephp

Monday, November 15, 2010

How CSS and JavaScript Are Different

So, what's this important difference?

In CSS, style rules are automatically applied to any element that matches the selectors, no matter when those elements are added to the document (DOM).

In JavaScript, event handlers that are registered for elements in the document apply only to those elements that are part of the DOM at the time the event is attached. If we add similar elements to the DOM at a later time, whether through simple DOM manipulation or ajax, CSS will give those elements the same appearance, but JavaScript will not automatically make them act the same way.


For example, let's say we have "<button>Alert!</button>" in our document, and we want to attach a click handler to it that generates an alert message. In jQuery, we might do so with the following code:
JavaScript:
  1. $(document).ready(function() {
  2. $('button.alert').click(function() {
  3. alert('this is an alert message');
  4. });
  5. });
Here we are registering the click handler for the button with a class of "alert" as soon as the DOM has loaded. So, the button is there, and we have a click function bound to it. If we add a second <button> later on, however, it will know nothing about that click handler. The click event had been dealt with before this second button existed. So, the second button will not generate an alert.

Let's test what we've just discussed. I've added a script with the above jQuery code so that the following button will produce an alert message when clicked. Try it:


Events Don't Work with Added Elements

Now, let's create a new button (if we don't already have a second one) using jQuery code like this:


JavaScript:

  1. $('#create-button').click(function() {
  2. if ( $('button.alert').length <2) {
  3. $('<button>Not another alert').insertAfter(this);
  4. }
  5. return false;
  6. });
create the button

Have you clicked the link to create the second button? Great. Now click that button. It does nothing. Just as expected.
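One quick way around this, which the original series explores more fully in part 2, is to bind the handler again at the moment the new element is created. A minimal, hypothetical sketch (the class name and handler below simply mirror the example above):

// hypothetical variation: bind the same handler when the new button is created
$('#create-button').click(function() {
var $newButton = $('<button class="alert">Not another alert</button>');
$newButton.insertAfter(this);
// re-bind the click handler to the freshly created button
$newButton.click(function() {
alert('this is an alert message');
});
return false;
});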

CSS Continues to "Work" with Newly Created Elements


Now let's take a look at another example. In this one, we have three list items—two plain items and one with a class of special:


HTML:

  1. <ul id="list1" class="eventlist">
  2. <li>plain</li>
  3. <li class="special">special <button>I am special</button></li>
  4. <li>plain</li>
  5. </ul>

Press the "I am special" button to create a new list item with a class of "special":
  • plain
  • special
  • plain


Notice that, like the first special li, the new one has the yellow background. The CSS has come through for us. But press the newly created "I am new" button and, just as with the second alert above, nothing happens. The jQuery code we're using to add the new item says that upon clicking a button inside a list item with a class of "special" (which itself is inside an element with id of "list1") a new list item with class="special" should be inserted after the list item in which the button was clicked:


JavaScript:
  1. $(document).ready(function() {
  2. $('#list1 li.special button').click(function() {
  3. var $newLi = $('<li>special and new <button>I am new</button></li>');
  4. $(this).parent().after($newLi);
  5. });
  6. });

So, how can we get the events to carry over to the new elements? Two common approaches are event delegation and "re-binding" event handlers. In this entry, we'll examine event delegation; in part 2, we'll explore ways to re-bind.

Event Delegation: Getting Events to Embrace New Elements


The general idea of event delegation is to bind the event handler to a containing element and then have an action take place based on which specific element within that containing element is targeted. Let's say we have another unordered list: <ul id="list2"> ... </ul>. Instead of attaching the .click() method to a button — $('#list2 li.special button').click(...) — we can attach it to the entire surrounding <ul>. Through the magic of "bubbling," any click on the button is also a click on the button's surrounding list item, the list as a whole, the containing div, and all the way up to the window object. Since the <ul> that gets clicked is the same one each time (we're only creating items within the <ul>), the same thing will happen when clicking on all of the buttons, regardless of when they were created.

When we use event delegation, we need to pass in the "event" argument. So, in our case, instead of .click(), we'll have .click(event). We don't have to name this argument event. We can call it e or evt or gummy or whatever we want. I just like to use labels that are as obvious as possible because I have a hard time keeping track of things. Here is what we have so far:


JavaScript:

  1. $(document).ready(function() {
  2. $('#list2').click(function(event) {
  3. var $newLi = $('<li>special and new <button>I am new</button></li>');
  4. });
  5. });
So far, the code is very similar to our first attempt, except for the selector we're starting with (#list2) and the addition of the event argument. Now we need to determine whether what is being clicked inside the <ul> is a "special" button or not. If it is, we can add a new <li>. We check the clicked element by using the "target" property of the event argument:


JavaScript:

  1. $(document).ready(function() {
  2. $('#list2').click(function(event) {
  3. var $newLi = $('<li>special and new <button>I am new</button></li>');
  4. var $tgt = $(event.target);
  5. if ($tgt.is('button')) {
  6. $tgt.parent().after($newLi);
  7. }
  8. // next 2 lines show that you've clicked on the ul
  9. var bgc = $(this).css('backgroundColor');
  10. $(this).css({backgroundColor: bgc == '#ffcccc' || bgc == 'rgb(255, 204, 204)' ? '#ccccff' : '#ffcccc'});
  11. });
  12. });
Line 4 above puts the target element in a jQuery wrapper and stores it in the $tgt variable. Line 5 checks whether the click's target is a button. If it is, the new list item is inserted after the parent of the clicked button. Let's try it:

  • plain
  • special
  • plain

It's probably worth noting that jQuery makes working with the event argument cross-browser friendly. If you do this sort of thing with plain JavaScript and DOM nodes, you'd have to do something like this:
var list2 = document.getElementById('list2');
list2.onclick = function(e) {
var e = e || window.event;
var tgt = e.target || e.srcElement;
if (tgt.nodeName.toLowerCase() == 'button') {
// do something
}
};

As you can see, it's a bit of a hassle.
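As an aside, jQuery versions released after this article was written (1.3 added .live(), 1.4.2 added .delegate()) wrap the same delegation idea in a convenience method. A rough sketch of how the list example could look with them:

// .live() binds a delegated handler for all current and future matches
$('#list2 li.special button').live('click', function() {
var $newLi = $('<li>special and new <button>I am new</button></li>');
$(this).parent().after($newLi);
});

// .delegate() does the same, but scopes the delegation to the list itself
$('#list2').delegate('li.special button', 'click', function() {
var $newLi = $('<li>special and new <button>I am new</button></li>');
$(this).parent().after($newLi);
});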

Another Huge Benefit of Event Delegation


Event delegation is also a great way to avoid crippling the user's browser when you're working with a huge document. For example, if you have a table with thousands of cells, and you want something to happen when the user clicks on one, you won't want to attach a click handler to every single one of them (believe me, it can get ugly). Instead, you can attach the click handler to a single table element and use event.target to pinpoint the cell that is being clicked:


JavaScript:



  1. $(document).ready(function() {
  2. $('table').click(function(event) {
  3. var $thisCell, $tgt = $(event.target);
  4. if ($tgt.is('td')) {
  5. $thisCell = $tgt;
  6. } else if ($tgt.parents('td').length) {
  7. $thisCell = $tgt.parents('td:first');
  8. }
  9. // now do something with $thisCell
  10. });
  11. });

Note that I had to account for the possibility of clicking in a child/descendant of a table cell, but this seems a small inconvenience for the great performance increase that event delegation affords.

you can find original post here : http://www.learningjquery.com/2008/03/working-with-events-part-1

Loading external content with Ajax using jQuery and YQL

Ajax with jQuery is very easy to do – like most solutions it is a few lines:
$(document).ready(function(){
$('.ajaxtrigger').click(function(){
$('#target').load('ajaxcontent.html');
});
});

Check out this simple and obtrusive Ajax demo to see what it does.

This will turn all elements with the class of ajaxtrigger into triggers to load “ajaxcontent.html” and display its contents in the element with the ID target.

This is terrible, as most of the time it means that people will use pointless links like <a href="#">click me</a>, but that is not the problem for today. I am working on a larger article with all the goodies about Ajax usability and accessibility.

However, to make this more re-usable we could do the following:
$(document).ready(function(){
$('.ajaxtrigger').click(function(){
$('#target').load($(this).attr('href'));
return false;
});
});

You can then use <a href="ajaxcontent.html">load some content</a> to load the content and you make the whole thing re-usable.

Check out this more reusable Ajax demo to see what it does.

The issue I wanted to find a nice solution for is the one that happens when you click on the second link in the demo: loading external files fails as Ajax doesn’t allow for cross-domain loading of content. This means that <a href="http://icant.co.uk/">see my portfolio</a> will fail to load the Ajax content and fail silently at that. You can click the link until you are blue in the face but nothing happens. A dirty hack to avoid this is just allowing the browser to load the document if somebody really tries to load an external link.

Check out this demo allowing external links to be followed to see what it does.
$(document).ready(function(){
$('.ajaxtrigger').click(function(){
var url = $(this).attr('href');
if(url.match('^http')){
return true;
} else {
$('#target').load(url);
return false;
}
});
});

Proxying with PHP


If you look around the web you will find that the solution in most cases is a PHP proxy script (or one written in any other server-side language). For example, something using cURL could be proxy.php:
<?php
$url = $_GET['url'];
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$output = curl_exec($ch);
curl_close($ch);
echo $output;
?>

People then could use this with a slightly changed script (using a proxy):
$(document).ready(function(){
$('.ajaxtrigger').click(function(){
var url = $(this).attr('href');
if(url.match('^http')){
url = 'proxy.php?url=' + url;
}
$('#target').load(url);
return false;
});
});

It is also a spectacularly stupid idea to have a proxy script like that. The reason is that, without filtering, people can use it to load any document on your server and display it in the page (simply use Firebug to rename the link to point at anything on your server), they can use it to inject a mass-mailer script into your document, or they can simply use it to redirect to any other web resource and make it look like your server was the one that sent it. It is spammer's heaven.

Use a white-listing and filtering proxy!


So if you want to use a proxy, make sure to white-list the allowed URIs. Furthermore it is a good plan to get rid of everything but the body of the other HTML document. Another good idea is to filter out scripts. This prevents display glitches and scripts you don’t want executed on your site to get executed.

Something like this:
<?php
$url = $_GET['url'];
$allowedurls = array(
'http://developer.yahoo.com',
'http://icant.co.uk'
);
if(in_array($url,$allowedurls)){
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$output = curl_exec($ch);
curl_close($ch);
$content = preg_replace('/.*<body[^>]*>/msi','',$output);
$content = preg_replace('/<\/body>.*/msi','',$content);
$content = preg_replace('/<\/?body[^>]*>/msi','',$content);
$content = preg_replace('/[\r\n]+/msi','',$content);
$content = preg_replace('/<!--[\s\S]*?-->/msi','',$content);
$content = preg_replace('/<noscript[^>]*>[\s\S]*?<\/noscript>/msi','',$content);
$content = preg_replace('/<script[^>]*>[\s\S]*?<\/script>/msi','',$content);
$content = preg_replace('/<script.*\/>/msi','',$content);
echo $content;
} else {
echo 'Error: URL not allowed to load here.';
}
?>

Pure JavaScript solution using YQL


But what if you have no server access or you want to stay in JavaScript? Not to worry – it can be done. YQL allows you to load any HTML document and get it back in JSON. As jQuery has a nice interface to load JSON, the two can be used together to achieve what we want.

Getting HTML from YQL is as easy as using:
select * from html where url="http://icant.co.uk"

YQL does a few things extra for us:

  • It loads the HTML document and sanitizes it

  • It runs the HTML document through HTML Tidy to remove things nasty frameworks (.NET, for example) considered markup.

  • It caches the HTML for a while

  • It only returns the body content of the HTML - so no styling (other than inline styles) will get through.


As output formats you can choose XML or JSON. If you define a callback parameter for JSON you get JSON-P with all the HTML as a JavaScript Object – not fun to re-assemble:
foo({
"query":{
"count":"1",
"created":"2010-01-10T07:51:43Z",
"lang":"en-US",
"updated":"2010-01-10T07:51:43Z",
"uri":"http://query.yahoo[...whatever...]k%22",
"results":{
"body":{
"div":{
"id":"doc2",
"div":[{"id":"hd",
"h1":"icant.co.uk - everything Christian Heilmann"
},
{"id":"bd",
"div":[
{"div":[{"h2":"About this and me","[... and so on...]
}}}}}}}});

When you define a callback with the XML output you get a function call with the HTML data as string in an Array – much easier:
foo({
"query":{
"count":"1",
"created":"2010-01-10T07:47:40Z",
"lang":"en-US",
"updated":"2010-01-10T07:47:40Z",
"uri":"http://query.y[...who cares...]%22"},
"results":[
"<body>\n <div id=\"doc2\">\n <div id=\"hd\">\n
<h1>icant.co.uk - everything Christian Heilmann</h1>\n
... and so on ..."
]
});

Using jQuery’s getJSON() method and accessing the YQL endpoint this is easy to implement:
$.getJSON("http://query.yahooapis.com/v1/public/yql?"+
"q=select%20*%20from%20html%20where%20url%3D%22"+
encodeURIComponent(url)+
"%22&format=xml'&callback=?",
function(data){
if(data.results[0]){
var data = filterData(data.results[0]);
container.html(data);
} else {
var errormsg = '<p>Error: could not load the page.</p>';
container.html(errormsg);
}
}
);

Putting it all together you have a cross-domain Ajax solution with jQuery and YQL:
$(document).ready(function(){
var container = $('#target');
$('.ajaxtrigger').click(function(){
doAjax($(this).attr('href'));
return false;
});
function doAjax(url){
// if it is an external URI
if(url.match('^http')){
// call YQL
$.getJSON("http://query.yahooapis.com/v1/public/yql?"+
"q=select%20*%20from%20html%20where%20url%3D%22"+
encodeURIComponent(url)+
"%22&format=xml'&callback=?",
// this function gets the data from the successful
// JSON-P call
function(data){
// if there is data, filter it and render it out
if(data.results[0]){
var data = filterData(data.results[0]);
container.html(data);
// otherwise tell the world that something went wrong
} else {
var errormsg = '<p>Error: could not load the page.</p>';
container.html(errormsg);
}
}
);
// if it is not an external URI, use Ajax load()
} else {
$('#target').load(url);
}
}
// filter out some nasties
function filterData(data){
data = data.replace(/<\/?body[^>]*>/g,'');
data = data.replace(/[\r\n]+/g,'');
data = data.replace(/<!--[\s\S]*?-->/g,'');
data = data.replace(/<noscript[^>]*>[\s\S]*?<\/noscript>/g,'');
data = data.replace(/<script[^>]*>[\s\S]*?<\/script>/g,'');
data = data.replace(/<script.*\/>/,'');
return data;
}
});

This is rough and ready of course. A real Ajax solution should also consider timeout and not found scenarios. Check out the full version with loading indicators, error handling and yellow fade for inspiration.
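For the timeout case, one hedged approach (not part of the original demo) is a simple watchdog timer around the JSON-P call, since getJSON() gives you no error callback for script requests. A rough sketch, reusing the filterData() function from above:

function doAjaxWithTimeout(url, container){
var done = false;
// if nothing has come back after ten seconds, assume the request failed
var watchdog = setTimeout(function(){
if(!done){
container.html('<p>Error: the request timed out.</p>');
}
}, 10000);
$.getJSON("http://query.yahooapis.com/v1/public/yql?"+
"q=select%20*%20from%20html%20where%20url%3D%22"+
encodeURIComponent(url)+
"%22&format=xml&callback=?",
function(data){
done = true;
clearTimeout(watchdog);
if(data.results[0]){
container.html(filterData(data.results[0]));
} else {
container.html('<p>Error: could not load the page.</p>');
}
}
);
}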

You can find the original post here: http://www.wait-till-i.com/2010/01/10/loading-external-content-with-ajax-using-jquery-and-yql/

How To: AJAX Post Pagination in MooTools

Ever wanted to browse through the older post archives only to be staring at the screen for ages in frustration while the content slowly loads up? A quick fix is to make use of AJAX to load the post archives. In this tutorial, I will show you how to do that using the ever popular JavaScript framework, MooTools, on a typical 2-column WordPress theme.

The code is quite easily digestible and with a little CSS tweaking, you could get it to work for your theme.

Step 1: Readying the Files


Download the file mootools_core.js and upload it into a folder named js within the directory of your active WordPress theme.

I only included components needed for AJAX post pagination in this MooTools build. You are however welcome to create a new one with components you would like to use.

Create a blank JavaScript file called ajax.js and upload it into the js folder. At this point, you should have these 2 files in your js folder.

yoursite.com/wp-content/themes/yourtheme/js/mootools_core.js
yoursite.com/wp-content/themes/yourtheme/js/ajax.js

Step 2: Edit header.php


You have to tell the browser that we have a JavaScript file to use. Add the following lines into your header.php just before the closing </head> tag.
<?php wp_enqueue_script('mootools', '/wp-content/themes/yourtheme/js/mootools_core.js'); ?>
<?php wp_enqueue_script('mootools_ajax', '/wp-content/themes/yourtheme/js/ajax.js'); ?>



Where yourtheme is the folder name of your active theme.

Step 3: Edit index.php


When it comes to AJAX pagination, we only want to refresh the post listings and not the other portions of your site. The following code snippets tell WordPress to skip rendering the header, sidebar and footer when the ajax parameter is present in the URL.

The top of your index.php file should have the following line:

<?php get_header(); ?>


Replace that line with this:
<?php if (!isset($_GET['ajax'])) { ?>
<?php get_header(); ?>
<?php } ?>

Similarly, at the very end of the file you should see:
<?php get_sidebar(); ?>
<?php get_footer(); ?>

Now replace it with this:
<?php if (!isset($_GET['ajax'])) { ?>
<?php get_sidebar(); ?>
<?php get_footer(); ?>
<?php } ?>

To rule out any possible reasons for the code not to work, make sure your HTML structure is identical to what is listed below. We need a div with an id of ‘post’ wrapping the loop.

Go back into index.php and add the div before the start of the loop, i.e. the end result should look like this:

1 <div id="post">
2 <?php if (have_posts()) { ?>

Be sure to also close the <div> after the loop ends.
<?php } ?>
</div>

We also need a div with a CSS class of ‘page-navi’ to nest the post pagination links.

1 <div class="page-navi">
2 <?php next_posts_link('- Older Posts -') ?>
3 <?php previous_posts_link('- Newer Posts -') ?>
4 </div>

Step 4: Writing your ajax.js


This is basically what goes on in your JavaScript file. I have left some pretty detailed comments so that you know what the code is all about.

//function used to handle AJAX post pagination
function ajaxLinks(id, container) {

//looks for all instances of id
$$(id).each(function(ele) {
//what happens when the particular instance is clicked on
ele.addEvent('click', function(e) {
e = new Event(e).stop();
var alink = ele.getProperty('href');
var url = alink;

//construct the new URL with a parameter indicating how it should load the page (fully or a portion of it)
if (alink.indexOf('?') != -1) {
url += "&ajax=y";
} else {
url += "?ajax=y";
}

//this is where the magic happens
var ajaxLink = new Request.HTML({
onRequest: function() {}, //what happens while the ajax request is being made
onSuccess: function() { //what happens when an ajax request completes successfully
new Fx.Scroll(document.body, {'duration': 'long'}).start(0, 0); //scrolls to the top of the page once your content is loaded
ajaxLinks('.page-navi a', 'post'); //calls the function again so that the newly loaded pagination links are ajaxified too
},

onFailure: function() {}, //what happens when an ajax request fails
update: $(container) //#post, which is your container, gets updated
}).get(url);
});

});

}

//needed for the MooTools build to be executed
window.addEvent('domready', function(dom){
ajaxLinks('.page-navi a', 'post'); //of course none of this runs until you call the function to action
});

Step 5: Customization


As you would have probably guessed, this is a very basic implementation of how AJAX post pagination works in WordPress themes.

I would love to let your imagination run wild and see what you come up with. Of course, you need a little knowledge of CSS to get it done. Customizations can be made in the 3 different events (onRequest, onSuccess and onFailure); see the sketch below.
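For example, the empty handlers inside ajaxLinks() could be fleshed out roughly like this (a sketch only; the 'loading' element id is a hypothetical spinner in your theme, not something the tutorial provides):

onRequest: function() {
$('loading').setStyle('display', 'block'); //show a spinner while the request runs
},
onSuccess: function() {
$('loading').setStyle('display', 'none'); //hide the spinner again
new Fx.Scroll(document.body, {'duration': 'long'}).start(0, 0);
ajaxLinks('.page-navi a', 'post');
},
onFailure: function() {
$('loading').setStyle('display', 'none');
$(container).set('html', '<p>Sorry, the posts could not be loaded.</p>'); //simple error message
}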

If you have any doubts or feedback about this how-to, I would love to hear it.


You can find the original post here: http://www.problogdesign.com/wordpress/how-to-ajax-post-pagination-in-mootools/

Using jQuery to post an array to a ColdFusion Component

A reader sent in an interesting question today. He was trying to make use of jQuery to post an array of data to a ColdFusion component. His CFC was expecting, and demanding, an array argument. Whenever he fired off the request though he received an error from ColdFusion saying the argument was not a valid array. Let's look at an example of this so it is more clear.

I'll start off with my server side component. It handles the incredibly complex task of returning the size of an array.


<cfcomponent>

<cffunction name="handleArray" access="remote" returnType="numeric">
<cfargument name="data" type="array" required="true">
<cfreturn arrayLen(arguments.data)>
</cffunction>

</cfcomponent>



On the client side, I'm going to use jQuery to post a static array of data to the CFC. This will happen immediately on document load so I don't have to bother clicking a button or anything fancy like that. (Note, in the code block below I've removed the html, head, and body tags since they are all empty. I do have a script block to load in the jQuery library though.)


$(document).ready(function() {
var mydata = [1,2,3,4,5,"Camden,Raymond"];
$.post("test.cfc", {method:"handleArray",data:mydata, returnFormat:"plain"},
function(res) {
alert($.trim(res));
})

})



So, what would you expect to happen here? Well first off, it is important to remember that we are talking about an HTTP post here. You cannot send a complex data type over POST. It must be encoded somehow. jQuery does indeed encode the data - just not how you might expect. Upon running this and examining the POST data in my Chrome dev tools, I see the following:



Not what you expected, right? You can see why ColdFusion complained. The "Data" argument doesn't exist. Instead we have a list of things named like Data, but not quite. You can try using toString on the array, but that doesn't correctly handle the comma in the data. So what to do?
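If you want to see the encoding for yourself, you can run the same object through jQuery's own serializer in the console. A quick, hypothetical check (the exact output varies by jQuery version):

var mydata = [1,2,3,4,5,"Camden,Raymond"];
console.log($.param({method:"handleArray", data:mydata, returnFormat:"plain"}));
// with jQuery 1.4+ this logs something along the lines of:
// method=handleArray&data%5B%5D=1&data%5B%5D=2&...&data%5B%5D=Camden%2CRaymond&returnFormat=plain
// i.e. a series of data[] fields rather than a single "data" argument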

What I recommended was converting the array to JSON. It always surprises me when I remember that jQuery can't produce JSON on its own, but there are plugins out there that will do it for you. Because JSON is a string format, though, I thought I'd write up a quick function to generate the string for me. This function makes the assumption that our array only contains simple values of numbers and strings.


$(document).ready(function() {
var mydata = [1,2,3,4,5,"Camden,Raymond"];
var myds = serialize(mydata);
// post the JSON string (myds), not the raw array
$.post("test.cfc", {method:"handleArray",data:myds, returnFormat:"plain"},
function(res) {
alert($.trim(res));
})

function serialize(arr) {
var s = "[";
for(var i=0; i<arr.length; i++) {
if(typeof(arr[i]) == "string") s += '"' + arr[i] + '"'
else s += arr[i]
if(i+1 < arr.length) s += ","
}
s += "]"
return s
}
})



As you can see, I wrote a serialize function to handle converting the array into a JSON string. This isn't the only change though. We still aren't sending an array to the CFC; it's a string. So I rewrote the CFC to handle it a bit better:


<cfcomponent>

<cffunction name="handleArray" access="remote" returnType="numeric">
<cfargument name="data" type="any" required="true">
<cfif isJSON(arguments.data)>
<cfset arguments.data = deserializeJSON(arguments.data)>
</cfif>
<cfreturn arrayLen(arguments.data)>
</cffunction>

</cfcomponent>



I normally don't like to "muck" up my code so that it has outside knowledge like this. However, I'd probably have a remote service component in front of this component anyway and the whole issue would become moot.

There are probably many better ways of handling this. Any suggestions?
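One possible alternative, not from the original post: browsers with native JSON support (or a json2.js polyfill for older ones) can replace the hand-rolled serialize function with JSON.stringify:

$(document).ready(function() {
var mydata = [1,2,3,4,5,"Camden,Raymond"];
// JSON.stringify produces the same kind of JSON string as the serialize() function above
$.post("test.cfc", {method:"handleArray", data:JSON.stringify(mydata), returnFormat:"plain"},
function(res) {
alert($.trim(res));
});
});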

You can find the original post here: http://www.coldfusionjedi.com/index.cfm/2010/3/23/Using-jQuery-to-post-an-array-to-a-ColdFusion-Component

Wednesday, November 3, 2010

Setup a Blog Inside Your Magento Store!

Are you looking to integrate a blog into your Magento shopping cart ecommerce store? There are a few ways you can accomplish this, but one way that I have found is far better than any of the other possible solutions. It’s using Lazzymonks Blog Extension for Magento. Here are the specific steps for setting it up.
Setup Blog in Magento

First go to the extension page so that you can grab the extension key. If you have already tried installing the Lazzymonks WordPress Integration extension, then you have two steps you need to take first: 1) uninstall the WordPress integration extension from your Magento Extensions manager, and 2) delete the blog_setup entry in the core_resource table of your database. (Trust me on this one; these two steps definitely come from experience.)

Now go to your Magento Extensions manager and input the extension key to install. This will install the extension package and put files in your directory. If you are not using the default template that comes with the Magento install, then you need to copy the folder ‘blog’ from app/design/frontend/default/default/template/blog to wherever your theme is. Also do the same for blog.xml from app/design/frontend/default/default/layout. If you miss this step your blog will show up as a blank page.

You will now have a new menu item in your admin called Blog where you can create categories and blog posts, very similarly to how you would in WordPress. You will also want to go to System >> Configuration >> Blog - this is where you can customize the main blog settings such as layout etc.




If you by any chance run into any problems setting this up, hit me back and I can help you out. If you follow my steps, each one step for step, then you should not run into any problems at all.
What this can do for your store!

A simple easy to use blog inside your Magento ecommerce store can do wonders for your sales if you are delivering a good product. Think of all the good articles and reviews you could give on all your products each day, week, and month. One of the things many ecommerce websites lack is quality content to go along with their catalog. People seem to think that a catalog is good enough for content. This is simply not true. Adding a blog to your online store and writing quality content about your products and about your industry could increase your online sales dramatically.
Still Need a Theme For Your Magento Project? Check Out HelloThemes

HelloThemes has a very large selection of Premium Magento Themes covering all different types of needs from Electronic styled themes for Electronic stores to Furniture styled themes. Check them out before trying to mess with your own theme. It could save you a ton of time and money.




You can find the original post here: http://chasesagum.com/setup-a-blog-inside-your-magento-store

Pulling Channel Partners into the Cloud

Many providers of cloud-based enterprise applications are finally able to answer the perpetual criticism that the software-as-a-service (SaaS) model does not support robust customization. Cloud-based development environments such as Salesforce.com’s Force.com and NetSuite’s SuiteCloud have matured significantly, offering things like scripting languages, user interfaces, and support for mobility. It’s reasonable to expect that these cloud-based development environments can now be used not only to customize SaaS applications but also to deliver significant extensions focused on manufacturing and other industries.

The question now is: Who’s going to develop cloud-based application extensions and customizations that support complex manufacturing-specific processes, such as product configuration, change order management, or even production scheduling?

Probably not manufacturers themselves. Manufacturers have spent the past 20 years or so dealing with the cost and hassle of customizing ERP applications and supporting those customizations. Today, most simply want to buy vertical-specific functionality off the shelf. And for the most part, vendors of on-premise ERP applications, such as SAP, Oracle, IFS, Infor, CDC Software, can accommodate them.

Nor are the SaaS application providers themselves likely to take on the task of delivering extensions required by manufacturers. SaaS vendors such as Salesforce.com and NetSuite, with their low cost-of-entry subscription pricing, are focused on mass markets. At a press event this week, NetSuite CEO Zach Nelson said his company wants to sell to the “Fortune 5 Million.” To do that, SaaS vendors must deliver broad solutions suitable for the mass market. They can’t spend time and resources digging into deep industry-specific functionality.

So, who does that leave? It leaves channel partners like resellers, independent software vendors, and system integrators. If multi-tenant SaaS applications are ever going to support complex manufacturing-centric processes, it’s the channel that’s going to deliver that support.

The problem is that resellers, system integrators, and other channel players typically haven’t been that interested in developing cloud-based solutions or creating practices around specific SaaS platforms. That’s because they can still make more money in the on-premise world, where remuneration comes mostly up front and isn’t spread out over years.

Now, however, there are signs that SaaS application vendors are reaching out to potential channel partners in a serious way. At its SuiteCloud 2010 event last week, NetSuite upgraded its development environment, making it easier for channel partners to control and distribute version changes of their software and execute phased rollouts.

That followed a move earlier this year by NetSuite to sweeten the deal for channel resellers by offering them 100% of subscription revenue for the first year when signing up new customers.

NetSuite’s efforts seem to be paying off. The company said Wipro will launch a practice around NetSuite, becoming the first significant system integrator to do so.

And NetSuite has attracted at least two manufacturing-focused ISVs: Rootstock Software, which claims to have six customers for its MRP application, developed on the SuiteCloud platform; and Iron Solutions, the maker of a cloud-based e-commerce solution for heavy equipment manufacturers.

This, to be sure, is a small beginning. But it’s good news for manufacturers who one day would like to see cloud-based applications deliver more than just sales force automation and accounting.



You can find the original post here : http://blog.managingautomation.com/channel/2010/04/pulling-channel-partners-into-the-cloud/


Tuesday, November 2, 2010

Top 10 Joomla SEO tips for Google

How to search engine optimize your Joomla website in 10 easy steps.


1. Keyword Use in Title Tag


The number one factor in ranking a page on search engines is the title tag. These are the words in the source of a page inside the <title> tags, and they appear in the blue bar of your browser.
Choose the title of an article very carefully. Joomla will use the title of the article in the title tag (what appears in the blue bar). It will also be the text used in any in-site links (see #5 and #6).

2. Anchor Text of Inbound Link


Anchor text is the text that appears underlined and in blue (unless it’s been styled) for a link from one webpage to another.
Try to get some inbound links to your article using the keywords you want to be ranked for. Two ways to do this are through online press services such as PRweb.com or simply by networking.

3. Global Link Popularity of Site (PageRank)


How many pages are linking to your page is called link popularity, or in Google, PageRank.
The more sites link to you, the better. Joomla is a CMS that helps you add content quickly. Create one quality content page per day. Quality content is the most important factor in getting inbound links. For a site that will perform well, you eventually need 200-odd pages of content. This is the important point. QUICK SEO IS DEAD. The only way to perform well in SEO now is to have a rich content site.

4. Age of Site


When was the domain of the site registered?
Nothing you can do about this, but there is evidence that suggests that how long you have your domain registered for makes a difference (spam sites are not registered for long). Go and extend your domain registration for a couple of years.

5. Link Popularity within the Site


This is the number of links to the page from inside your own domain.
Because of #2, it’s critical that you link to articles from within your site using the right anchor text. Make sure that you:

  • Use the linked titles setting

  • Make good use of the Most Read, Related Items and Latest News modules.

  • Have a sitemap component linked to right from your homepage



6. Topical Relevance of Inbound Links and Popularity of Linking Site


It’s important that you get quality inbound links. This means they have to be from a site that is topically related to yours, and one that has a high PageRank.


  • It’s worth submitting once to directories (then forget about it).

  • Type “related:www.yoursite.com” into Google and contact the top 20 returns for links.

Link Popularity of Site in Topic Community

  • Make sure you have a blog on your site, and network with others in your topical community. Make sure you frequently link to other blogs in your topical community.



7. Keyword Use in Body Text


The keyword density of the phrase you are optimizing for in the content of the page. Still important, the German study from Sistrix identified some interesting results.


  • Targeted keywords in the first and last paragraphs. There is a simple trick here: write your quality content, then use the tool of your choice to find the keyword density (a rough browser-console sketch for this follows after this list). THEN, take the top three words and add them to the meta keywords in the parameters part of the page (in the Joomla admin). This may feel backwards to some, as it optimizes a page for what you actually wrote rather than trying to write a page optimized for certain words (which I always find difficult).

  • Keywords in H2-H6 headline tags seem to have an influence on the rankings while keywords in H1 headline tags seem to be getting less valuable. Modify the output of the core content component through a template override file.

  • Using keywords in bold or strong tags - slight effect, same with img alt tags and filenames.
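As referenced in the first point above, here is a rough, hypothetical browser-console sketch for finding the most frequent words on a finished article page; it is only a quick stand-in for a proper keyword density tool:

// run this in the browser console on your finished article page
var words = document.body.innerText.toLowerCase().split(/\W+/);
var counts = {};
for (var i = 0; i < words.length; i++) {
if (words[i].length > 3) { // ignore very short words
counts[words[i]] = (counts[words[i]] || 0) + 1;
}
}
var sorted = [];
for (var w in counts) { sorted.push([w, counts[w]]); }
sorted.sort(function(a, b) { return b[1] - a[1]; });
console.log(sorted.slice(0, 10)); // the top ten candidate keywords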



Additional notes:


A couple of other factors at the bottom of measured/estimated influence.

8. File Size


The file size doesn’t seem to influence the ranking of a web page on Google although smaller sites tend to have slightly higher rankings. Optimize those images!

9. Clean URL (Joomla SEF)


Although keywords in the file name (URL) don’t seem to have a positive effect (based on the German study), a URL with few parameters (?id=123, etc.) is important. Turn on Joomla SEF but don’t get anal about it.

Other Notes


10. Utilize Your Error Pages.


Too often companies forget about error pages (such as 404 errors). Error pages should always re-direct "lost" users to valuable, text-based pages. Placing text links to major site pages is an excellent practice. Visit www.cnet.com/error for an example of a well-utilized error page. To make the error page fit with the rest of the theme of your site, create an uncategorized article and then copy the source as viewed on a webpage, and put that into the 404 file.



You can find original post here : http://www.compassdesigns.net/joomla-blog/top-10-joomla-seo-tips-for-google


Wednesday, October 27, 2010

iPhone application development and outsourcing

The full potential of the iPhone, the multimedia gadget, can be utilized by developing ingenious applications for it. The launch of the SDK (software development kit) by Apple in 2008 spurred iPhone application developers worldwide to come up with unique and customized applications for iPhone users.

The SDK, also known as the ‘tool chain’, includes:

Xcode: The integrated development environment (IDE) in which iPhone applications are developed. It is an integral part of the iPhone application development kit and includes a graphical debugger and a powerful source editor.

Interface Builder: It helps in the designing and testing of user interfaces. The graphical editing environment of Interface Builder is used by the iPhone application developer to design user interfaces and seamlessly integrate the applications into the 3G environment of the iPhone.

Instruments: Instruments retrieves data, analyzes and compares performance, and displays the results graphically in real time. It plays a pivotal role in the real-time optimization of iPhone applications.

An iPhone application developer should have sound knowledge of the SDK. The SDK uses the Objective-C language and runs only on Mac OS X 10.5. The applications developed need to be approved by Apple and can be distributed solely through the App Store.

iPhone website development

iPhone applications are developed for various categories, and many companies specialize in a particular category. For example, a company may specialize in the domain of iPhone website development. This domain includes useful Web 2.0 applications designed exclusively for the iPhone, such as:

• Search tools
• Web utilities
• Social networking
• Ecommerce websites
• Travel, sports, entertainment, and so on

Outsourcing of iPhone application development

Outsourcing iPhone application development has several advantages, which mainly include:

Firstly, it is cost effective: outsourcing iPhone application development gets the job done at lower cost. Customized applications can be developed without much of the investment that would otherwise be necessary for technical manpower and training. The rigmaroles of iPhone applications, such as approval by Apple and compliance with its guidelines, are taken care of by the outsourcing companies.

iPhone Cool Projects


I am the webmaster at www.synapse.co.in – an iPhone website development company in India offering numerous services, such as flash web development, flash scripting, customized applications for the iPhone, and website maintenance services.

ISBN13: 9781430223573

The iPhone and iPod touch have provided all software developers with a level playing field—developers working alone have the same access to consumers as multinational software publishers. Very cool indeed! To make your application stand out from the crowd, though, it has to have that something extra. You must learn the skills to take your apps from being App Store filler to download chart-topping blockbusters.

Developers with years of experience helped write this book. Spend some time understanding their code and why they took the approach they did. You will find the writing, illustrations, code, and sample applications second to none. No matter what type of application you are writing, you will find something in this book to help you make your app that little bit cooler.

The book opens with Wolfgang Ante, the developer behind the Frenzic puzzle game, showing how timers, animation, and intelligence are used to make game play engaging. It moves on to Rogue Amoeba’s Mike Ash explaining how to design a network protocol using UDP, and demonstrating its use in a peer-to-peer application—a topic not normally for the faint of heart, but explained here in a way that makes sense to mere mortals. Gary Bennett then covers the important task of multithreading. Multithreading can be used to keep the user interface responsive while working on other tasks in the background. Gary demonstrates how to do this and highlights traps to avoid along the way.

Next up, Canis Lupus (aka Matthew Rosenfeld) describes the development of the Keynote-controlling application Stage Hand, how the user interface has evolved, and the lessons he has learned from that experience. Benjamin Jackson then introduces two open source libraries: cocos2d, for 2D gaming; and Chipmunk, for rigid body physics (think “collisions”). He describes the development of Arcade Hockey, an air hockey game, and explains some of the code used for this.

Neil Mix of Pandora Radio reveals the science behind processing streaming audio. How do you debug what you can’t see? Neil guides you through the toughest challenges, sharing his experience of what works and what to watch out for when working with audio. Finally, Steven Peterson demonstrates a comprehensive integration of iPhone technologies. He weaves Core Location, networking, XML, XPath, and SQLite into a solid and very useful application.

Software development can be hard work. Introductory books lay the foundation, but it can be challenging to understand where to go next. This book shows some of the pieces that can be brought together to make complete, cool applications.



You can find original post here : http://www.theoutsourceblog.com/2010/08/iphone-application-development-and-outsourcing/

Virus Cleaning and Prevention in osCommerce

What does a virus do?
On an osCommerce site, a virus or piece of malware typically does the following:
- Creates a form that unnecessarily asks customers to fill in confidential data such as order or PayPal details. Once someone fills in these details, they are emailed to a third party for misuse.
- Creates links to other websites in order to drive traffic to them.
- Uses a redirector to send customers to another site.
- Uses an iframe to display unauthorized content with a link to another website.
So basically there are two objectives:
- Steal data.
- Divert traffic to another website.

Different ways in which hacks are carried out:
1) SQL injection.
2) Modifying .htaccess and writing error 404 rules or rewrite rules.
3) Placing some JavaScript.
4) Placing .php or other files that execute and modify other files.

How to find out what is wrong?
Download all the code and check:
- External links.
- JavaScript code that has eval in it.
- Your .htaccess file.
- The images folder and other folders.
- Files or folders with 777 or other write permissions.
- For suspicious eval() usage with a scanning tool, then analyze that code (see the sketch after this list).
- Your error log.
- Your access log.
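To make the eval() check above more concrete, here is a minimal sketch of my own (not part of the original article); it walks the document root and flags files containing eval( or base64_decode(. The $docroot path is an assumption, so point it at your own osCommerce installation.

<?php
// Minimal sketch: recursively flag PHP/JS/HTML files containing eval( or base64_decode(.
$docroot = '/var/www/oscommerce'; // assumption: adjust to your installation
$patterns = array('eval(', 'base64_decode(');

$iterator = new RecursiveIteratorIterator(new RecursiveDirectoryIterator($docroot));
foreach ($iterator as $file) {
    if (!$file->isFile()) {
        continue;
    }
    $ext = strtolower(pathinfo($file->getPathname(), PATHINFO_EXTENSION));
    if (!in_array($ext, array('php', 'js', 'html'))) {
        continue;
    }
    $contents = file_get_contents($file->getPathname());
    foreach ($patterns as $needle) {
        if (strpos($contents, $needle) !== false) {
            echo $file->getPathname() . " contains " . $needle . "\n";
            break;
        }
    }
}

Anything the script reports is not automatically malicious, but it gives you a short list of files to review by hand.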

Preventions:

Prevention basically involves three things:
• Your site itself.
• The passwords of the software used to upload content to the site.
• The computer from which content is uploaded.

How does one protect their site?
- Ensure that all third-party scripts or tools used on the site have the latest security updates, or ask the hosting company to apply them.
- Delete unwanted folders, files, scripts and services that are no longer in use.
- Occasionally change the password of the software used to upload content to the site, and use strong passwords.
- Give files appropriate permissions (see the permissions sketch after this list).
- Disable the file manager in the admin.
- The site admin should be password (.htaccess) protected.
- Keep the computer used for the site's upload and download activities up to date with all necessary operating system updates and a strong antivirus with the latest updates.
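As a rough illustration of the file-permission point above (again my own sketch, with the document root path as an assumption), the following script reports world-writable files and directories so you can review and tighten them:

<?php
// Minimal sketch: report world-writable files and directories.
$docroot = '/var/www/oscommerce'; // assumption: adjust to your installation

$iterator = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator($docroot),
    RecursiveIteratorIterator::SELF_FIRST
);
foreach ($iterator as $item) {
    $name = $item->getFilename();
    if ($name == '.' || $name == '..') {
        continue;
    }
    $perms = fileperms($item->getPathname()) & 0777;
    // 0002 is the world-writable bit, so 0777 and 0666 are both caught.
    if ($perms & 0002) {
        printf("%s %o\n", $item->getPathname(), $perms);
    }
}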

Recommendations:
- It is always recommended not to keep a soft copy of the site's access details on your computer.
- Regular code and database backups should be taken.
- There are a few security add-ons available for osCommerce that should be installed on the site.




You can find original post here : http://www.oscprofessionals.com/blog/

Configuring a mobile Drupal site and making your theme mobile friendly

In this tutorial I'll explain how to setup your Drupal site to be mobile friendly. Before you begin, it's helpful to consider the following: 1) which mobile devices to support; 2) using a different theme for mobile; 3) which hostnames will be used; 4) multi-site configuration options; and 5) any site alterations to simplify the mobile experience.

For my site, I decided on the following:
1) iPhones (to start)
2) Use a cleaner theme for my mobile site: DevSeed's Singular
3) redirect mobile traffic to a new subdomain (mobile.thedrupalblog.com)
4) I was using sites/default for my blog, so hosting another hostname on the same filesystem was not an issue
5) I decided to remove some items in my primary navigation, and add some jQuery (accordion effect) to collapse node content on my front page.


I added a new DNS record for my subdomain pointing to the same IP address as my main site. In a standard Apache vhosts configuration, you can add a ServerAlias directive to ensure the mobile hostname is handled by the main site's vhost. For example:


<VirtualHost *:80>
  ServerName thedrupalblog.com
  ServerAlias mobile.thedrupalblog.com
  DocumentRoot /var/www/vhosts/thedrupalblog.com/httpdocs
</VirtualHost>


I installed and enabled the Singular theme in sites/all/themes/singular.

I added some mobile-specific configuration to my settings.php file (sites/default/settings.php):
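The exact snippet isn't reproduced here, but roughly the idea is to detect an iPhone user agent, redirect it to the mobile hostname, and switch the default theme when that hostname is being served. A minimal sketch of my own (the user-agent check, hostname and theme name are assumptions) looks like this:

<?php
// Minimal sketch for sites/default/settings.php (assumptions noted above).
$mobile_host = 'mobile.thedrupalblog.com';

// Redirect iPhone visitors hitting the main hostname to the mobile subdomain.
if (isset($_SERVER['HTTP_USER_AGENT']) && strpos($_SERVER['HTTP_USER_AGENT'], 'iPhone') !== FALSE
    && $_SERVER['HTTP_HOST'] != $mobile_host) {
  header('Location: http://' . $mobile_host . $_SERVER['REQUEST_URI'], TRUE, 302);
  exit;
}

// When the mobile hostname is being served, use the mobile theme.
if ($_SERVER['HTTP_HOST'] == $mobile_host) {
  $conf['theme_default'] = 'singular';
}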



Now, if someone visits my site using an iPhone, the user would be redirected to my specified mobile address, AND a new mobile theme would be used!



In addition, I decided to make some alterations to my mobile theme to simplify the interface. I created a module and added a hook_preprocess_page() implementation:

<?php
// Module name and the list of menu paths to remove are assumptions;
// the original post does not show them.
function mymodule_preprocess_page(&$vars) {
  $href_remove = array('blog', 'contact');
  if (!empty($vars['primary_links'])) {
    foreach ($vars['primary_links'] as $key => $value) {
      if (in_array(strtolower($value['href']), $href_remove)) {
        unset($vars['primary_links'][$key]);
      }
    }
  }
}
?>


By adding the above module code and some jQuery, I removed some items from my primary navigation and added an accordion-like interface for the front page.
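The jQuery itself isn't shown in the original post; as a rough sketch of one way to attach it (the module name and CSS selectors are assumptions), the same module could add the accordion behaviour on the front page like this:

<?php
// Rough sketch: attach an accordion-like toggle to node content on the front page.
// The selectors ('.node h2' and '.node .content') are assumptions.
function mymodule_init() {
  if (drupal_is_front_page()) {
    drupal_add_js("
      $(document).ready(function() {
        $('.node .content').hide();
        $('.node h2').click(function() {
          $(this).siblings('.content').slideToggle();
        });
      });
    ", 'inline');
  }
}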



NOTE: If you're using a Mac, the iPhone Simulator application (which comes with Xcode + the iPhone SDK) is a great way to develop and test mobile configurations.


You can find original post here : http://thedrupalblog.com/

Controlling Joomla! templates depending on menu you use

We have been working on building a multi-school Joomla! website and there have been many hurdles to overcome since Joomla! isn’t a multi-site CMS. After searching for anything that’s common across each school, I realized the menu for each school could be used as a common denominator. If I could set a variable depending on which menu is loaded on the page then I can do just about anything I want, such as change the logo, set a unique CSS class, etc. After meeting with our developers we figured out how to do this and I’ll share it with all of you in case you ever need to do the same.

To start we need to pull in the information of which menus are being used on the page. Place this code in the head of your template:

jimport('joomla.application.menu');
$menus = JSite::getMenu();
$m = $menus->getActive();


This checks for the active menu item on the page and stores it in the variable $m so we can work with it. Next, you need to look at the menutype in the administrator Menu Manager (under Type). This is what we will use to determine which school is being viewed. I want to take this information and set a new variable named $school:

if($m->menutype == 'district-information') { $school = 'district-info'; }
if($m->menutype == 'cityname-elementary-school') { $school = 'cityname-elementary'; }
if($m->menutype == 'cityname-middle-school') { $school = 'cityname-middle'; }
if($m->menutype == 'cityname-high-school') { $school = 'cityname-high'; }
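One small caveat worth adding here (my note, not from the original article): getActive() returns null on pages that are not linked to a menu item, so it is safer to guard these checks, for example:

$school = 'district-info'; // fall back to a sensible default (assumption)
if (is_object($m)) {
    if($m->menutype == 'district-information') { $school = 'district-info'; }
    if($m->menutype == 'cityname-elementary-school') { $school = 'cityname-elementary'; }
    // ...and so on for the remaining schools
}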


The first thing I did was echo the variable $school into an ID on the body tag.

<body id="<?php echo $school; ?>">

This allows me to style the CSS uniquely according to the school being viewed which gives you a lot of style control. To change the logo I can just call it by the school.

#cityname-high .logo {background: url(../images/logo-highschool.jpg);}

Another thing I did was load a unique module position according to the school so I could publish modules in all pages the menu shows up on. I did this by doing the following:

// The module position names here are assumptions; name them to match your own positions.
if($school == 'district-info') {
echo '<jdoc:include type="modules" name="district-info" />';
};
if($school == 'cityname-elementary') {
echo '<jdoc:include type="modules" name="cityname-elementary" />';
};
if($school == 'cityname-middle') {
echo '<jdoc:include type="modules" name="cityname-middle" />';
};
if($school == 'cityname-high') {
echo '<jdoc:include type="modules" name="cityname-high" />';
};


As you can see, this gives you the ability to do things you may not have been able to do before in Joomla!. I know it saved me from a major headache!




You can find original post here : http://www.corephp.com/blog/controlling-joomla-templates-depending-on-the-menu-you-use/

Tuesday, October 26, 2010

Reverse engineering an "encrypted" Joomla! plugin

On extensions.joomla.org a lot of worthwhile extensions are offered. Most of them are released under the GNU/GPL and are free to use; others are offered under a commercial license. I don't have a problem with this, because for me the functionality is more important than the amount of money I have to pay.

Open source or not

But one thing which I find very important is the openness of the PHP code. If an extension is GPL-ed, the code is open source, which enables me as a programmer to fix problems myself instead of relying on other people to fix them for me. It allows me to solve problems much more quickly.

With commercial extensions, however, there are two variations: extensions which are commercial and open source, and extensions which are commercial and closed source. I prefer the open source extensions, but sometimes you're just stuck with closed source because of the functionality it offers. I always cross my fingers and hope that I don't bump into a problem that makes me call some helpdesk guy who doesn't understand a bit of the problem.
The website is down

Now something happened which made me reconsider my point on closed source extensions (for the worse): the website was down. I have full access to the webserver, so I logged in through SSH to have a look at the Apache error log. There I quickly discovered the problem: some Joomla! plugin was causing a huge timeout.

The Joomla! plugin in question was a system plugin and depended on a helper file. Somehow this helper file tried to reach another remote site. And because this other remote site was down as well, the Joomla! plugin was waiting indefinitely for a response, and so was my website. No timeout was being applied.
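For reference, and as my own sketch rather than code from the plugin, this is roughly how a remote fetch with a hard timeout looks in PHP; something like it would have failed fast instead of hanging the whole site:

<?php
// Minimal sketch: fetch a remote resource with an explicit timeout.
// The URL is a placeholder; the 5-second limit is an arbitrary choice.
$ch = curl_init('http://remote.example.com/service');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5); // give up connecting after 5 seconds
curl_setopt($ch, CURLOPT_TIMEOUT, 5);        // give up entirely after 5 seconds
$response = curl_exec($ch);
if ($response === false) {
    // Fail gracefully instead of blocking the page.
    $response = '';
}
curl_close($ch);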

I quickly logged into the Joomla! Administrator, navigated to the Plugin Manager and disabled the plugin. Hmm, the website was still down. Apparently disabling the plugin from within the Joomla! backend did not actually disable the plugin itself. With a steady hand I removed the plugin files. The website was up again.

Next, I decided to do a full audit of this plugin. What was causing the problem? And more importantly, why did it not time out when trying to reach the remote site? For a proper PHP script that seemed to be the most responsible thing to do.
A first glance at the evil Joomla! plugin

At first glance the plugin looked very cool. The backend did not have any parameters, it showed an HTML description which was displayed as plain text rather than as HTML, and the plugin title did not follow the plugin naming conventions ("System - My Plugin"). But sometimes I'm just too picky about these things. Instead of becoming too frustrated, I had a look at the code instead.

The main PHP script is written following the JPlugin class standard, which is a clear way of writing your own plugin. However, the main file included a helper file even when the plugin itself was disabled. That was a big mistake: the helper tried to fetch content from a remote site, which was down, and it tried to do this regardless of the state of the plugin. If the plugin was disabled, there was no need for this action anyway.

...
include_once( dirname( __FILE__ ) . '/evil.helper.php' );
...
class plgSystemEvilplugin extends JPlugin
...


License and reverse engineering

The Joomla! plugin mentioned (in the source code) a copyright but not a license. On the website I could not find any word about the license under which the source code was distributed. Now, Joomla! is released under the GNU/GPL, and officially all extensions that extend Joomla! should fall under the GPL as well, and thus be open source. I took the liberty of assuming that the GPL applied to this plugin as well and began reading the code.

The main plugin file did not do much, except include a helper file and call a function within this helper file. So I opened up the helper file to discover that it did not contain readable code but something encrypted instead. Looking at the GPL, it is absolutely legal to decrypt this encrypted code, so I started decoding it.

BFkWUEZsfUtbXj9YS1gqIC47dzdzUBpQLQlaBQ9KUxgPA1g4XBZdGxwHUzZXVBYyFl1LZ
wkRSRlHDgddQx8KHQEODRUNGA4dXwRXBgBDGR9jF10UcgATSRxaVR1HS11BAVphQh0Wa2
lfFhgKHzxEAB0cTw0aBgMKFg1vHxYLHxcSRlxHTgBUEAJjQhocGgA7BFQCR1IKRwAH


Decrypting the base64 file

The file was encoded with base64 encryption, and any PHP programmer should know that there is a PHP function base64_decode() to help you with this. The difficult part was that after base64-decoding I ended up with yet another layer of base64. So I decoded that one as well, but then I ended up with a base64 encryption locked with a specific encoding key.

$codelock_decrypter["t"] = base64_decode("LlJpagxUIiZLXmc3Ijk5PzUaMCM
pKStbKikEHm48W0FZISlwKSQiVz8jLidnTz8jXxdqGExfWTExNTg+NF4MdzstNRRtHUFe
LSJWTl0NYSYrOWY ...
...
$codelock_decrypter["z"] = substr($codelock_decrypter["license"], $codelock_decrypter["x"] % strlen($codelock_decrypter["license"]), 1);


This CodeLock encryption is, however, not impossible to crack. Still, this is not real closed source like ionCube or Zend Encoder - it just takes some good knowledge of PHP to turn the base64 encryption back into regular PHP code. After 15 minutes of good hacking I succeeded in breaking the full encryption and storing the PHP code as readable content.
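A rough sketch of the general approach (mine, not the exact code used that evening) is to print the decoded layers instead of executing them, peeling off the plain base64 layers one by one; the keyed CodeLock layer then still needs the decrypter code shown above:

<?php
// Minimal sketch: peel nested base64 layers off an obfuscated payload
// and print the result instead of eval()'ing it.
$payload = file_get_contents('evil.helper.payload.txt'); // assumed dump of the encoded string

for ($i = 0; $i < 10; $i++) {
    $decoded = base64_decode($payload, true);
    // Stop when the current layer is no longer valid base64.
    if ($decoded === false || trim($decoded) === '') {
        break;
    }
    $payload = $decoded;
}

// Inspect the (hopefully) readable PHP instead of executing it.
echo $payload;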

There I found the root of the whole problem: the PHP scripts did not contain any logic themselves. Instead, the PHP script was encrypted with CodeLock but still able to decrypt itself. After this, a request was made to the remote site to get yet more encrypted text, which was then decrypted and executed as PHP. And what was all the fuss about? About 400 lines of code that I could easily write in one evening.
So what's bad about this Joomla! plugin

Though the functionality of the plugin was very useful, the downtime of the remote site brought a very weak architecture into the light. This is the list:

* The plugin is not clear on the license needed to redistribute the PHP code. I assumed the code to be GPL, which is probably legally the right assumption.
* The plugin called a helper file even when the plugin was disabled, which shows that the manufacturer did not actually test things properly.
* The code is encrypted with encryption software written in a manner I have only seen with script kiddies. It produced PHP notices and even PHP warnings which were manually suppressed, which is a bad habit.
* The whole functionality depends fully on a remote site. If this site is down, the plugin doesn't work.
* They ask money for this type of software.


How could you know?

You don't, unless you are an experienced PHP programmer like me. It takes a lot of knowledge to find out exactly what an extension is doing, and things get even more complicated with things like base64 encryption. But you can know for sure that with closed source, fewer people are making sure that the PHP code is of high quality.


You can find the original post here : http://blog.opensourcenetwork.eu/blog/programming/reverse-engineering-an-qencryptedq-joomla-plugin

Setting up a symfony project with PHPUnit on Hudson

We have been using Hudson for several months now as our continuous integration system. This short article describes how we have configured a Hudson project for a symfony 1.4 project. I am assuming that the reader is already familiar with Hudson and knows how a normal project has to be configured.

PHPUnit is the test framework of our choice (surprise, surprise) and we are using the sfPHPUnit2Plugin for all our projects. If you do not know this plugin, you may first read another post where its usage and features are described in detail.

All requirements in short:

* Hudson has to be installed
* Hudson plugin xUnit Plugin has to be installed
* the symfony project needs the sfPHPUnit2Plugin
* PHPUnit has to be installed on your test server


Ok, here the configuration steps of the hudson project:

1. Configure your project

Configure standard settings for a hudson project like source-code management settings or email notifications. Please check the official docs if you do not know how to handle this.

2. Add a shell build step


Building a symfony project in a test environment is pretty easy. With the help of some shell commands the project is completely configured and ready for testing. Those shell commands are entered in the build step section of the Hudson project; defining the correct commands is the main part of the configuration process.

Our configuration looks like this:

1. cd $WORKSPACE/trunk
2. sh _deployment/install_test.sh
3. php symfony cc
4. php symfony phpunit:test-all --configuration --options="--log-junit=build/testresult_$BUILD_NUMBER.xml"
5. cd build
6. ln -s -f testresult_$BUILD_NUMBER.xml currentTestResult.xml

1. Jump into the project root
2. Install the project on the test server with the help of an internal shell script. This step includes, for example, the generation of databases.yml.
3. Clear the symfony cache (always a good choice)
4. Run all PHPUnit tests, including unit and functional tests. The test result is written to a JUnit-compatible log file (needed for the xUnit Plugin).
5. Jump into the build directory, which is internally used by Hudson
6. Symlink the latest test result
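For completeness, the tests executed in step 4 are ordinary PHPUnit test cases. As a generic illustration of my own (not taken from our project and not showing any sfPHPUnit2Plugin specifics), a minimal test looks like this:

<?php
// Minimal PHPUnit test case; PHPUnit_Framework_TestCase is provided by the
// phpunit runner (PHPUnit 3.x, as commonly used at the time of writing).
class CalculatorTest extends PHPUnit_Framework_TestCase
{
    public function testAddition()
    {
        $this->assertEquals(4, 2 + 2);
    }
}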

3. Configure Post-Build-Action

After the xUnit Plugin is installed correctly, an additional PHPUnit Pattern field should be displayed in the post-build action section. The following has to be entered in this field:

trunk/build/currentTestResult.xml

The options “Fail the build if test results were not updated this run” and “Delete temporary JUnit file” should be both checked.

The xUnit Plugin takes the currentTestResult.xml file, which was previously created with the help of the sfPHPUnit2Plugin, and analyzes it. When everything works fine, you should be able to review the created test reports.

Here are some screenshots of how the result could look.

Build history:


Trend graph of the test results:




You can find the original post here : http://dev.esl.eu/blog/category/symfony/