Sitemaps are a great tool to help your site’s SEO. A Sitemap is just a list of URLs for your site that search engines may consider when indexing your website. Without a Sitemap, search engines have to organically discover pages in your site by following links. With a Sitemap, you get an advantage by presenting the list of pages you believe are important for indexing. There is no guarantee that Google or other search engines will actually index each page in your Sitemap, but without a Sitemap you are just relying on simple discovery. A great resource to start with is Google Webmaster on Sitemaps.

On a recent project for a recipe site using CodeIgniter, I wanted to create a Sitemap that would be dynamically updated every day as recipes are added, deleted or modified. My plan was:

  1. Write a library class and method to regenerate the sitemap.xml file with the current list of links along with the last modified date.
  2. Call this method from the command line interface (CLI) as needed.
  3. Schedule a cron job to run this command daily. I considered updating the sitemap “live” as recipes were added or updated, but I didn’t want the additional page submit overhead on committing each recipe. Plus, a daily update is more than enough for most search engines.
  4. Finally submit the updated Sitemap to Google.

Most of the plan went well, but I hit a snag attempting to generate the sitemap from the command line on the host for our website, Hostmonster. That was the hardest part to debug, but read on for the solution. You can see the generated file here:

The Sitemap

The goal is to create an XML file that adheres to the Sitemap Protocol syntax. The resulting file should look something like this (the URLs here are just examples):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
	<url>
		<loc>http://www.example.com/</loc>
	</url>
	<url>
		<loc>http://www.example.com/recipe/show/some-recipe</loc>
		<lastmod>2013-01-01</lastmod>
	</url>
</urlset>
For small websites with fewer than 50,000 links (and a resulting file less than 50MB in size), a single Sitemap file is fine. If your site is larger, you can break this into multiple Sitemaps with a Sitemap Index file that references the separate Sitemaps. For most of us, this is not an issue. I’ve also included last modified dates where possible; there are other properties you might consider including as well.
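If you do need to split things up, the Sitemap Index is itself just a small XML file pointing at each Sitemap. A minimal example per the Sitemap Protocol (the file names and URLs here are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
	<sitemap>
		<loc>http://www.example.com/sitemap1.xml</loc>
		<lastmod>2013-01-01</lastmod>
	</sitemap>
	<sitemap>
		<loc>http://www.example.com/sitemap2.xml</loc>
	</sitemap>
</sitemapindex>
```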

The Custom Library

This library class contains all the code to generate the Sitemap. We could have simply put this all in a controller method, but it seemed cleaner to keep this as a custom library. Change the file prefix from MY_ to the custom prefix you set in $config['subclass_prefix'].

<?php if ( ! defined('BASEPATH')) exit('No direct script access allowed');

/**
 *	Sitemap Class
 *	Copyright (c) 2008 - 2013 All Rights Reserved.
 *	Props to Mike's Imagination for the approach
 *	Generates sitemap
 */
class MY_sitemap {

	// CI instance property
	protected $ci;

	/**
	 *	Constructor
	 */
	public function __construct()
	{
		// Get the CI instance by reference to make the CI superobject available in this library
		$this->ci =& get_instance();
	}

	/**
	 *	Generate sitemap
	 */
	public function create()
	{
		// Begin assembling the sitemap starting with the header
		// (The \x3F escapes are literal question marks, so PHP doesn't confuse the XML declaration with its own tags)
		$sitemap = "<\x3Fxml version=\"1.0\" encoding=\"UTF-8\"\x3F>\n<urlset xmlns=\"http://www.sitemaps.org/schemas/sitemap/0.9\">\n";

		// Add static pages not in the database to the sitemap
		// Home page
		$sitemap .= "\t<url>\n\t\t<loc>" . site_url() . "</loc>\n\t</url>\n\n";
		// About page
		$sitemap .= "\t<url>\n\t\t<loc>" . site_url('main/about') . "</loc>\n\t</url>\n\n";

		// Get all recipes (records) from the database. Load (or autoload) the model
		$recipes = $this->ci->recipe_model->find_where();

		// Add each recipe URL to the sitemap, enclosing the URL in the XML <url> tags.
		// Since my database tracks the last updated date, I am including that as well - date only, in YYYY-MM-DD format
		foreach ($recipes['results'] as $recipe)
		{
			$sitemap .= "\t<url>\n\t\t<loc>" . site_url('recipe/show/' . $recipe->get_nice_url()) . "</loc>\n";
			$sitemap .= "\t\t<lastmod>" . date('Y-m-d', strtotime($recipe->updated_date)) . "</lastmod>\n\t</url>\n\n";
		}

		// If you have other records you wish to include, get those and continue to append URLs to the sitemap.

		// Close with the footer
		$sitemap .= "</urlset>\n";

		// Write the sitemap string to file. Make sure you have permissions to write to this file.
		$file = fopen('sitemap.xml', 'w');
		fwrite($file, $sitemap);
		fclose($file);

		// If this is the production instance, attempt to update Google with the new sitemap.
		// (The environment is set in the index.php file)
		if (ENVIRONMENT === 'production')
		{
			// Ping Google via an HTTP request with the encoded sitemap URL
			$sitemap_url = site_url('sitemap.xml');
			$google_url  = 'http://www.google.com/webmasters/tools/ping?sitemap=' . urlencode($sitemap_url);

			$ch = curl_init();
			curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 2);
			curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
			curl_setopt($ch, CURLOPT_URL, $google_url);
			$response    = curl_exec($ch);
			$http_status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
			curl_close($ch);

			// Log an error if the update fails
			if (substr($http_status, 0, 1) != 2)
			{
				log_message('error', 'Ping Google with updated sitemap failed. Status: ' . $http_status);
				log_message('error', '    ' . $google_url);
			}
		}
	}
}

// End of file MY_sitemap.php
// Location: ./application/libraries/MY_sitemap.php

The Controller

We are going to invoke the create() method from the command line using the CLI feature in CodeIgniter. However, the CodeIgniter CLI utility accepts arguments as controller + method + parameters, so we need a simple controller method to call the library. Because we don’t want to refresh the Sitemap if someone stumbles onto this URL, we can add a check to see whether this was called from the CLI with the $this->input->is_cli_request() method. Add something like this to an appropriate controller class:

/**
 *	Updates sitemap.xml when called from the command line. Not available via URL.
 */
public function generate_sitemap()
{
	// If not a command line request, bail out
	if ( ! $this->input->is_cli_request())
	{
		// 404 error or maybe just redirect somewhere else
		show_404();
	}

	// Load the sitemap library (unless it is autoloaded) and generate the file
	$this->load->library('sitemap');
	$this->sitemap->create();
}

Running the Sitemap Generator

After moving all the code to production you will want to test this out on your hosting server. According to the CodeIgniter CLI documentation, the command should be:

$ php index.php controller method [arguments]

So in our case the command should be:

$ php index.php [your controller] generate_sitemap

But when I ran this statement, the application just returned the HTML from the default controller to my shell! For some reason it was not accepting the [your controller] generate_sitemap as arguments, and was simply invoking the index.php file. I was stumped. After much Googling, I found a clue:

[~]# php -v
PHP 5.2.17 (cgi-fcgi) (built: Oct 29 2012 18:51:17)

The default PHP binaries were compiled as CGI-FCGI, not as the Command Line Interface build that I was seeking. Instead, I needed to reference the PHP CLI binary in /ramdisk/bin/php5-cli:

 [~]# /ramdisk/bin/php5-cli -v
PHP 5.2.17 (cli) (built: Oct 29 2012 18:51:22)

That was it, now the proper command to run this from the command line is:

/ramdisk/bin/php5-cli ~/public_html/sitefolder/index.php main generate_sitemap

And it worked! A sitemap.xml file was generated in the root of the website public folder. To schedule the cron job I simply used the utility in the cPanel, although you could also use the command line, too. I scheduled it to run once a day at night. I checked the application log file after the first run to make sure there was no error in submitting the updated Sitemap to Google.
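For reference, the equivalent crontab entry would look something like this (the schedule here is just an example; the path is the one used above, so adjust both to your host):

```
# Regenerate the sitemap every night at 2:30 AM
30 2 * * * /ramdisk/bin/php5-cli ~/public_html/sitefolder/index.php main generate_sitemap
```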

Publishing Your Sitemap.xml

Now that you have a Sitemap, you need to tell the search engines where to look. You can do this with your robots.txt file. If you don’t have a robots.txt file, just create one in your site root and add this line, substituting your own domain (or add the line to your existing robots.txt):

Sitemap: http://www.yoursite.com/sitemap.xml
That’s all. If you have a Google Webmaster Tools account, go there and view your site under Optimization > Sitemaps. There you can test whether Google can read your Sitemap. Google also has a sitemap validator, which would be a good thing to check.

Although it’s too early to tell if we are getting any better SEO, I do know from Google Webmaster data that Google has now indexed 20 more pages than before (after just one day). Well worth the effort.

In a previous post I had discussed how to densify a sales report by filling in missing date rows. I accomplished that by referencing an existing table of calendar dates to which I outer joined my sales data. For those who don’t have a handy dates table (as I do in our E-Business Suite), I provided a script to create such a table.

But many of you may not be able to (or may not be allowed to) create a database object just for a report. Still, if you noodle around with Oracle long enough you’ll eventually learn nifty alternative ways of doing just about everything.

Using Oracle’s CONNECT BY LEVEL clause we can generate as many rows as we want in pure SQL. And these rows can be interpreted as dates, or months, or any interval we wish. Try this little row generating statement:

select level
from dual
connect by level <= 10;

Although DUAL only has one row, we can generate as many rows as we need, ten in this case. Let’s change these rows of numbers to rows of dates:

select to_date('01-JAN-09','DD-MON-RR') + level - 1 cal_date
from dual
connect by level <= to_date('31-JAN-09','DD-MON-RR') - to_date('01-JAN-09','DD-MON-RR') + 1;

Wrap this in an inline view and you have your virtual dates table! You could also make each row be month or quarter or year increments with some judicious use of temporal functions in Oracle.
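For example, to make each generated row a month rather than a day, you could lean on ADD_MONTHS (a sketch; adjust the start date and row count to your range):

```sql
select add_months(to_date('01-JAN-09','DD-MON-RR'), level - 1) month_start
from dual
connect by level <= 12;
```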

If you are using bind variables to specify your date range in your query, you can then simply use the begin date and end date bind variables in your row generating query as well as in your main data query.

select to_date(:begin_date,'DD-MON-RR') + level - 1 cal_date
from dual
connect by level <= to_date(:end_date,'DD-MON-RR') - to_date(:begin_date,'DD-MON-RR') + 1;

However, just for fun let’s just reference the date range in our row generating query by creating a named sub-query using Oracle’s WITH AS syntax:

with dates as (select to_date('01-JAN-09','DD-MON-RR') beg_date, to_date('31-JAN-09','DD-MON-RR') end_date from dual)
select dates.beg_date + level - 1 cal_date
from dual
cross join dates
connect by level <= dates.end_date - dates.beg_date + 1;

The CROSS JOIN creates a Cartesian product, but the named sub-query and DUAL only contain one row each, so this is not a problem. Almost there, now let’s just add in the original sales query and move our row generator into an inline view:

with dates as (select to_date('01-JAN-09','DD-MON-RR') beg_date, to_date('31-JAN-09','DD-MON-RR') end_date from dual)
select d.cal_date, nvl(s.total_sales,0) total_sales
from (
      select dates.beg_date + level - 1 cal_date
      from dual
      cross join dates
      connect by level <= dates.end_date - dates.beg_date + 1
      ) d
left outer join (
                  SELECT TRUNC (sales_date) sales_date,
                          SUM (quantity * price) total_sales
                     FROM sparse_sales
                    CROSS JOIN dates
                    WHERE sales_date BETWEEN dates.beg_date AND dates.end_date + .99999
                  GROUP BY TRUNC (sales_date)
                ) s on s.sales_date = d.cal_date
order by 1;

In most cases if you can rely on pure SQL you’re much better off. I like this row generating technique better than relying on a physical table of dates, as it’s more portable.

The Classic and Interactive report regions in Oracle’s Application Express have plenty of options to format the data to suit your taste - if you stick with a basic tabular layout. In most cases that’s perfect; after all, SQL is meant to dish up structured data. But sometimes the tabular layout does not do justice to certain columns. For example, a record with a larger text column used to capture user notes, along with a bit of meta about that record (status, date, who etc.), is not displayed quite so elegantly in Apex, particularly if you have a lot of columns.

Consider this table:


After inserting some rows, and creating a basic Apex Classic report, the report may look something like this:

HTML Classic Report

Not bad, but if you have a lot of meta columns the most important column Text will get scrunched up and take up a lot of vertical room. Instead, why not display the meta in a row along the top of each entry, with the text as a sub row spanning several columns?

Inspecting the page source of the last column using Firebug, you can see that Apex simply wraps each column in a pair of table data tags <td></td>. (If you don’t use the Mozilla Firefox add-on Firebug as a development tool, I say stop and go get it right now!)

HTML Report Table Structure

If we edit the report SQL source from this:

select id,
	-- (other meta columns here)
	text
from notes

And inject a little extra HTML concatenated to our last column:

select id,
	-- (other meta columns here)
	'</td><tr><td></td><td colspan="3">'||text||'</td></tr><td>' text
from notes

We can force the text column to its own sub-row.

Here’s what’s happening:

  1. The first </td> tag closes the Apex opening table data tag.
  2. The next <tr> tag opens a new row.
  3. Then the next pair of table data tags <td></td> are used to fill in the first column under the Edit link (just for a clean look).
  4. Then we open a new table data tag <td colspan="3"> for our text column with a colspan="3" so it lays nicely across the entire report row.
  5. Then we close our column table data tag and this extra row </td></tr>.
  6. Finally, we open an extra table data tag <td>. Why? Because Apex will try to automatically close the original Text column into which we just inserted our custom HTML. This open tag will match the closing tag and keeps the HTML valid.
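Put together, the markup that reaches the browser for one report row ends up looking roughly like this (simplified sketch; Apex adds its own classes and attributes, and browsers are forgiving about the resulting tag nesting):

```html
<tr>
  <td>edit</td><td>1</td><td>Open</td>
  <td></td>             <!-- Apex's opening <td> for Text, closed by our first </td> -->
  <tr>                  <!-- our injected sub-row -->
    <td></td>
    <td colspan="3">The note text now spans its own sub-row.</td>
  </tr>
  <td></td>             <!-- our extra <td>, closed by Apex's original closing tags -->
</tr>
```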

One last detail, we are effectively removing the Text column as a column, so change the column title to null to hide it. Now let’s see how it looks:

Apex HTML Report Modified Layout

Hey, that’s way more readable. We can take this one step further and conditionally set the background color for each Text row based on the status, using the DECODE() function to add a custom class to the sub-row:

select id,
	'</td><tr><td></td><td class="'||decode(status,'Open','opennote','Closed','closednote')||'" colspan="3">'||text||'</td></tr><td>' text
from notes

And add some simple CSS to the page header:

<style type="text/css">
.opennote {background-color: #FFF4B5;}
.closednote {background-color: #A6D5F3;}
</style>

Now check out the report:

Improved Apex HTML Report Layout

I expect this technique will not work with an Interactive Report, but sometimes a plain old Classic region is all you need.

Edit: You may need to adjust the Report Attributes > Layout and Pagination options to suit your theme or particular needs. This technique should work with most Report Templates, but the most reliable template is the default HTML option; the other templates may affect the look and feel. Also, you may need to set the option to Strip HTML to No. For the above example I had this set to Yes and it still worked, but you might have better success setting it to No. Finally, this example was created in an Apex release prior to 4.0; I have not tried this in 4.0.

Report Settings

In our home workshop, we have a shared PC workstation running XAMPP to serve up some local PHP applications inside our home network. These apps are mostly convenience tools, nothing critical, but at the same time the MySQL database holds some data we want to back up - just in case. Backing up the MySQL database is easy with mysqldump, but having a copy of the data on the same machine really wasn’t sufficient; it needed to be pushed off site to another machine. So, my goal was to:

  1. Run mysqldump to get an export of the database.
  2. Load the resulting dump file to an off site server.
  3. Schedule this to happen every night.

The solution was to write a batch file and use PSCP to upload the file to our off site storage. mysqldump.exe is a very handy tool for creating a backup file of the database. You should read up on its many options, which I won’t go into here. One option you should definitely consider is compressing the dump output file. In my case the output file is less than 50KB, so I didn’t bother.

XAMPP is really a great development tool, and a decent micro web server for home use. It comes with PHP 5.3 and MySQL already installed, plus more goodies. Installing it is a snap, normally as simple as unzipping it into the root of C:\ (Windows of course, but there are versions for Mac OS X and Linux). But this post is not about using XAMPP, but about backing up a database.

First, create a temp directory somewhere; I created one as C:\temp. Next, if you don’t already have it, install PSCP, an SCP client (command line secure file copy). We’ll use this to copy the files to a remote Linux server (such as a web host). You can get PSCP from the PuTTY download page (look for pscp.exe). The install is simple: just save the pscp.exe executable to a new folder in Program Files. I put it where I already have PuTTY installed: C:\Program Files\Putty\pscp.exe. If you are running a 64-bit machine, you’ll probably put it in C:\Program Files (x86)\Putty. You’ll need to have SSH access to your remote server to use PSCP. If you don’t, contact your remote host administrator and ask for access.

On your off site server create a directory to hold the backup file. Depending on your host (we’re using a web host, and placing the folder outside of the public_html directory), you can use PuTTY and SSH in, or just use the C-Panel to login and create the new directory. You’ll definitely want to place this outside of the web root!

Now, create a batch file. I placed this at the root of the application I wanted to backup, but it doesn’t matter where it goes (well, mostly doesn’t matter). Another good place might be in the c:\temp directory. If you haven’t done this before, create a new text file, and change the name to backup_db.bat. In this batch file, we’ll add a few commands:

REM Backup database using mysqldump.exe.
REM Then send the backup file to remote web host using pscp for offsite storage.

REM To run this script, you need:
REM 1) To have pscp.exe installed
REM 2) Assumes you are using MySQL as part of an XAMPP install
REM 3) A temp directory c:\temp
REM 4) A target directory on your web host for the backup file storage

REM Set local variables here, this way you can easily update references in one place.
SET localdir=c:\temp
SET mysqldumppath=C:\xampp\mysql\bin\mysqldump.exe
SET mysqluser=root
SET mysqlpassword=<your_root_password>
SET mysqldatabasename=<your_database_name>

REM Uncomment the line below if you want to accumulate a new database backup file each night.
REM SET backupfilename=backup_%DATE:~10,4%%DATE:~4,2%%DATE:~7,2%.sql

REM Otherwise, this version of the backupfilename just gets over written each night.
REM (Comment out the line below if you use date-unique backup file names.)
SET backupfilename=db_backup.sql

REM If the path has spaces, use double quotes for the path.
SET pscppath="C:\Program Files\Putty\pscp.exe"
SET remotehost=<your_remote_server>.com
SET remoteuser=<ssh_user_name>
SET remotepassword=<ssh_user_password>
SET remotepath=<remote_absolute_path>

@REM Dump database.
%mysqldumppath% -u%mysqluser% -p%mysqlpassword% --opt --databases %mysqldatabasename% > %localdir%\%backupfilename%

@REM Upload new file using pscp to remote server
%pscppath% -pw %remotepassword% %localdir%\%backupfilename% %remoteuser%@%remotehost%:%remotepath%

You’ll need to replace any <your_name_references> with your actual names and paths. If the string has a space in it, wrap it in double quotes. You can also download a copy of this batch file here: Backup MySQL Database 1.

After making your reference changes, test the batch file by double clicking it, you should see a command line window briefly appear as PSCP connects to the remote server and uploads the file. You can verify this by checking that your C:\temp directory has a copy of the database dump file, and then check that your remote server has the same copy of the dump file. Almost there!

Finally, we can schedule this as a Windows task. Open Start > All Programs > Accessories > System Tools > Task Scheduler. Depending on which version of Windows you are using, you’ll see a slightly different Task Scheduler.

Creating a new task is fairly simple, give the task a name, select something to do, and set a repeating schedule, such as every night. On an XP machine, you’ll need to select an application to run. Just choose any application; after you create the task you can then edit the task and change the path to the full path and file name of your new batch file: c:\temp\backup_db.bat. On Windows 7 the wizard appears to let you just define a file to run.

That’s about it.

New: Version 1.3.3 Released September 22, 2014

There are many nice jQuery image gallery plugins available today. Yet I was still looking for one with the right combination of features, specifically one that supported a generous caption text/HTML block for each image. The MG Gallery jQuery Plugin has just that, plus:

  • Navigable thumbnails.
  • A nice crossfade image transition, or a sequential swap transition.
  • An optional title that does not cover or obscure the current image.
  • An optional but built in opacity setting for the thumbnails with hover over animation.
  • Minimal CSS, HTML, or classes.
  • Flexible thumbnail and image page arrangement.
  • All images gracefully fade in on page load.
  • The thumbnails should degrade nicely if the client browser has JavaScript disabled.
  • An optional slideshow with controls.
  • Image preloader for more responsive animation.
  • And a generous optional text/HTML block below the current image for more verbose captions.

Examples and Documentation

Download the Plugin at GitHub

MG Gallery Plugin

If you are not familiar with GitHub, just click the ZIP download button to get the most current version. If you are familiar with GitHub, feel free to fork it or improve it!

Using the MG Gallery Plugin

After linking to the plugin file in your header, the required HTML structure is very simple: just a div with an unordered list of thumbnail images wrapped in anchors to the larger images, and a single empty div to contain the current image. I specifically wanted the plugin to require the fewest HTML objects, classes, or IDs in the page source. The documentation has all the details on usage, so I won’t repeat myself here.
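As a rough sketch, that structure looks something like this (the id names and image paths here are placeholders; see the documentation for the names the plugin actually expects):

```html
<div id="thumbnails">
  <ul>
    <li><a href="images/photo1.jpg"><img src="images/thumbs/photo1.jpg" alt="Photo 1"></a></li>
    <li><a href="images/photo2.jpg"><img src="images/thumbs/photo2.jpg" alt="Photo 2"></a></li>
  </ul>
</div>
<div id="gallery"></div>
```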

I have tested the plugin in Firefox, Safari, Chrome, and IE7, 8, 9, 10, 11.

If you have questions or suggestions (or even if you just use the plugin), please let me know!