Google Site Speed Optimisation

I was just reading Matt Cutts' blog and he reminded me about Google's site speed factors, which may now be affecting page rank in search results. This led me to jump into Webmaster Tools for my primary site, http://www.simple1300numbers.com.au, and check its load time. It's 3.3 seconds, which according to Google puts it in the top 50% of web pages. It also lists some suggestions on what I can do to improve it, mainly some compression to reduce transfer time.

Just a quick disclaimer: back up your site before doing this. I accidentally deleted my CSS and JavaScript files a few times, so play it safe. Also, make sure you correct your paths as needed.

I decided to do what I could to improve the load time. According to my own results in Firebug's Net panel, the front page takes 4.1 seconds to load.

I grabbed Page Speed from the link above; here are the results and the points I address from the report:

80/100
1. Leverage browser caching
The following resources are missing a cache expiration. Resources that do not specify an expiration may not be cached by browsers. Specify an expiration at least one month in the future for resources that should be cached, and an expiration in the past for resources that should not be cached.
How to fix this? Enable caching. I use a Plesk 9.2 server running Apache 2.2, so I added the following to my .htaccess:

# Enable Caching
# 30 days (60*60*24*30) + 1 second
<FilesMatch "\.(js|css)$">
Header set Cache-Control "max-age=2592001, public"
</FilesMatch>

# 366 days (60*60*24*366) + 1 second
<FilesMatch "\.(ico|pdf|flv|jpg|jpeg|png|gif|swf)$">
Header set Cache-Control "max-age=31622401, public"
</FilesMatch>
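
To check that the headers are actually coming back, you can ask curl for just the response headers. The URL below is only an example; point it at any css, js or image file on your own site:

# ask for the headers only and look for the Cache-Control line
curl -sI http://www.example.com/css/default.css | grep -i "cache-control"
# expected output, roughly:
# Cache-Control: max-age=2592001, public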

2. Enable compression, Combine external JavaScript, Minify JavaScript

I addressed all these points at once. The process is: concatenate the JavaScript and CSS files, minify them, then compress (gzip) them before uploading to the production server.

I make use of a little bash script for uploading my website from development to production, which currently looks like this:

# connection details for the production server
src_path=/full/path/to/dev/site/
server_ip=example.com
username=example.com
password=secret
dst_path=/httpdocs/

# mirror --reverse pushes the local ./site directory up to the server,
# skipping any .svn directories
lftp -c "set ftp:list-options -a;
open ftp://$username:$password@$server_ip;
lcd ./site;
cd $dst_path;
mirror --reverse --use-cache --verbose --allow-chown --allow-suid --no-umask --parallel=2 --exclude-glob .svn"
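
One handy extra: lftp's mirror also takes a --dry-run flag, which only lists what it would transfer. Something along these lines (same variables as above) lets you preview a deploy before pushing for real:

# preview the upload without actually transferring anything
lftp -c "open ftp://$username:$password@$server_ip;
lcd ./site;
cd $dst_path;
mirror --reverse --dry-run --exclude-glob .svn"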

I then added the following to the start of the deploy script:

# combine and gzip the css (only one stylesheet at the moment, so the "combine" is just a copy)
rm -f site/css/combined.css.gz
cp site/css/default.css site/css/combined.css
gzip -c site/css/combined.css > site/css/combined.css.gz

# combine, minify and gzip the javascript
rm -f site/javascript/combined.js.gz
cat site/javascript/flash.js site/javascript/default.js site/javascript/prototype.js site/javascript/ypslideoutmenus.js > site/javascript/pre_combined.js
php ~/dev/jsmin.php site/javascript/pre_combined.js > site/javascript/combined.js
gzip -c site/javascript/combined.js > site/javascript/combined.js.gz
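
To see what the minify and gzip steps are actually saving, gzip -l reports the compressed versus uncompressed sizes of the files the script just produced:

# compressed vs. uncompressed sizes of the combined files
gzip -l site/css/combined.css.gz site/javascript/combined.js.gz
# minified size next to the raw concatenation
ls -lh site/javascript/pre_combined.js site/javascript/combined.js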

The deploy additions above create a few files for us, most importantly css/combined.css.gz and javascript/combined.js.gz. I now update the website's template to use these files if we're in production mode. The minification is done by a local file I adapted for my use, ~/dev/jsmin.php: the original is here, and my custom version is here (rename it). It's the JSMin compressor, with a slight modification (the last few lines) so it runs in CLI mode.

<?php if(ENV == 'production'){ ?>
<link rel="stylesheet" href="./css/combined.css" type="text/css" media="screen"/>
<script type="text/javascript" src="./javascript/combined.js"></script>
<?php }else{ ?>
<link rel="stylesheet" href="./css/default.css" type="text/css" media="screen"/>
<script type="text/javascript" src="./javascript/flash.js"></script>
<script type="text/javascript" src="./javascript/default.js"></script>
<script type="text/javascript" src="./javascript/prototype.js"></script>
<script type="text/javascript" src="./javascript/ypslideoutmenus.js"></script>
<?php } ?>

You'll notice we don't reference the gzipped files directly above; that way the browser falls back to the plain versions if it doesn't support gzip compression. Lastly, we need to enable the compression in Apache, which we do through our .htaccess:

# enable gzip for css & javascript
AddEncoding gzip .gz

# keep the correct content types on the pre-compressed files
<FilesMatch "\.css\.gz$">
ForceType text/css
</FilesMatch>
<FilesMatch "\.js\.gz$">
ForceType text/javascript
</FilesMatch>

RewriteEngine on

# serve the .gz version when the browser accepts gzip and the file exists
RewriteCond %{HTTP:Accept-encoding} gzip
RewriteCond %{HTTP_USER_AGENT} !^Mozilla/4 [OR]
RewriteCond %{HTTP_USER_AGENT} MSIE
RewriteCond %{REQUEST_FILENAME}.gz -f
RewriteRule ^(.*)\.css$ $1.css.gz [QSA,L]

RewriteCond %{HTTP:Accept-encoding} gzip
RewriteCond %{HTTP_USER_AGENT} !^Mozilla/4 [OR]
RewriteCond %{HTTP_USER_AGENT} MSIE
RewriteCond %{REQUEST_FILENAME}.gz -f
RewriteRule ^(.*)\.js$ $1.js.gz [QSA,L]

AddOutputFilterByType DEFLATE text/html text/plain text/xml text/javascript text/css

# Netscape 4.x has some problems...
BrowserMatch ^Mozilla/4 no-gzip

# MSIE masquerades as Netscape, but it is fine
BrowserMatch \bMSIE !no-gzip !gzip-only-text/html

# NOTE: Due to a bug in mod_setenvif up to Apache 2.0.48
# the above regex won't work. You can use the following
# workaround to get the desired effect:
BrowserMatch \bMSI[E] !no-gzip !gzip-only-text/html

# Make sure proxies don't deliver the wrong content
#Header append Vary User-Agent env=!dont-vary
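
Once that's in place, curl makes it easy to confirm the rewrite is doing its job: with the Accept-Encoding header set you should see Content-Encoding: gzip come back, and without it you should get the plain file (substitute your own host and paths):

# with gzip support advertised, the pre-compressed file should be served
curl -sI -H "Accept-Encoding: gzip" http://www.example.com/javascript/combined.js | grep -iE "content-(type|encoding)"
# without it, the uncompressed file should come back (no Content-Encoding: gzip)
curl -sI http://www.example.com/javascript/combined.js | grep -iE "content-(type|encoding)"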

At this point, the Net tab in Firebug tells me my load time is 2.81 seconds, and I'm pretty happy with that! I'll leave these changes in place and check what Webmaster Tools says in a few days.
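
For a rough command-line spot check in the meantime, curl can time the fetch of the HTML document itself (it doesn't pull in images, css or javascript, so it will always read lower than the Firebug figure):

# time just the HTML document fetch
curl -s -o /dev/null -w "lookup: %{time_namelookup}s  total: %{time_total}s\n" http://www.simple1300numbers.com.au/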

When I next get some time, I'll check out WordPress and see what plugins/hacks are available to boost its speed.
