Audit Your Site's Performance

By James Steel on

4 minutes | Kirby | Performance | Slate

Site Performance

The Problem

With the increasing demands of mobile users, site speed is more important than ever. We see a lot of companies tackling this after the site has been completed.

Indeed, we even provide what we call an MOT service, which means we produce an audit of your current site and rectify as much as possible for you. If you would like us to do this for you, get in touch.

Trouble is, this is like trying to catch the horse after it's bolted. Surely there is a better way. It should really be part of your build process, and automated as much as possible. Let's explore how to go about this. We love Kirby CMS over here, but the ideas described in this post will work with most CMS systems, even if the method will differ a little.

The Solution

First things first, we need to know what to tackle. There are some obvious suspects like minifying code and optimising images (more on these later), but numbers are what we need, and numbers we shall get. We need to audit the site periodically during the build, and the best tool for this is sitespeed.io.

It will crawl your site, page by page, and measure its performance. Crucially, it also provides recommendations on how to improve things. Let's assume you have installed it globally and you're ready to make it part of your process.
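If you haven't installed it yet, the usual route is a global npm install (assuming a reasonably recent Node.js setup):

```shell
# Install the sitespeed.io CLI globally so the `sitespeed.io`
# command is available in your shell and in npm scripts.
npm install -g sitespeed.io
```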

NPM scripts are a really powerful way to add tools to your project without the whole Gulp/Grunt shooting match (we can feel another blog post coming on … ).

Let's add a config block to your package.json. This lets you store variables to save you typing things over and over. Add it above your scripts block like this, and give it two values that match your local and live domains:

"config": {
  "sitespeedtest": "http://hashandsalt.salt/",
  "sitespeedlive": ""
"scripts" {

Now we can add entries to the scripts section to actually run the command:

"scripts": {
  "speeddev": " $npm_package_config_sitespeedtest",
  "speedlive": " $npm_package_config_sitespeedlive"

If you run that command, you will notice that it only analyses the home page. What gives? We need some extra parameters:

```
sitespeed.io $npm_package_config_sitespeedtest -d 3
```

That will crawl from the home page to a link depth of three. We still have a problem, though. If it's a large site, particularly one with a blog using tags and categories, it can take a very long time to complete the crawl and test each page.

We can fix this by passing Sitespeed a list of pages to crawl via a text file. Create two files in your project root, urlstest.txt and urlslive.txt. Then you can feed these into the commands like this:

```
sitespeed.io $npm_package_config_sitespeedtest ./urlstest.txt
sitespeed.io $npm_package_config_sitespeedlive ./urlslive.txt
```

You can then specify the pages it should consider within those text files by placing one URL on each line.

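For example, a urlstest.txt for the dev domain above might look like this (the paths here are purely illustrative):

```
http://hashandsalt.salt/
http://hashandsalt.salt/blog
http://hashandsalt.salt/contact
```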
You will find the reports under a sitespeed-result folder in your project root. Take time to review the resulting report. There is a lot of information in there, but the most useful stuff is under the Coach tab, which can be found by clicking Pages along the top and drilling into the detail for a page. It will give advice on the things it finds; the English is a little strange in places, but try to take care of the things it highlights.

This can vary from site to site, but we will go over the biggest wins here. It's also a good idea to test against your live site, as it's probably running on different technology, and different settings are possible on live servers. You're aiming for a score of 90-100 in each section of Sitespeed.

You can also tweak the Sitespeed command so that it simulates a slow connection and/or a mobile device. It can run other browsers too, besides the default of Chrome. For details on how to do this, read the docs.


Before tackling the items reported by Sitespeed, it is probably worth doing a little housekeeping on your code base. The problem with Sass is that it allows you to nest rules, and it's easy to get a bit carried away.

The more you nest things, the longer the compiled selectors end up being. Take some time to reduce the nesting as much as possible. This will save a few kilobytes all by itself.
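As a quick sketch of the problem (the selectors are made up), deeply nested Sass compiles to long descendant selectors, while a flatter rule usually matches the same elements in far fewer bytes:

```scss
// Nested version: compiles to ".site-nav ul li a span { color: #fff; }"
.site-nav {
  ul {
    li {
      a {
        span { color: #fff; }
      }
    }
  }
}

// Flatter version: compiles to ".site-nav span { color: #fff; }"
.site-nav span { color: #fff; }
```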