It's no revelation that a fast-loading website has a significant impact on your bounce rate, session duration and, consequently, your conversion rate. But exactly how much of an impact it can have has always been a mystery.

There is a strong correlation between site speed and revenue. If your pages take more than 5 seconds to load, you should be worried.

Walmart found that for every 1 second improvement in page load time, conversions increased by 2%.

Large ecommerce websites such as Amazon and Walmart have dedicated teams that monitor and improve site speed continuously. They have well-defined performance budgets that keep site speed in check.

What defines the performance budget?

  • Total page size
  • Number of HTTP requests
  • Max Image & JS size
  • First Contentful Paint (FCP) - The moment the first piece of content is painted in the browser. It could be text, an image, video or canvas.
  • Time to Interactive (TTI) - The moment the page becomes interactive and usable.
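A budget like this can be written down in machine-checkable form. As an illustration (all numbers are invented for the example), here is how it might look in the budget.json format that Google's Lighthouse understands, with resource sizes in KB and timings in milliseconds:

```json
[
  {
    "path": "/*",
    "timings": [
      { "metric": "first-contentful-paint", "budget": 2000 },
      { "metric": "interactive", "budget": 5000 }
    ],
    "resourceSizes": [
      { "resourceType": "script", "budget": 300 },
      { "resourceType": "image", "budget": 500 },
      { "resourceType": "total", "budget": 1500 }
    ],
    "resourceCounts": [
      { "resourceType": "total", "budget": 80 }
    ]
  }
]
```

Lighthouse will then flag any page that blows past these limits, which turns the budget from a slide in a deck into something your build can actually enforce.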

While page speed is often measured on the homepage alone, that doesn't take into account the user journey from landing on the homepage to entering payment information and completing the purchase. Each page type should have its own benchmarks. The checkout, for example, will almost always be faster than the homepage since it has almost no images and very few scripts and styles. Comparing the page speed of those two page types would be just silly.

Set benchmarks for each of these page types:

  • Homepage
  • Category Page
  • Product Page
  • Login Page
  • Blog Pages
  • Shopping Cart
  • Checkout

A 2017 study by Akamai found that a 100ms delay in site speed can drop your conversion rate by 7 percent, and a 2 second delay can increase bounce rates by 103 percent.

User experience and site performance lie at the forefront of conversion rate optimization. The longer users stay on your website, the likelier they are to make a purchase.

Google found that slowing its search results by just 100ms could cost it 8 million searches per day and millions of dollars in ad revenue.

There are some great tools out there to measure your site speed; GTMetrix and Pingdom are two of my favorites.

Let's look at some of the most common issues plaguing your site speed:

Compress Images: This is probably the lowest-hanging fruit in optimizing your website. Run your website through GTMetrix to find out if you have any uncompressed images. The tool will show you how much you'd save by compressing them and will also generate compressed versions of your images for you to download. Going forward, run your images through TinyPNG or Compressor before uploading.

Note: It's best to stick to JPEG or WebP to get the smallest file sizes without noticeable quality loss.

Minification: Another simple and quick fix for shaving off a few seconds. Minifying your HTML, CSS and JavaScript removes unnecessary white space, newline characters and comments to reduce file size without affecting functionality. If GTMetrix detects any unminified files on your page, it will generate minified versions that you can download and replace on your server.

Typically this process should be automated by your developer; there are plenty of libraries out there that minify code on the fly. Minification can reduce file size anywhere from 10% to 95%, depending on how the code was written.

Content Delivery Network (CDN): The closer a user is to the server, the lower the latency and the faster the data transfer. CDNs distribute your data across multiple servers in several regions around the globe so that data can be served from the location nearest to the user. They also work as a failsafe: if one server goes offline, data can be fetched from any of the others.

Lazyload: To improve perceived load time and enhance user experience, you can defer loading images below the fold by lazyloading them. If your website has a significant number of images, lazyloading will have a noticeable impact, especially on slower connections, and it could mean the difference between a user staying or bouncing.

You're going to need a developer to implement this since it requires a bit of coding to modify the markup. By far the best lazyload plugin I've used is Lazysizes by Alexander Farkas. It's also recommended by Google and includes some cool features such as intrinsic-ratio containers to preserve image dimensions, transparent/low-quality/blurry placeholder images, and lazyloading of background images.

Lazysizes is also SEO-friendly: it does not hide images from Google. It detects whether the user agent (e.g. Googlebot) is able to scroll; if not, it reveals all the images, allowing Googlebot to crawl and index them.
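For reference, here is a minimal sketch of Lazysizes markup (file paths are placeholders): you swap src for data-src, add the lazyload class, and the script upgrades each image as it approaches the viewport.

```html
<!-- Load the plugin; async is fine, it initializes itself -->
<script src="/js/lazysizes.min.js" async></script>

<!-- Use data-src instead of src and add the "lazyload" class -->
<img data-src="/images/product-photo.jpg" class="lazyload" alt="Product photo">
```

Because the browser never sees a src attribute up front, nothing below the fold is fetched until it's actually needed.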

Gzip Compression: If you see this flagged by GTMetrix, it needs to be right at the top of your optimization list. Quite simply, the Gzip compression algorithm finds and replaces duplicate data fragments, thereby reducing the total file size of your HTML, CSS, and JavaScript. Enabling Gzip compression can reduce file size by up to 90%.

Gzip needs to be enabled on your web server by editing the .htaccess (Apache) or nginx.conf (Nginx) file.
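On Apache, for instance, this usually means enabling mod_deflate. A minimal sketch of an .htaccess snippet, assuming your host allows overrides and has mod_deflate available:

```apache
<IfModule mod_deflate.c>
  # Compress text-based responses; formats like JPEG and MP4 are already compressed
  AddOutputFilterByType DEFLATE text/html text/css text/javascript
  AddOutputFilterByType DEFLATE application/javascript application/json image/svg+xml
</IfModule>
```

On Nginx, the equivalent is `gzip on;` together with a `gzip_types` list in your server configuration.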

Browser Caching: Each time the browser loads a page, it has to request files such as images, CSS and JavaScript from the server. Some of these files don't change between visits and can be reused in future sessions. This is where browser caching comes in: it stores files on the user's computer for future use, thereby reducing the page load time of subsequent sessions.

Your logo is the best example to consider. It's unlikely to ever change, so you can define all SVG files to be cached for one year. Similarly, you can have all other static assets cached, which not only saves megabytes of data but also reduces HTTP requests. Each file requested from the server is a separate HTTP request, and the more requests sent, the longer the page takes to load.

To define cacheable assets, you need to modify the .htaccess file on your server, which usually resides in the root folder. Edit the file with any text editor and define which file types the browser should cache and for how long:

<IfModule mod_expires.c>
  ExpiresActive On

  # Images
  ExpiresByType image/jpeg "access plus 1 year"
  ExpiresByType image/gif "access plus 1 year"
  ExpiresByType image/png "access plus 1 year"
  ExpiresByType image/webp "access plus 1 year"
  ExpiresByType image/svg+xml "access plus 1 year"
  ExpiresByType image/x-icon "access plus 1 year"

  # Video
  ExpiresByType video/mp4 "access plus 1 year"
  ExpiresByType video/mpeg "access plus 1 year"

  # CSS, JavaScript
  ExpiresByType text/css "access plus 1 month"
  ExpiresByType text/javascript "access plus 1 month"
  ExpiresByType application/javascript "access plus 1 month"

  # Others
  ExpiresByType application/pdf "access plus 1 month"
  ExpiresByType application/x-shockwave-flash "access plus 1 month"
</IfModule>

Once you're done editing, save the file as-is, without any extension, and upload it to the server, overwriting the existing file.

Note: Make sure to keep a backup of the original file before you overwrite just in case something goes wrong.

Defer/Async JavaScript: JavaScript is a parser-blocking resource, which means that if the parser encounters a script tag while building the DOM, it stops and waits until the JavaScript is executed. Since DOM construction happens before render, all scripts will be loaded and executed before you see anything in your browser. The reason the parser does this is that scripts can manipulate the page, and the browser cannot know what that change would be, so it assumes the worst-case scenario and stops everything until all scripts are executed.

To overcome this issue, we can either async or defer the JavaScript.

Async - The file is downloaded asynchronously without blocking DOM construction and executes as soon as the download is complete.

Defer - The file is downloaded asynchronously without blocking DOM construction but executes only after parsing is complete.

It's generally best to use async; however, in some cases a script may conflict with others and break, in which case you should defer it instead. Also, to optimize for FCP, move all non-critical inline CSS right above the </body> tag.
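In markup, each option is just an attribute on the script tag (file paths below are placeholders):

```html
<!-- async: downloads in parallel, executes the moment it arrives -->
<script src="/js/analytics.js" async></script>

<!-- defer: downloads in parallel, executes after parsing, in document order -->
<script src="/js/carousel.js" defer></script>
```

A good rule of thumb: async for independent scripts like analytics, defer for scripts that touch the DOM or depend on one another.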

Tip: Combine JavaScript files that are needed to render the page and defer the rest.

All in all, there are myriad factors that can affect your site speed. You need to identify and prioritize them; some fixes can be implemented right away while others require a bit of code change. Once you've made those changes and reached the sub-3-second mark, set up speed monitoring with GTMetrix or Pingdom to run tests every few hours or days and notify you of any fluctuations.

Besides the technical optimizations, there are also a couple of psychological tricks that can be applied to reduce perceived load time.

A great example of this is the Houston airport baggage claim problem. Passengers were constantly lodging complaints about long waits at baggage claim, so airport executives came up with an ingenious idea: they moved the arrival gates away from the main terminal and routed bags to the outermost carousel, making the walk to baggage claim six times longer. All of a sudden, complaints dropped to near zero. They didn't really solve the problem; it still took the same time for airport staff to process the bags and move them onto the conveyor belts. They just reduced the "perceived" wait time.

Similarly, perceived wait times can also be reduced on websites by implementing animated loading icons or optimizing the critical rendering path so that the content above the fold is prioritized. Anything is better than a blank white screen, isn't it?

Google has tons of resources on how to optimize your website and reduce load times. As of last year, they have taken a mobile-first approach and even made mobile page load speed a ranking factor in Google search results.

Users now expect the same level of performance from mobile websites as they do from native mobile apps. Pinterest did a complete overhaul of its mobile website and reduced perceived wait time by 40%, resulting in a 15% increase in sign-up conversions.

Google did a massive study on this, training a deep neural network on tons of bounce rate and conversion data, and found that as page load time goes from 1 second to 10 seconds, the probability of a mobile user bouncing increases by 123%. Think about all the money left on the table.

Amazon calculated that a page load slowdown of just one second could cost it $1.6 billion in sales each year.

Let's do some calculations of our own and use Google's impact calculator to put things into perspective.

We're going to assume 100,000 monthly users on our website with a 1% conversion rate and a $200 average order value. That puts our monthly revenue at $200,000. So what kind of impact would site speed have on our revenue in this scenario? According to Google, A LOT.

It's pretty self-evident, but allow me to elaborate.

By not optimizing our website we are losing $419,032 annually! Ouch.

Site speed is not a "technical" metric; it is a business metric with a direct correlation to your bottom line. As such, site speed should be a KPI on every ecommerce and digital marketing manager's or director's dashboard, and the requisite budget and resources should be assigned to optimizing it.