The Checkbot Page Speed Guide will teach you how to create high-performance web pages that download in less time and start rendering more quickly. Website performance is a crucial factor in increasing conversions: fast pages keep visitors engaged and encourage them to stay on your site for longer. Search engines even treat page speed as a ranking signal because they know fast websites are important for a good user experience. To accelerate your site, we'll show you how to reduce the amount of data that needs to be sent to the browser, take advantage of browser caching, make pages render more quickly and avoid redirects that can slow down browsing.

When you speed up service, people become more engaged - and when people become more engaged, they click and buy more.

Google, “The Google Gospel of Speed - Think with Google”

Page size

A key factor in making pages faster is reducing the size of each page and its resources using compression and minification.

Next to eliminating unnecessary resource downloads, the best thing you can do to improve page-load speed is to minimize the overall download size by optimizing and compressing the remaining resources.

Google, “Optimizing Encoding and Transfer Size of Text-Based Assets”

Does your site follow SEO, speed & security best practices?  🤔  Our browser extension can check 100s of pages against 50+ page factors for you in a few clicks.  🎉  We're trusted by 80,000 active users and have a 4.9/5 rating.

Test your website now with Checkbot.

Use compression

Configure your server to send data in a compressed format to reduce transfer times. For compressible files, compression can reduce the amount of data that needs to be sent by around 70% for only a small amount of configuration effort. Text-based data formats such as HTML, CSS, JavaScript, plain text, XML, JSON and SVG should almost always be sent with compression enabled. However, we recommend only compressing responses above 1,000 bytes in size as compressing small files can actually increase the response size and compression has a server CPU overhead. To confirm your server is sending a URL response in a compressed format, you should 1) verify the Content-Encoding response header is being returned and 2) check the value of that header is set to the name of a compression scheme such as gzip, deflate or br. If you're checking from your own machine and it runs antivirus software, be aware there have been cases where HTTP scanning features disable compression before responses reach your browser. Also, it isn't unusual to see misconfigured servers that compress some compressible file types like HTML but forget others like CSS, so be on the lookout for this.
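As a quick check, you can request a compressible file and look at the response headers. A sketch of what a healthy exchange looks like (the file name and the gzip scheme are illustrative; your server may use br or deflate instead):

  GET /styles.css HTTP/1.1
  Host: example.com
  Accept-Encoding: gzip, deflate, br

  HTTP/1.1 200 OK
  Content-Type: text/css
  Content-Encoding: gzip
  Vary: Accept-Encoding

If the Content-Encoding header is missing from a text-based response like this, compression is not being applied to that file type.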

Avoid recompressing data

Compression should only be applied to data that can be compressed to avoid wasting server resources. Common non-text-based data formats like JPG, PNG, MP4 and PDF files are already stored in a compressed format so you don’t need to compress files like these again when your website serves them. Trying to compress already compressed data will consume server resources and can even result in increased file sizes. Your server should be configured to only compress resources that can be effectively compressed such as HTML, CSS, JavaScript and SVG files. To check for this, you need to confirm the Content-Encoding response header is used to specify a compression scheme only for compressible files. The header should be omitted for already compressed files.
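For example, a correctly configured server would respond along these lines (file names are illustrative): the already-compressed JPG is sent without a Content-Encoding header, while the CSS file is gzipped.

  HTTP/1.1 200 OK
  Content-Type: image/jpeg

  HTTP/1.1 200 OK
  Content-Type: text/css
  Content-Encoding: gzip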

Use minification

Minify CSS and JavaScript files to reduce page weight. Minification works by removing or transforming the content of these files in a way that preserves the behaviour of the code but reduces the file size. For example, comments and whitespace are easy candidates for removal. CSS and JavaScript file sizes can typically be reduced by around 30% using minification. Although compression offers greater file size reductions, minification combined with compression will result in even smaller files. This is because minification can remove data from files whereas compression must preserve all the data.
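As an illustration of what a minifier does, the CSS below (an invented example) shrinks to the single line underneath it once the comment, whitespace and redundant units are stripped:

  /* Main navigation bar */
  .nav-bar {
      background-color: #ffffff;
      margin: 0px;
  }

  .nav-bar{background-color:#fff;margin:0}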

Avoid inline source maps

Take care that your minification process does not inline source maps into your JavaScript or CSS files. Source maps are used to help developers debug minified files by providing a mapping from minified code statements back to the original unminified code. Minification tools will either store the source map in an external file or inline the source map into the minified file itself. As source map data can take up even more space than the minified code, inline source maps defeat the file-shrinking purpose of minification. Inline source maps should only be used during development and you should use external source maps for production builds. You can check if a JavaScript or CSS file contains an inlined source map by looking for a comment near the end of the file containing sourceMappingURL=data: (the comment starts with //# in JavaScript files and /*# in CSS files).
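For example, the last line of a minified JavaScript file will look something like one of the following (the base64 payload is truncated and the file name is illustrative). The first form embeds the whole source map and should be avoided in production; the second only references an external .map file:

  //# sourceMappingURL=data:application/json;base64,eyJ2ZXJzaW9uIjozLCJzb3VyY2VzIjpb...

  //# sourceMappingURL=app.min.js.map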

Caching

Caching should be used to decrease server load and reduce the amount of data browsers need to download while browsing your site.

Fetching something over the network is both slow and expensive. Large responses require many roundtrips between the client and server, which delays when they are available and when the browser can process them, and also incurs data costs for the visitor. As a result, the ability to cache and reuse previously fetched resources is a critical aspect of optimizing for performance.

Google, “HTTP Caching”

Use caching

Enable caching of your page resources so browsers can reuse resources that have already been downloaded. The Cache-Control response header is used to specify the caching policy of each URL: the no-store setting prevents all browser caching and the no-cache setting forces the browser to check with the server whether a cached version is out of date before using the cached copy. For page resources, we recommend that no-cache and no-store are not set. This configuration means a browser can cache resources and immediately reuse them when needed without having to contact the server again until the cached copy expires. Warning: You must have a strategy for how updated versions of your page resources are going to replace the cached versions. Generally, each updated resource should use a new URL to force browsers to fetch the new version. This is referred to as “cache busting” and is built into many web frameworks.
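A minimal sketch of the cache-busting pattern, assuming a build step that adds a content hash to each file name (the hash and paths here are illustrative): the page always links to the current hashed file, and the response for that file allows long-lived caching because its contents will never change under that URL.

  <link rel="stylesheet" href="/assets/app.d41d8cd9.css">

  HTTP/1.1 200 OK
  Content-Type: text/css
  Cache-Control: max-age=31536000

When the stylesheet changes, the build produces a new hash (and so a new URL) and browsers fetch the new file automatically.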

Use long caching times

Configure page resources to have long caching times so browser caches will retain them for longer. The cache duration of each resource URL can be specified by either 1) setting an Expires response header which specifies the point in time the response becomes stale, such as Expires: Sat, 10 Aug 2019 20:00:00 GMT, or 2) adding a max-age directive to the Cache-Control response header that specifies the number of seconds the response is valid for, such as Cache-Control: max-age=3600 for 1 hour. If max-age and Expires are both used, max-age takes priority. We recommend setting the cache time of page resources to at least 24 hours.
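For example, either of these responses allows the browser to reuse the file without contacting the server again, the first for 24 hours from when it was received and the second until the given (illustrative) expiry date:

  HTTP/1.1 200 OK
  Cache-Control: max-age=86400

  HTTP/1.1 200 OK
  Expires: Sun, 11 Aug 2019 20:00:00 GMT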

Avoid duplicate resources

The same page resource should always be served from the same URL to improve caching efficiency. Browsers will only reuse a cached resource if the resource is requested from the exact same URL as before. If the same resource is available over multiple URLs, this can lead to extra browser requests. For example, say a browser cached a resource from http://example.com/jq.js and later had to request identical content from these URLs while browsing: https://example.com/jq.js (different protocol), http://www.example.com/jq.js (extra “www” in the hostname), http://example.com/jquery.js (different filename), http://example.com/libs/jq.js (extra folder), http://example.com/jq.js?v=3.2&a=1 (added query parameters) and http://example.com/jq.js?a=1&v=3.2 (different query parameter order). As none of the URLs are exact matches, the same content would need to be downloaded when each URL was first seen as a cached response from a different URL cannot be used instead. For caching to happen, URLs must have matching protocols, filenames, folders, query parameters and even query parameter ordering. Avoid caching issues like this by making sure each unique page resource is referenced using a single consistent URL.

CSS

CSS delivery should be optimised by avoiding inline CSS and avoiding the use of @import.

Before the browser can render content it must process all the style and layout information for the current page. As a result, the browser will block rendering until external stylesheets are downloaded and processed, which may require multiple roundtrips and delay the time to first render.

Google, “Optimize CSS Delivery”

Avoid excessive inline CSS

Prefer CSS in external files over inlining large amounts of CSS into pages to improve caching efficiency. CSS can be inlined using style attributes and <style> tags. This can be convenient during development but any CSS that is shared between pages won’t be cached by the browser. However, Google does promote inlining just enough CSS at the start of each page for the top portion of the page to render immediately, which makes browsing feel faster. Google’s AMP project for mobile also promotes inlining: each page should have all the CSS it needs inlined to reduce requests, as long as only a modest amount of CSS is involved. We recommend you keep the amount of inlined CSS per page under 50,000 bytes.
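A sketch of this approach (the selector and file name are illustrative): a small block of critical CSS is inlined in the head so the top of the page can render straight away, while the bulk of the styling stays in an external, cacheable stylesheet.

  <head>
    <style>
      /* Critical CSS for above-the-fold content only */
      .site-header { background: #fff; height: 60px; }
    </style>
    <link rel="stylesheet" href="/css/main.css">
  </head>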

Avoid CSS @import

Avoid using @import in CSS files as this prevents parallel loading of CSS. Inside a CSS file, @import can be used to include the contents of another CSS file by specifying a URL. This can be convenient but impacts download times because browsers can only start fetching the imported URL after the CSS file containing the @import has been fetched. To load CSS files in parallel, you should instead add a <link rel="stylesheet" href="…"> tag to the HTML of the page for each CSS file you need to load.
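For example, instead of chaining the downloads inside the first stylesheet with @import (file names are illustrative):

  /* styles.css – base.css can only start downloading after styles.css arrives */
  @import url("base.css");

reference both files directly from the page so the browser can fetch them in parallel:

  <link rel="stylesheet" href="/css/base.css">
  <link rel="stylesheet" href="/css/styles.css">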

JavaScript

Take care not to block page rendering when you need to include JavaScript in pages.

Before the browser can render a page it has to build the DOM tree by parsing the HTML markup. During this process, whenever the parser encounters a script it has to stop and execute it before it can continue parsing the HTML. In the case of an external script the parser is also forced to wait for the resource to download, which may incur one or more network roundtrips and delay the time to first render of the page.

Google, “Remove Render-Blocking JavaScript”

Avoid render-blocking JavaScript

External JavaScript should be included on pages in a way that doesn’t block page rendering. A <script src="…"> tag will block HTML rendering until the specified JavaScript file has been fetched and finished executing. Inline JavaScript also blocks rendering until execution is complete. You can stop <script> tags from blocking rendering by placing them directly before the closing </body> tag. Alternatively, for external JavaScript files you can load the script in the background using either 1) <script defer src="…">, which delays script execution until the HTML has been fully parsed, or 2) <script async src="…">, which executes the script as soon as it has loaded. Note that defer scripts execute in the order they appear on the page, like inline scripts. However, async scripts execute whenever they finish downloading, so their execution order can change. This difference is important when scripts have dependencies.
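A sketch of the two non-blocking forms (file names are illustrative): the dependent scripts use defer so their relative order is preserved, while the independent analytics script uses async.

  <head>
    <!-- defer: downloads in the background, executes in document order after parsing -->
    <script defer src="/js/jquery.js"></script>
    <script defer src="/js/app.js"></script>

    <!-- async: downloads in the background, executes as soon as it arrives -->
    <script async src="/js/analytics.js"></script>
  </head>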

Avoid excessive inline JavaScript

Avoid inline JavaScript to improve page rendering times and caching efficiency. JavaScript code can be inlined directly into pages with the <script> tag, which can be convenient during development but comes with the downsides that 1) inline JavaScript will block HTML rendering until the JavaScript code is parsed and executed and 2) if the code is shared between pages it won’t be cached. You should instead include JavaScript as external files with <script src="…"> tags in a way that defers loading. However, inlining small scripts can be beneficial when avoiding an extra request is more important for performance than caching. We recommend each page includes no more than 10,000 bytes of inline JavaScript.
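For example, a tiny snippet that must run before anything renders can stay inline, while the rest of the code loads as a deferred external file (the snippet and file name are illustrative):

  <script>
    /* Small inline script: adds a class early so styling can react before app.js loads */
    document.documentElement.className += ' js';
  </script>
  <script defer src="/js/app.js"></script>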

Redirects

Following redirects can significantly slow down network requests so you should avoid using page and resource URLs that trigger redirects.

Redirects trigger an additional HTTP request-response cycle and delay page rendering. In the best case, each redirect will add a single roundtrip (HTTP request-response), and in the worst it may result in multiple additional roundtrips to perform the DNS lookup, TCP handshake, and TLS negotiation in addition to the additional HTTP request-response cycle. As a result, you should minimize use of redirects to improve site performance.

Google, “Avoid Landing Page Redirects”

Avoid page redirects

To speed up browsing between pages on your site, avoid hyperlinks to URLs that perform redirects. For example, say you had a hyperlink pointing to /news which now redirects to /updates because the page was moved. A user clicking the hyperlink would experience a noticeable page loading delay while the redirect was followed. You can avoid delays like this by hyperlinking directly to the redirect destination.
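The cost of the redirect is an extra request/response round trip before the real page can even begin to load, as this sketch of the exchange shows:

  GET /news HTTP/1.1
  Host: example.com

  HTTP/1.1 301 Moved Permanently
  Location: /updates

The browser must now issue a second request for /updates; linking straight to /updates avoids the first round trip entirely.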

Avoid resource redirects

Avoid loading page resources via URLs that perform redirects as redirects will slow down page loading. For example, if a CSS file was loaded using the URL example.com/styles.css that then redirects to www.example.com/styles.css, the redirect would introduce a delay when the file was fetched. This can be fixed by linking directly to the redirect destination.

Avoid redirect chains

When a redirect must be used, use a single redirect instead of a chain of several redirects for a faster response. For instance, a redirect chain such as http://example.com → https://example.com → https://www.example.com can be common when redirect rules haven’t been optimised. Chains of redirects add significant delays to fetching URLs while browsing, and search bots may give up following long chains, which can impact search rankings. You should optimise your server redirection rules to eliminate chains of redirects when it isn’t possible to eliminate redirects entirely.
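For example, a request for the bare http:// URL should be answered with a single redirect straight to the final destination rather than going via https://example.com first (a sketch of the desired response):

  GET / HTTP/1.1
  Host: example.com

  HTTP/1.1 301 Moved Permanently
  Location: https://www.example.com/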