Welcome to the Checkbot Web Developer Guide, a collection of over 50 web best practices that will help your site rank higher in search results, resist attackers and load faster in browsers. These best practices are based on recommendations from web experts such as Google, Mozilla, the W3C, OWASP and Yahoo, which we've split into three categories: SEO (Search Engine Optimisation), website speed and website security. For each best practice, we explain why it's important, show how to follow it and provide links for further reading. Select one of the guides below to get started, or browse the complete table of contents, which lists all the best practices together.

The guides

Table of contents

All the best practices contained in our guides are shown below.

Checkbot for Chrome can crawl your site to automatically check you're following
the 50+ web best practices from our SEO, security and speed guides.

Learn more

SEO Guide

The Checkbot SEO Guide covers everything you need to know to optimise the on-page SEO of your website. By following this guide, your pages will rank higher in search results, users will be more likely to click your links in search listings and visitors will get more out of your page content. We’ll cover topics such as how to use HTML to structure your pages in a machine readable way, best practices for writing human readable URLs and guidelines on how to configure your site to be search crawler friendly.

Page titles

Every page on your site should be given a concise, informative and unique title to improve your search rank and search result click rates.
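
As a minimal sketch, a page title is set with the `<title>` element in the page's `<head>` (the title text and site name here are placeholder examples):

```html
<head>
  <!-- Unique, descriptive title: page topic first, site name last (example values) -->
  <title>Organic Tomato Growing Guide – Example Garden Co</title>
</head>
```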

Page descriptions

Every page on your site should be given an informative, concise and unique description.
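
Page descriptions are set with a `meta` tag in the `<head>`; search engines may show this text as the snippet under your link in results. The description text below is an illustrative example:

```html
<head>
  <!-- Unique per-page summary that search engines may display in result listings -->
  <meta name="description"
        content="Step-by-step guide to growing organic tomatoes at home, from seed to harvest.">
</head>
```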

Page content

Pages should contain substantial, unique and high-quality content that works well on mobile devices and has accessibility in mind.

URL names

Each page should have a well-written URL that is short, accurate and friendly for humans to read.
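
For illustration (the `example.com` URLs are placeholders), compare a human-friendly URL with an opaque one:

```
https://example.com/guides/grow-organic-tomatoes   <- short, readable, hyphen-separated words
https://example.com/index.php?id=83&cat=7          <- opaque parameters, hard to read and share
```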

Code validation

HTML, CSS and JavaScript files should be valid to avoid issues that may impact search engines and visitors.

Broken links

Your site should be free of broken links and should return a 404 response status code for missing pages so crawlers can detect them.

Robots.txt

Every subdomain on your site should have a robots.txt file that links to a sitemap and describes any crawler restrictions.
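
A sketch of a minimal `robots.txt` (the domain and disallowed path are example values):

```
# Served at https://example.com/robots.txt (example domain)
User-agent: *
# Keep private areas out of search results
Disallow: /admin/
# Point crawlers at your sitemap
Sitemap: https://example.com/sitemap.xml
```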

Redirects

Redirects are used to signal that the URL of a page has changed. They should be used carefully, as redirects can influence page rank.

Web Speed Guide

The Checkbot Web Speed Guide will teach you how to create high performance web pages that download in less time and start rendering more quickly. Website performance is a crucial factor for increasing conversions as fast pages keep visitors engaged and get them to stay on your site for longer. Search engines even treat page speed as a ranking signal because they know fast websites are important for a good user experience. To accelerate your site, we’ll guide you on ways to reduce the amount of data that needs to be sent to the browser, how to take advantage of browser caching, techniques that make pages render more quickly and tips on avoiding redirects that can slow down browsing.

Page size

A key factor in making pages faster is to reduce the size of each page and their resources using compression and minification.
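
As one hedged example, servers such as nginx can gzip-compress text resources with a few directives (the values below are illustrative, not recommendations for every site):

```
# nginx sketch: compress text-based resources before sending them to browsers
gzip on;
gzip_types text/css application/javascript application/json image/svg+xml;
gzip_min_length 1024;  # skip tiny responses where compression gains little
```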

Caching

Caching should be used to decrease server load and reduce the amount of data browsers need to download while browsing your site.
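
For instance, a long-lived `Cache-Control` header suits static assets whose filenames change when their content changes (the filename and max-age below are example values):

```
# Example response header for a fingerprinted asset such as /app.3f2a1b.css:
# browsers may reuse the cached copy for a year without re-requesting it
Cache-Control: max-age=31536000, immutable
```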

CSS

CSS delivery should be optimised by avoiding inline CSS and the use of `@import`.
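
A small sketch of the difference (stylesheet paths are placeholders):

```html
<!-- Prefer <link> elements in <head>: the browser can download stylesheets in parallel -->
<link rel="stylesheet" href="styles/main.css">

<!-- Avoid @import inside CSS files: it serialises stylesheet downloads and delays rendering
     @import url("theme.css"); -->
```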

JavaScript

Take care not to block page rendering when you need to include JavaScript in pages.
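
For example, the `defer` attribute lets the browser download a script in parallel and run it only after the HTML has been parsed (script filenames here are placeholders):

```html
<!-- Non-blocking: downloads in parallel, executes after HTML parsing finishes -->
<script src="app.js" defer></script>

<!-- Blocking: HTML parsing halts until this script downloads and executes -->
<script src="analytics.js"></script>
```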

Redirects

Following redirects can significantly slow down network requests so you should avoid using page and resource URLs that trigger redirects.

Web Security Guide

The Checkbot Web Security Guide will teach you how to harden the security of your website to reduce attack vectors, protect user privacy and prevent data leaks. Website security is critical whether you handle payments or not, as successful attacks can still damage your brand and lead to your site visitors being exploited. Browser vendors know that security is important to users as well, which is why browsers now show prominent warnings when insecure pages are detected. Search engines are also trying to encourage website owners to secure their sites by treating HTTPS as a ranking signal. This guide will teach you how to secure your website by setting up HTTPS and securing password forms, as well as how to harden your site against common exploits known as XSS, clickjacking and content sniffing attacks.

HTTPS

HTTPS prevents attackers from reading and modifying data sent between your site and browsers. HTTPS should be considered a minimum security requirement for all websites.
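
Once HTTPS is set up, plain-HTTP traffic is typically redirected to it. A hedged nginx sketch (the domain is a placeholder):

```
# Redirect all plain-HTTP requests to the HTTPS version of the site (example domain)
server {
    listen 80;
    server_name example.com;
    return 301 https://example.com$request_uri;
}
```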

HSTS

HTTP Strict Transport Security (HSTS) is a response header that improves security by instructing browsers to always use HTTPS instead of HTTP when visiting your site.
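
An example of the header (the one-year `max-age` is a common choice, not a requirement):

```
# Tells browsers to use HTTPS for this site for the next year (max-age is in seconds)
Strict-Transport-Security: max-age=31536000; includeSubDomains
```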

Content sniffing

A content sniffing attack typically involves tricking a browser into executing a script that is disguised as another file type. These attacks can be prevented with correctly configured response headers.
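
The relevant header is small enough to show in full:

```
# Instructs browsers to trust the declared Content-Type rather than guessing (sniffing) it
X-Content-Type-Options: nosniff
```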

General

Response headers should be configured to restrict iframe usage, prevent XSS exploits and to hide server configuration data.
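
A sketch of two such headers (the policies shown are illustrative starting points, not universal recommendations):

```
# Only allow this page to be embedded in iframes on the same origin (clickjacking defence)
X-Frame-Options: SAMEORIGIN
# Restrict where scripts, styles and other resources may load from, mitigating XSS
Content-Security-Policy: default-src 'self'
```

Hiding server configuration data usually means stripping or blanking headers such as `Server` and `X-Powered-By` that reveal software versions to attackers.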
