Here you can keep up with the latest release of Checkbot for Chrome and find out about recent new features, improvements and bug fixes. For information about getting the latest version and checking which version you're currently running, see the FAQ page. Please file a bug report if you think you've found a bug, and feel free to contact us if you have any feature requests.
Version 0.4.0 (14 May 2018)
- New: "Set canonical URLs" report added, which checks that all pages have valid canonical URLs set. This catches a variety of issues, such as accidentally declaring more than one canonical URL per page, incorrectly using relative URLs, or pointing the canonical URL at a non-canonical page.
- Improved: The "Use compression" report now only recommends compressing URL responses larger than 1,000 bytes, as compressing small files can actually increase the response size (thanks Ryan!).
- Improved: Font size and spacing tweaks so more URLs are visible at once when viewing reports.
- Improved: Sped up table cell rendering to reduce lag when scrolling quickly.
- Improved: The duplicate page content and canonical URL reports have been moved together into a new subcategory called "Duplicate content".
- Improved: Added a sidebar link to a contact form for sending general comments and questions.
- Improved: The crawl timer now shows the number of hours elapsed, so the timer doesn't look like it's resetting to zero when one hour is reached.
- Improved: Added a "URLs" label to the "Explore" menu item for clarity.
- Fixed: The "Use HSTS preload" report no longer incorrectly flags 31,536,000 seconds as too low for the "max-age" setting. Thanks Tim!
- Fixed: When a URL response includes several headers with the same name, the values of all those headers are now shown when the "view headers" shortcut is used.
- Fixed: When you refresh a URL in a report table from a column other than the main "URL" column, that row now remains visible after the refresh completes.
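For reference, a page that passes the new canonical URL checks declares exactly one canonical tag with an absolute URL in its head. A minimal sketch (the domain and path are placeholders):

```html
<head>
  <!-- One canonical tag per page, using an absolute (not relative) URL -->
  <link rel="canonical" href="https://www.example.com/products/widget">
</head>
```

Declaring two canonical tags, using a relative href, or pointing at a page that itself canonicalises elsewhere are the kinds of mistakes this report flags.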
Version 0.3.3 (26 April 2018)
- Fixed: Crawls no longer sometimes fail if the start URL redirects. Thanks Marijo!
Version 0.3.2 (12 April 2018)
- Fixed: The CSS validator no longer takes a long time to check non-CSS files that are labelled as CSS files.
Version 0.3.1 (5 April 2018)
- Improved: Cleaned up the consistency of hover and select effects for menu items.
- Fixed: Disabled the browser spell checker on URL form fields.
- Fixed: Cached responses are no longer sometimes used when you ask Checkbot to refresh a URL.
Version 0.3.0 (14 March 2018)
- New: Bulk export: you can now export all reports in one click via a sidebar button. This generates a ZIP file containing a CSV file for each report.
- New: Pages that have a meta refresh URL are now treated as redirects instead of pages. This is more in line with how they're treated by humans and search bots, reduces false positives in several SEO reports, and improves reports that take redirects into account.
- New: Added a "Broken URLs" report to the "Explore" category. This combines the URLs from the 4xx, 5xx and connection error reports. A shortcut to this report has been added to the summary screen so you can quickly view all your broken URLs in one place.
- New: Redirect chains that are too long are now flagged as URLs that fail to load.
- New: The CSS validator now understands CSS variables, CSS grid and more vendor-specific selectors.
- New: If there's a problem fetching the start page when starting a new crawl, there's now a link you can use to quickly ask for help.
- Improved: There's now specific error feedback if you try to start a crawl with a URL that contains invalid characters.
- Improved: The search feature now only considers visible text, ignoring text hidden inside cells that represent lists of items.
- Improved: The "/page-not-found" URLs used to check for 404 responses are now hidden from SEO reports where they aren't relevant.
- Improved: Added a warning if you try to close the Checkbot tab while a crawl is in progress.
- Improved: Added a note to the "Use compression" report that antivirus software on your own machine that scans HTTP traffic has been known to disable HTTP compression before responses reach your browser.
- Improved: Clarified in the "Avoid duplicate resources" report text that cached responses cannot be used if URLs differ by protocol or query parameter order.
- Improved: Updated several report names for consistency and clarity.
- Improved: Timestamps in exported filenames are now more human readable, e.g. "2018-03-12 18.31.05".
- Fixed: On Windows, you no longer get a false warning that you're about to close the Checkbot tab when you export files.
- Fixed: The "Avoid duplicate pages" report was taking the page title into account, preventing duplicates from being detected.
- Fixed: Reports that find duplicates now show a sample of up to 10 duplicates per row. Previously, the upper limit wasn't always applied.
- Fixed: CSS validator error messages no longer mention the line number twice.
- Fixed: Meta refresh URLs were sometimes parsed incorrectly to include extra quotes and spaces.
- Fixed: The "Use 404 code for broken links" report no longer goes blank when you refresh URLs.
- Fixed: Redirect path lengths are no longer sometimes incorrect when several redirect paths overlap.
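For context, a meta refresh redirect (now treated the same as an HTTP redirect) is declared in a page's head like this; the target URL is a placeholder:

```html
<!-- Sends the visitor to the target URL after 0 seconds -->
<meta http-equiv="refresh" content="0; url=https://www.example.com/new-page">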
Version 0.2.0 (27 Feb 2018)
- New: In the crawl settings, you can now provide a list of regex patterns to tell Checkbot which URLs to ignore during crawls.
- New: Added distinct icons with tooltips for indexable, canonicalised and "noindex" pages to help you tell them apart.
- New: The sidebar now contains shortcuts for going to the FAQ page and for sending a bug report.
- Improved: URLs from "rel=canonical" tags are now crawled.
- Improved: Added more advice on how to find a solution if a site fails the "Return 404 for broken links" test.
- Improved: Increased the length of page "description" columns so long descriptions aren't clipped, making it easier to audit your text.
- Improved: Updated the button labels and tooltips to make it clearer that you can 1) "refresh" the report results for a single URL only and 2) "clear & recrawl" to wipe the current results and perform the same crawl again from scratch.
- Improved: Added a hint to the "Use HSTS preload" report that you should take care to capitalise "includeSubDomains" correctly when setting HSTS headers. Thanks to Vlad for pointing this out via the first ever bug report form submission (your email address didn't work!).
- Improved: Added some spacing around the "restore defaults" and "new crawl" buttons to reduce accidental clicks.
- Optimized: Improved the speed of switching between reports with many rows.
- Fixed: Relaxed the minimum browser window size restrictions, as these were stopping some users from seeing the "export" button.
- Fixed: Table columns now automatically resize when you resize the browser window.
- Fixed: Renamed the "20x", "30x", "40x" and "50x" reports to "2xx", "3xx", "4xx" and "5xx".
- Fixed: The table row alignment in the "Summary" view.
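For reference, an HSTS response header that satisfies the "Use HSTS preload" checks looks like this, with a "max-age" of one year (31,536,000 seconds) and "includeSubDomains" capitalised exactly as shown:

```http
Strict-Transport-Security: max-age=31536000; includeSubDomains; preload
```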
Version 0.1.48 (21 Feb 2018)
- New: Added a link to the release notes from the start page that changes colour when a new version has been installed.
- Improved: Whenever you close the Checkbot tab, the Checkbot extension restarts itself to force downloaded updates to be applied, making sure you've got the latest version.
- Improved: The "Hide server version data" report now ignores the "Server" header if no version number appears in the header value. This stops values like "Netlify" and "Cloudflare", which are too vague to matter, from being flagged.
- Improved: The headers flagged by the "Hide server version data" report are now shown in a list instead of a single row.
- Improved: The help text for "Use compression" now makes it clearer which URLs require compression.
Version 0.1.47 (14 Feb 2018)
- Improved: Reports with zero results have their result count dimmed to improve readability.
- Optimized: Large tables now scroll more smoothly.
- Fixed: The export button and dashboard score are now fully visible on smaller screens.
- Fixed: Pressing the enter key while a crawl is in progress no longer starts a new crawl.
Version 0.1.46 (8 Feb 2018)
- Improved: Increased the maximum recommended length of page meta descriptions for the "Use optimal length descriptions" report.
- Improved: CSV export filenames now include the report name, website hostname and a timestamp.
Version 0.1.44 (5 Feb 2018)
- Improved: The URL of the previously open tab is only used as the crawl URL on startup if that URL is not a Chrome Web Store page. This behaviour is less confusing for users who have just installed the extension and started it from the Store page.
Version 0.1.43 (5 Feb 2018)
- Improved: Expanded the help text for reports in the "Explore" category.
- Fixed: The extension would sometimes fail to load if you were offline during startup.
Version 0.1.39 (2 Feb 2018)
- New: An "Indexing" section has been added to the "Explore" category, including reports for "Indexable pages", "Non-indexable pages", "Pages without canonical tag", "Pages with canonical tag", "Canonicalised pages" and "Self-canonical pages".
- Improved: Duplicate title, page and description reports now only compare indexable internal pages instead of all internal pages.
- Fixed: Layout tweaks to support smaller screen sizes.