Posted 22 January, 2015 in Screaming Frog SEO Spider

10 Features In The SEO Spider You Should Really Know

I wanted to write a quick post about some of the features in our Screaming Frog SEO Spider tool that are a little hidden away and quite often missed, even by experienced users.

Some of these are actually my favourite features in the tool and have saved me countless hours, so hopefully a few of them will help you, too.

1) List Mode Is Unlimited

Some of this post will cover features only available with an SEO Spider licence. However, did you know that ‘list mode’ in the free lite version isn’t limited to 500 URLs like regular ‘spider’ mode? It’s actually unlimited.

list mode

So you can upload 10k URLs and crawl them without needing to buy a licence.

Update – Please note, this has changed since this post was published in 2015. List mode is now limited to 500 URLs without a licence.

2) Auditing Redirects In A Migration

This is by some distance my personal favourite feature due to the amount of time it has saved.

I used to find it a pain to audit redirects in a site (and/or domain) migration, checking to ensure a client had set up permanent 301 redirects from their old URLs to the correct new destinations.

Hence, we specifically built a feature which allows you to upload a list of your old URLs, crawl them and follow any redirect chains (with the ‘always follow redirects’ tickbox checked) until the final target URL is reached (whether that’s a ‘no response’, 2XX, 4XX or 5XX), then map them out in a single report to view.

This report doesn’t just include URLs which have redirect chains; it includes every URL in the original upload and its response in a single export, alongside the number of redirects in each chain and whether there are any redirect loops.
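If you ever want to sanity-check the same idea outside the tool, the logic is straightforward to sketch. Here’s a rough Python example (illustrative only, not how the SEO Spider works internally; the URLs and hop limit are made up):

```python
# A rough sketch of redirect-chain auditing, using Python 3 and the
# third-party 'requests' library. Not the SEO Spider's implementation.
from urllib.parse import urljoin

import requests

def follow_chain(url, max_hops=10):
    """Follow a redirect chain hop by hop, recording each step."""
    chain = [url]
    for _ in range(max_hops):
        try:
            resp = requests.get(url, allow_redirects=False, timeout=10)
        except requests.RequestException:
            return chain, 'no response'
        if resp.status_code in (301, 302, 303, 307, 308):
            url = urljoin(url, resp.headers.get('Location', ''))
            if url in chain:
                return chain + [url], 'redirect loop'
            chain.append(url)
        else:
            return chain, resp.status_code  # 2XX, 4XX, 5XX etc.
    return chain, 'too many redirects'

# Map each old URL to its final destination, as the report does
for old in ['https://example.com/old-page']:
    chain, final = follow_chain(old)
    print(old, '->', chain[-1], f'({final}, {len(chain) - 1} redirects)')
```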

Click on the tiny incomprehensible image below to view a larger version of the redirect mapping report, which might make more sense (yes, I set up some silly redirects to show how it works!) –

redirect chains report export

You can read more about this feature in our ‘How to audit redirects in a site migration’ guide.

3) The Crawl Path Report

We often get asked how the SEO Spider discovered a URL, or how to view the ‘in links’ to a particular URL. Well, generally the quickest way is by clicking on the URL in question in the top window and then using the ‘in links’ tab at the bottom, which populates the lower window pane (as discussed in our guide on finding broken links).

But sometimes it’s not that simple. For example, there might be a relative linking issue causing infinite URLs to be crawled, and you’d need to view the ‘in links’ of ‘in links’ (of ‘in links’, etc.) many times to find the originating source. Or perhaps a page wasn’t discovered via an HTML anchor, but via a canonical link element.

This is where the ‘crawl path report’ is very useful. Simply right-click on a URL, go to ‘export’ and ‘crawl path report’.

crawl path report

You can then view exactly how a URL was discovered in a crawl and its shortest path (read from bottom to top).

crawl path export

Simple.
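For the curious, a crawl path is essentially just a chain of ‘discovered from’ parents. A loose Python illustration of the idea (assumed logic with made-up URLs, not the tool’s actual code):

```python
# Illustrative sketch only: a crawler that records where each URL was
# first discovered from can rebuild a crawl path by walking those
# parents back to the start URL. All URLs below are made up.
parents = {
    'https://example.com/': None,  # the start URL has no parent
    'https://example.com/blog/': 'https://example.com/',
    'https://example.com/blog/post-1': 'https://example.com/blog/',
}

def crawl_path(url):
    """Walk the 'discovered from' map back up to the start URL."""
    path = []
    while url is not None:
        path.append(url)
        url = parents[url]
    return path  # target first, start URL last (read bottom to top)

for step in crawl_path('https://example.com/blog/post-1'):
    print(step)
```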

4) AJAX Crawling

I’ve been asked quite a few times when we will support crawling of JavaScript frameworks such as AngularJS. While we don’t execute JavaScript, we will crawl a site which adheres to the Google AJAX crawling scheme.

You don’t need to do anything special to crawl AJAX websites, you can just crawl them as normal. We will fetch the ugly version and map it to the pretty version, just like Google.
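For reference, the scheme just swaps the ‘#!’ in a pretty URL for an ‘?_escaped_fragment_=’ query parameter. A minimal Python sketch of that mapping (the escaping here is simplified; the full scheme has its own rules):

```python
# Minimal sketch of the AJAX crawling scheme's URL mapping, turning a
# pretty '#!' URL into its ugly '_escaped_fragment_' equivalent.
from urllib.parse import quote

def pretty_to_ugly(url):
    if '#!' not in url:
        return url
    base, fragment = url.split('#!', 1)
    sep = '&' if '?' in base else '?'
    return base + sep + '_escaped_fragment_=' + quote(fragment, safe='')

print(pretty_to_ugly('https://example.com/page#!state=home'))
# -> https://example.com/page?_escaped_fragment_=state%3Dhome
```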

AJAX Crawling

You can view all of this under the ‘AJAX’ tab, obviously.

There are also ‘with hash fragment’ and ‘without hash fragment’ filters for this tab. These can be useful to identify AJAX pages which only use the meta fragment tag and hence require Google to double crawl (crawl the page, see the tag, then fetch the ugly version), which can put extra load on your servers.

5) Page Title & Meta Description Editing Via The SERP Snippet Emulator

We developed a SERP snippet emulator in the lower window tab, which allows you to edit page titles and meta descriptions directly in the SEO Spider.

It can be pretty challenging to work out pixel widths in Excel, so you can just edit them directly in the tool and they will update in the interface to show you the pixel width and how they might be displayed in the search results.

serp snippet editing

Any changes made are automatically saved in the tool (unless you use the ‘reset’ button), so you can make as many edits as you like, then export and send directly to the client to approve or upload when you’ve finished.
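If you do ever need to approximate pixel widths in a script, the basic idea is just summing per-character widths for the SERP font. A very rough Python sketch, with made-up widths (these are illustrative guesses, not Google’s actual font metrics, and the ~512px desktop title limit is approximate for early 2015):

```python
# Very rough sketch of pixel-width estimation for a SERP title.
AVG_WIDTH = 7  # fallback width in px for characters not listed below
CHAR_WIDTHS = {' ': 4, 'i': 3, 'l': 3, 'j': 3, 'm': 11, 'w': 10,
               'M': 12, 'W': 13}

def estimate_pixel_width(text):
    return sum(CHAR_WIDTHS.get(ch, AVG_WIDTH) for ch in text)

title = 'Screaming Frog SEO Spider Tool & Crawler Software'
width = estimate_pixel_width(title)
print(f'{width}px -', 'may be truncated' if width > 512 else 'fits')
```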

6) SERP Snippets By Device

We all know that 2015 (like every year for the last 5+ years) is the ‘year of the mobile’, so did you know that pixel width cut-off points are different when viewed on mobile and tablet devices than on desktop?

You can switch device type to get a better idea of how they might display across all devices.

serp snippet on mobile

The ‘Mobile Friendly’ prefix will be available in our next iteration of the SEO Spider due for release shortly (I am using a beta in the screenshot).

7) SERP Mode

While we are on the subject of SERP snippets, if you switch to ‘SERP mode’, you can upload page titles and meta descriptions directly into the SEO Spider to calculate pixel widths (and character lengths!). There is no crawling involved in this mode, so you can test them before they are put live on a website.

serp mode

8) XML Sitemap Auditing

Did you know that you don’t need to convert your XML sitemap into a list of URLs for us to crawl it? You can simply save the XML sitemap and upload it in list mode, and we will crawl the XML format natively.

xml sitemap crawling

A quick and easy way to ensure you don’t have a dirty sitemap and avoid the search engines reducing their trust in it.
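Outside the tool, this kind of check boils down to parsing the <loc> elements and testing each response. A small Python sketch (not the SEO Spider’s own parser; the sitemap URL is a placeholder):

```python
# Small sketch of an XML sitemap audit: parse out the <loc> URLs and
# flag anything that doesn't return 200 OK. Assumes the third-party
# 'requests' library; namespace per the sitemaps.org protocol.
import xml.etree.ElementTree as ET

import requests

NS = {'sm': 'http://www.sitemaps.org/schemas/sitemap/0.9'}

root = ET.fromstring(requests.get('https://example.com/sitemap.xml',
                                  timeout=10).content)
for loc in root.findall('.//sm:loc', NS):
    url = loc.text.strip()
    status = requests.head(url, allow_redirects=False,
                           timeout=10).status_code
    if status != 200:
        print(status, url)  # a 'dirty' entry worth fixing
```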

9) Canonical Errors

By default the SEO Spider will crawl canonicals for discovery and these URLs will appear as usual within the respective tabs. However, there’s also a canonical errors report under the ‘reports’ menu, which can be easy to miss and provides a quick summary of canonical related errors and issues.

canonical errors

The report will show canonicals which have no response, a 3XX redirect, or a 4XX or 5XX error (anything other than a 200 ‘OK’ response), and highlight any URLs discovered only via a canonical that are not linked to internally from the site’s own link structure (shown as ‘true’ in the ‘unlinked’ column).
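The first of those checks is easy to reason about in code. A hedged Python sketch (the regex is a crude stand-in for a real HTML parser, and the page URL is a placeholder):

```python
# Sketch of a basic canonical check: pull rel="canonical" from a page
# and test the canonical target's response code.
import re

import requests

html = requests.get('https://example.com/some-page', timeout=10).text
match = re.search(
    r'<link[^>]+rel=["\']canonical["\'][^>]*href=["\']([^"\']+)',
    html, re.I)
if match:
    canonical = match.group(1)
    status = requests.head(canonical, allow_redirects=False,
                           timeout=10).status_code
    if status != 200:
        print(f'Canonical error: {canonical} returned {status}')
```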

10) Crawling Authenticated Staging Sites

This feature isn’t very obvious, but essentially the SEO Spider supports basic and digest authentication, meaning you can crawl staging sites or development servers which require a login to access.

Sites in development will often be blocked via robots.txt as well, so make sure this is not the case, or use the ‘ignore robots.txt’ configuration, then insert the staging site URL and crawl.

authentication

At this point an authentication box will pop up automatically (as shown above) and ask you to enter the username and password to crawl.
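For comparison, this is the same kind of challenge a scripted crawler answers by supplying credentials. A minimal Python sketch with placeholder details:

```python
# Minimal sketch of fetching a protected staging URL with basic or
# digest credentials via 'requests'. The URL, username and password
# are placeholders; the SEO Spider prompts for these in its own dialog.
import requests
from requests.auth import HTTPBasicAuth, HTTPDigestAuth

url = 'https://staging.example.com/'
resp = requests.get(url, auth=HTTPBasicAuth('user', 'pass'), timeout=10)
if resp.status_code == 401:
    # some servers expect digest rather than basic authentication
    resp = requests.get(url, auth=HTTPDigestAuth('user', 'pass'),
                        timeout=10)
print(resp.status_code)
```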

Your Turn

Are there any hidden away features I’ve missed? If so, please do share them in the comments.

We hope to have a new version of the Screaming Frog SEO Spider ready for release within the next two weeks, so expect a lot more to come, too.