The Screaming Frog SEO Spider is a small desktop program you can install locally on your PC, Mac or Linux machine which spiders websites’ links, images, CSS, script and apps from an SEO perspective. It fetches key onsite elements for SEO, presents them in tabs by type and allows you to filter for common SEO issues, or slice and dice the data how you see fit by exporting into Excel. You can view, analyse and filter the crawl data as it’s gathered and updated continuously in the program’s user interface.
The Screaming Frog SEO Spider allows you to quickly crawl, analyse and audit a site from an onsite SEO perspective. It’s particularly good for analysing medium to large sites, where manually checking every page would be extremely labour intensive (or impossible!) and where you can easily miss a redirect, meta refresh or duplicate page issue.
The SEO Spider allows you to export key onsite SEO elements (URL, page title, meta description, headings etc) to Excel, so the data can easily be used as a base for SEO recommendations. Our video below provides a demonstration of what the SEO tool can do –
A quick summary of some of the data collected in a crawl includes –
By downloading, installing and using the Screaming Frog SEO Spider, you agree to the terms and conditions.
The standard ‘Lite’ version of the tool is completely free to download and use. However, this version is restricted to crawling a maximum of 500 URLs in a single crawl, and it does not give you full access to the configuration, saving of crawls or the custom source code search feature. You can crawl 500 URLs from the same website, or from as many websites as you like, as many times as you like though!
For just £99 per annum you can purchase an individual licence, which removes the 500 URL crawl limit and opens up the spider’s configuration options and custom source code search feature. Alternatively, hit the ‘buy a licence’ button in the SEO Spider to buy a licence after downloading and trialling the software.
By default the SEO Spider crawls sites like Googlebot (it obeys allow and disallow directives and supports wildcards in the same way), but it presents its own user-agent, ‘Screaming Frog SEO Spider’, and will obey any robots.txt directives written specifically for that user-agent. If there are no specific directives, it will crawl your site like Googlebot, while still presenting its own user-agent.
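As an illustration, a robots.txt file like the sketch below (the paths are hypothetical examples, not defaults) would give the SEO Spider its own rules, which it would follow in preference to the general Googlebot rules –

```
# Hypothetical example: directives specifically for the SEO Spider's user-agent.
# When this block exists, the SEO Spider follows it instead of the rules below.
User-agent: Screaming Frog SEO Spider
Disallow: /staging/

# General rules, which Googlebot (and the SEO Spider, if the block
# above were absent) would obey. Wildcards are supported.
User-agent: *
Disallow: /private/
Disallow: /*?sessionid=
```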
For more guidance and tips on how to use the Screaming Frog SEO crawler –
If you have any problems, feedback or feature requests for the SEO Spider, then please contact us via our support. We have lots planned and currently in development!