How To Use The Screaming Frog SEO Spider Tool To Audit Backlinks

Graeme Radford

Posted 22 September, 2011 by Graeme Radford in Screaming Frog SEO Spider

For those of you who like to keep an eye on your backlinks, checking that they are still where they should be can be a laborious task. Perhaps you have a list of links that need checking to confirm they are still in place, or that they have been removed after a link audit. How can you be sure they are where they should be?

Well, it’s easy to use the SEO Spider to do this in bulk! Please note, the custom search feature outlined below is only available to licensed users. Here is a quick guide to show how this can be done:

Step 1 – List Your URLs

Gather the full list of URLs you want to check. Make sure you use the full URL (including http:// or https://) of each page that should contain the backlink to your site. Save the list as a single .txt or .csv file, or have it copied ready to be pasted.
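If you want to sanity-check the list before uploading it, the formatting rule above (full URLs only) is easy to script. A minimal Python sketch — the file name backlinks.txt in the usage comment is just an assumption:

```python
# Sketch: flag URL-list entries missing the http:// or https:// scheme,
# since the SEO Spider's list mode expects full URLs.
def check_url_list(lines):
    """Split raw file lines into (valid, invalid) URL lists."""
    valid, invalid = [], []
    for line in lines:
        url = line.strip()
        if not url:
            continue  # ignore blank lines
        if url.startswith(("http://", "https://")):
            valid.append(url)
        else:
            invalid.append(url)
    return valid, invalid

# Hypothetical usage:
# with open("backlinks.txt") as f:
#     valid, invalid = check_url_list(f)
# print(f"{len(valid)} OK, {len(invalid)} need fixing")
```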

Step 2 – Configure The Custom Source Code Filter

Open up the SEO Spider and navigate to ‘Configuration > Custom > Search’ via the top level navigation.

In the custom filter configuration window, you have several options available. I’m sure you’ll figure out what works best for you; however, my preference would be to select ‘Does Not Contain’ in the filter 1 dropdown, then enter your website URL in the text input field.

[Screenshot: ‘does not contain’ custom search filter]

Please see our user guide for more information about the custom source code search feature.
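Conceptually, the ‘Does Not Contain’ filter is just a substring check against each page’s HTML source. A rough Python equivalent, for anyone who wants to see what is being tested (standard library only; the example URL and timeout are illustrative):

```python
from urllib.request import urlopen

def contains_backlink(html: str, needle: str) -> bool:
    """Mimic the custom search: does the page source contain the string?"""
    return needle in html

def page_contains(url: str, needle: str) -> bool:
    """Fetch a page and apply the check (makes a network request)."""
    with urlopen(url, timeout=10) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    return contains_backlink(html, needle)

# A page is flagged by 'Does Not Contain' when the check fails:
source = '<a href="https://www.screamingfrog.co.uk/">SEO Spider</a>'
print(contains_backlink(source, "screamingfrog.co.uk"))  # True: link present
```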

Step 3 – Upload The URL List

On the top toolbar, click ‘mode’ and select ‘list’. Next, click ‘from a file’ and browse to where you saved your .txt or .csv of URLs in Step 1, or just paste in the URLs you have copied.

The file reader will tell you how many URLs it found in the file you uploaded. If it says “0”, go back, take another look at the list of URLs, and check they are all formatted correctly.

[Screenshot: uploading your backlink list]

Once you’ve clicked ‘OK’, the SEO Spider will automatically start crawling the uploaded list of URLs.

Check out our user guide for more information on ‘list mode’.

Step 4 – Prepare & Crawl

Click the ‘Custom’ tab. Select your chosen filter from the filter drop-down on the left, and view the data in real-time!

Step 5 – Review

In the example set out above, the output will hopefully be blank, which means that all the backlink locations do contain my site link. However, if some URLs are listed, such as in my example below, you can chase up why your link is no longer present. It looks like we don’t have a link from the Moz homepage!
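If you’d rather script the whole audit end to end, the five steps boil down to: read the list, fetch each page, and report any page whose source doesn’t contain your domain. A standalone sketch under those assumptions (single-threaded, no retries; a failed fetch is reported as missing, since a non-200 response can’t prove the link exists — the file name and domain in the usage comment are placeholders):

```python
from urllib.request import urlopen
from urllib.error import URLError

def fetch(url):
    """Return page source, or None on any error (no response, 4XX, 5XX...)."""
    try:
        with urlopen(url, timeout=10) as resp:
            return resp.read().decode("utf-8", errors="replace")
    except URLError:
        return None

def find_missing(pages, needle):
    """Given {url: html_or_None}, return URLs whose source lacks the needle.

    A failed fetch (None) is reported too: it cannot prove the link exists,
    so it should be rechecked rather than assumed present.
    """
    return [url for url, html in pages.items()
            if html is None or needle not in html]

# Hypothetical usage over the saved list from Step 1:
# urls = [u.strip() for u in open("backlinks.txt") if u.strip()]
# pages = {u: fetch(u) for u in urls}
# print(find_missing(pages, "yourdomain.com"))
```

An empty result here corresponds to the blank output described above: every checked page still carries the link.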

[Screenshot: no backlinks found]

I hope this is useful. Now it’s your turn: what different ways do you use the SEO Spider?

Graeme has been in the digital marketing industry for more than 12 years, having previously founded, grown and sold one of Europe’s largest digital marketing agencies. Graeme believes that ‘search’ is both fascinating and rewarding to work on and similarly rewarding for clients.

46 Comments

  • Matt Wilson 12 years ago

    It is not really useful when you leave off a huge detail!!

    “Note: These options are only available to licensed users.”

    Just wasted 10 minutes. THANK YOU!

    Reply
    • screamingfrog 12 years ago

      Hi Matt,

      Apologies for being such a big waste of your time.

      I have included a note in the blog post to make this clear.

      Thanks,

      Dan

      Reply
      • dasd 6 years ago

        :D

        Reply
  • Jeff 11 years ago

    Very helpful guys, thanks. I’ve been using Scrapebox to do the same thing, but have always questioned its accuracy, so having a 2nd tool to run the query just means I’ll hopefully get the most accurate info. It also opens up a lot more possibilities that I hadn’t thought of for what I can be using Screaming Frog for, so thanks for that!

    Reply
    • screamingfrog 11 years ago

      Hey Jeff,

      Good to hear it’s helping out! We use this method internally for clients and it saves a ton of time!

      Cheers.

      Dan

      Reply
  • Brit 11 years ago

    Hi Dan! Just wondering if there was a way to view the pages that are linking to the discovered 404 errors? I can see the number of inlinks and outlinks but haven’t figured out how to find the specific source linking to the broken page…

    Thanks so much!
    Brit

    Reply
  • Jack Reid 11 years ago

    Great post Graeme.

    One thing to remember is to check the response codes tab too!

    The custom tab will only show URLs that return a 200 response code, which hopefully most will. However, it is well worth checking for other response codes in the other tabs.

    Cheers,
    Jack

    Reply
    • screamingfrog 11 years ago

      Good shout Jack.

      Obviously, the SEO Spider can only determine whether the URL being crawled ‘contains’ or ‘does not contain’ the search string ‘domain.com’ if the URL returns a 200 response.

      If ‘no response’, a 3XX, 4XX or 5XX is returned, the URL will not appear under the custom tab, because the SEO Spider does not know whether it contains the string or not. Hence, these should all be checked and either recrawled or simply treated as a ‘does not contain’.

      Reply
  • Adrian Bold 11 years ago

    Thanks for this post Graeme.

    I knew Screaming Frog was a fantastic tool for checking your own site URLs but hadn’t appreciated it could handle this requirement too. What a brilliant ‘bonus’! :-)

    Reply
  • David 11 years ago

    Graeme, thanks for this useful blog post.
    I was wondering if there are plans to enhance SF to crawl a site (or part of a site) and get information on backlinks too, in order to produce a list of URLs ordered by number of referring domains or backlinks… I currently do this with Ahrefs and it does the job well, but I then find myself bringing that data into SF, so it’d be great to have everything done with one single tool. Perhaps I am asking for too much :)
    cheers

    Reply
    • screamingfrog 11 years ago

      Hi David,

      It’s certainly been considered internally by us for a long time and I agree the functionality would be really useful.

      I can’t tell you when it might be available though unfortunately! :-) But it’s on the ‘todo’.

      Thanks for the feedback, appreciated.

      Dan

      Reply
  • Katherine Johnson 11 years ago

    Thanks for the great post.
    So much to learn.

    Reply
  • Emily Jenifer 10 years ago

    I had been using other expensive paid services to check my clients’ link profiles. This fantastic tool has saved me those monthly subscriptions.

    Reply
    • screamingfrog 10 years ago

      Hi Emily,

      Great to hear it’s helping out :-)

      Thanks,

      Dan

      Reply
  • Nathan 10 years ago

    Hello. We are close to purchasing but I cannot get the “List Mode” to work. Entering a URL via Spider gives me results, but entering the same URL via List gives me 404’s or Connection Timeout errors.

    Also, is there a way to search solely for images if we need to? i.e. filtering out links, metadata etc.

    Reply
    • screamingfrog 10 years ago

      Hi Nathan,

      List mode will provide the exact same responses as a regular crawl, so it must be the way you uploaded the URLs, timing or something else locally.

      You can filter for images using the filter in the ‘Internal’ tab, but obviously it would be impossible to crawl only images, as any web crawler works by crawling hyperlinks. The spider wouldn’t be able to discover images if it couldn’t crawl links; and if it could crawl links but only ‘record’ images in the UI, it wouldn’t be any quicker, as it still has to crawl them all.

      Hopefully that makes sense.

      I notice you’ve already contacted support, much easier to discuss there anyway.

      Cheers.

      Reply
  • Mathew 10 years ago

    Hey,

    I’m trying to crawl a list of 100K+ directory backlinks for one of my clients.

    However, trying to upload the list, either takes too long for my patience to allow, or does not work.

    Note: I’ve bumped the memory allocation for both Java and Screaming Frog to 10GB (on a 16GB machine), thinking it might help. To no avail.

    Any tips would be great.

    Thanks,
    Mathew

    Reply
    • screamingfrog 10 years ago

      Hi Mathew,

      For any support queries, it’s much quicker & easier just speaking directly with our support.

      Hence, please can you send through the details (and log files) to support as described here –

      https://www.screamingfrog.co.uk/seo-spider/support/

      Uploading a list shouldn’t take any time, it should be pretty much instant.

      If memory was the issue, it would be during the crawling phase that you’d receive a warning. So sounds like something for us to take a look at!

      Thanks,

      Dan

      Reply
  • Carl Reed 9 years ago

    You guys are great, I’m using this technique (in reverse) to filter through a huge list of toxic links so I can find only the live links before submitting a disavow request to Google.

    (I’m using the “contains” custom filter rather than the “does not contain”)

    Reply
  • Marlon 9 years ago

    We were toying with the idea of using Google AdWords scripts to send us email notifications of ads containing broken landing pages. We couldn’t get this to work.

    So, I downloaded all my ads with their destination URLs, opened SF, selected Mode>list, then ran the query and in seconds, I was able to see 404s! Nice…actually, better than nice – awesome!

    Thanks for such an invaluable tool.

    Reply
  • Krishnan 9 years ago

    We tried to use list mode by loading a couple of website URLs. We are trying to identify site-wide broken links rather than just the broken links on a single page. The tool gave crawl output for just the two home pages instead of the full websites.

    Does list mode support full website crawls for the URLs we provide, or just the listed pages? Or am I missing something?

    Cheers,
    Krishnan

    Reply
    • screamingfrog 9 years ago

      Hi Krishnan,

      You should really be using regular ‘spider’ mode if you want to identify sitewide broken links. You could open up two instances of the SEO Spider and crawl two sites at the same time, for example.

      In list mode, the only URLs crawled are those in the list, so the SEO Spider wouldn’t crawl onwards to find sitewide broken links. You could go into the configuration and untick the ‘limit search depth’ option (which is automatically applied when in ‘list’ mode) to crawl the rest of each website though.

      Cheers.

      Dan

      Reply
  • Gianfranco 9 years ago

    Hi guys, I have a domain that is only a test version, not a live site yet, but I can’t run Screaming Frog on it. The site is not live yet, but it does have a lot of pages. Why does it not work with sites that are not live?

    Reply
  • Chris K 9 years ago

    Guys, everything about your software is brilliant, but one aspect is rubbish: WHY on God’s green earth does list mode read the file in the correct order of URLs but then proceed to crawl them at RANDOM? My problem is that once the crawl is done and I export it back for use, it’s not in the order I expect it to be. Needless to say this wouldn’t be a big problem if the URLs were sorted alphabetically etc., but if they aren’t, then what am I to do? Can you please let me know A) why this is the way it is, and B) if you can fix it?

    Thanks much!!!!!

    Reply
    • screamingfrog 9 years ago

      Hi Chris,

      The spider is multi-threaded and essentially takes the path of least resistance in list mode; some URLs will respond quicker than others, etc. :-)

      We’ve had this mentioned before and I agree it would be useful to be able to crawl in the same order as uploaded (or at least, convert back to that order). So we have this on our ‘todo’ list!

      But for now, you will have to use VLOOKUP or perhaps sort your other data alphabetically first and then combine etc.

      Cheers.

      Dan

      Reply
  • PH 9 years ago

    This is awesome. Uncovered a lot here. So many features with this paid license and so well worth the money.

    Reply
  • Kim D 8 years ago

    Does this tool check the inlinks? I have a webpage W and would like to know the pages containing a link pointing to W. Thanks!

    Reply
    • screamingfrog 8 years ago

      Hey Kim,

      It checks the inlinks from your own site. It doesn’t check external backlinks from other websites (I’d recommend Majestic, Moz, Ahrefs etc).

      Cheers.

      Dan

      Reply
  • Chelsea 8 years ago

    I have followed these instructions and I’m still getting 0 results. My URL is correct.

    Reply
  • Bidyut Bikash Dhar 8 years ago

    Hey, it’s awesome. I searched for this kind of tool for a while, but every link checker I tried wasn’t up to the mark or didn’t cover the whole process. Your step-by-step procedure and the tool helped me. Thank you so very much.

    Reply
  • Shifa Chottani 8 years ago

    Hi ScreamingGraeme,

    Is there a way to get a free trial? I need to test this feature once before going ahead with the registered version.

    Hoping for a quick and positive response.

    Kind Regards,

    Shifa Chottani

    Reply
    • screamingfrog 8 years ago

      Hey Shifa,

      Thanks for the comment.

      We don’t offer a free trial I am afraid, but we do offer a 30 day guarantee if you’re unhappy with the SEO Spider.

      Cheers.

      Dan

      Reply
      • Shifa Chottani 8 years ago

        Thanks for the quick response. Really appreciate it.

        Are there any T&Cs applicable to the 30 day guarantee? Kindly post a link.

        Thanks

        Reply
  • Rahul Yadav 5 years ago

    I absolutely love Screaming Frog. It’s the first step of my SEO audits, which reach into 80 to 100 point checklists across many other core SEO analysis tools. Great free tool and even better when you buy it!

    Reply
  • Alex Beige 2 years ago

    Still a really useful article years later! Thank you Dan!

    Reply
  • pouvoir 4 months ago

    Screaming Frog is an indispensable tool in my SEO toolkit. It serves as the cornerstone of my audits, providing a comprehensive analysis that sets the stage for a thorough examination of my site’s SEO health. The fact that it’s free makes it even more appealing, and the premium version is well worth the investment for the extra features and enhanced capabilities. It’s not just a tool; it’s a game-changer that simplifies and accelerates the SEO auditing process. Highly recommend

    Reply

Leave A Comment.

Back to top