JavaScript: Pages with Blocked Resources

Pages with resources (such as images, JavaScript and CSS) that are blocked from rendering by robots.txt or an error.

This filter will only populate when JavaScript rendering is enabled (blocked resources will appear under ‘Blocked by Robots.txt’ in default ‘text only’ crawl mode).

This can be an issue, as search engines may not be able to access critical resources required to render pages accurately.

How to Analyse in the SEO Spider

Enable JavaScript rendering mode via ‘Config > Spider > Rendering’ and select ‘JavaScript’ to crawl JavaScript websites.

View URLs with this issue in the ‘JavaScript’ tab and ‘Pages with Blocked Resources’ filter.

Blocked resources can be viewed by URL in the lower ‘Rendered Page’ tab, or in bulk via ‘Response Codes > Blocked Resource’.

They can be exported in bulk via ‘Bulk Export > JavaScript > Pages with Blocked Resources’.

Read our tutorial on ‘How To Crawl JavaScript Websites’.

What Triggers This Issue

This issue is triggered when pages have resources such as images, JavaScript, and CSS that are blocked from rendering due to restrictions in the robots.txt file or because of an error.

For example, a directive in the robots.txt file such as:

Disallow: /wp-includes/

would block search engines from crawling any resources whose URLs begin with:

https://www.screamingfrog.co.uk/wp-includes/

such as:

https://www.screamingfrog.co.uk/wp-includes/js/jquery/jquery.min.js
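To sanity-check whether a given resource URL is caught by a directive like the one above, Python’s standard-library robots.txt parser can be used. This is a minimal sketch for illustration, not how the SEO Spider itself evaluates rules, and its matching is simpler than Google’s, but it handles a plain Disallow rule like this one:

from urllib.robotparser import RobotFileParser

# Parse the example rules directly rather than fetching a live robots.txt
parser = RobotFileParser()
parser.parse([
    "User-agent: *",
    "Disallow: /wp-includes/",
])

# The jQuery file from the example falls under /wp-includes/, so it is
# reported as blocked for any user agent, including Googlebot
url = "https://www.screamingfrog.co.uk/wp-includes/js/jquery/jquery.min.js"
print(parser.can_fetch("Googlebot", url))  # False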

How To Fix

Update the robots.txt file and resolve any errors so that all critical resources can be crawled and used to render the website’s content. Resources that are not critical (e.g. a Google Maps embed) can be ignored.
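If only specific paths under a disallowed directory are render-critical, more specific Allow directives can open them up while keeping the broader block in place. The following is a sketch based on the /wp-includes/ example above (the exact paths are illustrative and will vary by site). Google applies the most specific matching rule, so the longer Allow paths take precedence over the shorter Disallow:

User-agent: *
Allow: /wp-includes/js/
Allow: /wp-includes/css/
Disallow: /wp-includes/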
