How to Audit Core Web Vitals Using the SEO Spider
You can use the Screaming Frog SEO Spider and PageSpeed Insights API to easily measure Core Web Vitals (CWVs) for each page on your website.
Below is a brief introduction to Core Web Vitals and how to integrate them into your crawl results.
What Are Core Web Vitals?
Core Web Vitals are a set of real-world metrics Google use to measure key aspects of user experience when loading a webpage.
The three metrics Google use are:
Largest Contentful Paint
Largest Contentful Paint (LCP) is a measure of the overall loading speed of a page – the faster the page loads the better. It’s marked at the point in the load timeline when the largest image or text block in the viewport has rendered.
Below you can see an illustration of a page load, with the First Contentful Paint (FCP) and Largest Contentful Paint (LCP):

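To see which element is being measured, you can also observe LCP candidates on a live page directly in the browser. Below is a minimal sketch using the PerformanceObserver API (‘largest-contentful-paint’ entries are currently only emitted by Chromium-based browsers):

```ts
// Log each Largest Contentful Paint candidate as the page loads; the
// final entry before the user interacts is the reported LCP element.
new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    // 'element' isn't on the base PerformanceEntry type, so widen it.
    const lcp = entry as PerformanceEntry & { element?: Element };
    console.log(`LCP candidate at ${entry.startTime.toFixed(0)}ms:`, lcp.element);
  }
}).observe({ type: 'largest-contentful-paint', buffered: true });
```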
Interaction to Next Paint
Interaction to Next Paint (INP) measures how quickly your website reacts to user interactions using data from the Event Timing API.
INP assesses the latency of all click, tap, and keyboard interactions with a page throughout its lifespan, and reports the longest duration, ignoring outliers.
The goal of INP is to minimize the time it takes for the next frame to be painted after a user interaction. A low INP means the page is consistently able to respond quickly to user interactions.
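As a sketch, you can log a page’s INP in the field using Google’s open-source web-vitals library (assuming the package is installed from npm):

```ts
import { onINP } from 'web-vitals';

// Reports the page's INP once it's known – typically when the page is
// hidden. metric.value is in milliseconds, and metric.rating is one of
// 'good', 'needs-improvement' or 'poor'.
onINP((metric) => {
  console.log(`INP: ${metric.value}ms (${metric.rating})`);
});
```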
Cumulative Layout Shift
The Cumulative Layout Shift (CLS) of a page measures the visual stability of a page as it loads. We’ve all had that frustrating experience of reading a news site and having the article text jump lower as the navigation is loaded, and CLS is a measure of that across a whole page.
Below you can see an illustration of this as the page loads. Through each step content is being added, causing the previous elements to be shifted up and down:

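You can watch these shifts accumulate on a live page with a PerformanceObserver. The sketch below keeps a simplified running total – the reported CLS metric actually uses session windows of shifts, so treat this as illustrative:

```ts
let totalShift = 0;

new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    // 'value' and 'hadRecentInput' come from the LayoutShift entry type.
    const shift = entry as PerformanceEntry & {
      value: number;
      hadRecentInput: boolean;
    };
    // Shifts shortly after user input are excluded from CLS.
    if (!shift.hadRecentInput) {
      totalShift += shift.value;
      console.log(`Shift of ${shift.value.toFixed(4)}, running total ${totalShift.toFixed(4)}`);
    }
  }
}).observe({ type: 'layout-shift', buffered: true });
```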
Core Web Vitals Assessment
Core Web Vitals are based on real-world user metrics supplied by the Chrome User Experience Report (CrUX), though it is possible to use simulated lab data to gauge performance when real user data is unavailable. CrUX data is based on the last 28 days of user visits to a page.
Each Web Vital is benchmarked against three classifications: Good, Needs Improvement, and Poor.
For a page to ‘pass’ the Core Web Vitals assessment, it must be considered ‘Good’ in all three metrics.
Passing the Core Web Vitals assessment will provide a page with the maximum ranking benefit.
Google has defined thresholds for each:
- Largest Contentful Paint (LCP): measures loading performance. To provide a good user experience, LCP must occur within 2.5 seconds of when the page first starts loading.
- Interaction to Next Paint (INP): measures interactivity. To provide a good user experience, pages must have an INP of 200 milliseconds or less.
- Cumulative Layout Shift (CLS): measures visual stability. To provide a good user experience, pages should maintain a CLS of 0.1 or less.
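The ‘Needs Improvement’ band sits between the ‘Good’ threshold and the ‘Poor’ threshold, which Google defines as 4 seconds for LCP, 500 milliseconds for INP, and 0.25 for CLS. As a worked sketch, classifying a page against these thresholds looks like this:

```ts
type Rating = 'Good' | 'Needs Improvement' | 'Poor';

// Google's published thresholds: 'Good' up to the first value,
// 'Poor' above the second, 'Needs Improvement' in between.
const rate = (value: number, good: number, poor: number): Rating =>
  value <= good ? 'Good' : value <= poor ? 'Needs Improvement' : 'Poor';

const rateLCP = (seconds: number) => rate(seconds, 2.5, 4.0);
const rateINP = (ms: number) => rate(ms, 200, 500);
const rateCLS = (score: number) => rate(score, 0.1, 0.25);

// A page only passes the assessment if all three metrics are 'Good':
const ratings = [rateLCP(2.1), rateINP(181), rateCLS(0.01)];
const passes = ratings.every((r) => r === 'Good'); // true
```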
How to View a Page’s Core Web Vitals
Your site’s Core Web Vitals are recorded and stored within the Chrome User Experience Report. There are various APIs to connect with this, but you can most commonly see the data in two places.
For an individual URL you can use PageSpeed Insights (PSI), which flags whether or not the page has passed the Core Web Vitals assessment:
Alternatively, you can see a sample of pages within Google Search Console (GSC), which will show performance over time:
However, in PSI you are limited to one URL at a time, and in GSC you can only see whether a page is marked Good, Needs Improvement or Poor, not the individual scores. In many cases pages in GSC are grouped, and these often do not have enough traffic to have individual CWVs provided.
How To Analyse In The SEO Spider
Instead, using the Screaming Frog SEO Spider and the PageSpeed Insights API, we can collect Core Web Vitals data en masse for each page on the site (for both CrUX and lab data).
To set this up just follow the steps below:
1. Connect to the PageSpeed Insights API
Start the Screaming Frog SEO Spider, go to ‘Configuration > API Access > PageSpeed Insights’ and enter a free PageSpeed Insights API key:

You can generate a free PageSpeed Insights API key by logging into your Google account and heading to their get started page.
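The same key also lets you query the PageSpeed Insights v5 API directly, which can be useful for one-off checks or your own tooling. A minimal sketch – YOUR_API_KEY is a placeholder, and the response field names reflect the API at the time of writing:

```ts
// Query the PageSpeed Insights v5 API for a single URL
// (run within an ES module or other async context).
const pageUrl = 'https://www.example.com/';
const endpoint =
  'https://www.googleapis.com/pagespeedonline/v5/runPagespeed' +
  `?url=${encodeURIComponent(pageUrl)}&strategy=MOBILE&key=YOUR_API_KEY`;

const data = await (await fetch(endpoint)).json();

// Field (CrUX) data, when available, sits under loadingExperience:
console.log(data.loadingExperience?.metrics?.LARGEST_CONTENTFUL_PAINT_MS?.percentile);
// Lab data sits under lighthouseResult:
console.log(data.lighthouseResult?.audits?.['largest-contentful-paint']?.displayValue);
```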
2. Select Your Metrics
Once connected to the PSI API, click on the ‘Metrics’ tab. First, select whether you would like mobile or desktop data.
Next, select exactly what data from PageSpeed Insights you would like reported for each URL. We recommend using the default selection as this will include data from the CrUX report, lab data from Google Lighthouse, and opportunity data for areas to improve. But you can select or remove each metric to suit your needs.
If you are unsure of exactly what data you will want, you can select everything and then all the data will be present for whichever reports you want to review later.
If selecting manually, Core Web Vitals can be found under CrUX Metrics, and lab data can be found under Lighthouse Metrics:
3. Crawl the Website
In the SEO Spider, type or paste the website you wish to crawl into the ‘Enter URL to spider’ box and hit ‘Start’.
Alternatively, upload a set of URLs to analyse in List mode (Mode > List):

The website will be crawled, and CWV data will be incorporated via the PSI API, so simply wait until both the crawl and API progress bars have reached 100%.
4. View the PageSpeed Tab
Click on the PageSpeed tab, which shows all discovered URLs that have speed data reported from the API:
The PSI Error and PSI Status columns will indicate if the API failed to fetch any data for particular URLs. See our FAQs for more information.
5. Analyse the Results
Once in the PageSpeed tab, if you scroll to the right, you’ll see the CrUX metrics collected from the API:
The first column you will find (if enabled) is the ‘Core Web Vitals Assessment’. This will be marked as either a Pass or a Fail, depending on whether the URL is considered Good in all three Web Vitals.
In the screenshot above we can see the top URL has an LCP of 2.1s (under 2.5s), a CLS of 0.01 (under 0.10), and an INP of 181ms (under 200ms). Therefore this URL is considered Good in each metric and passes the CWV assessment.
Here you should examine which of your pages are failing and which metric they are failing on, then attempt to improve the scores with site enhancements.
If CWV data isn’t populated against URLs, it will be because there isn’t sufficient real-world speed data, as the page doesn’t have enough visitors. Generally, only the most popular pages have sufficient real-world field data in the Chrome UX Report.
How To Improve Core Web Vitals
There are various areas you can examine to improve your Core Web Vitals performance, many of which you can view directly within the SEO Spider.
A total of 19 page speed insights can be viewed directly in the PageSpeed tab, courtesy of Lighthouse:
You can select a filter to view pages that can be improved for each speed issue. If you then highlight one URL and select the lower Lighthouse Details tab, you’ll see more information on the page – including all opportunities for that page, and individual resources for each opportunity:
Lastly, you can bulk export opportunity data across the site using the top menu ‘Reports > PageSpeed’ dropdown.
How to Optimise Largest Contentful Paint
Many factors can affect a page’s Largest Contentful Paint, and generally anything that improves overall loading times will reduce the LCP. Speed issues flagged in the SEO Spider include:
LCP Request Discovery – Optimise LCP by making the LCP image immediately discoverable in the HTML, applying fetchpriority="high" to it, and avoiding lazy-loading.
Opportunity filters to view:
Image Optimisation – Large files, such as images, can greatly increase the overall size of a page and the time taken to download all assets. Ensure each image is properly sized, uses a suitable level of compression, and consider more efficient image formats such as WebP.
Opportunity filters to view:
Document Request Latency – If the initial HTML loads slowly, this impacts all subsequent resource loads. Ensuring a fast server response, avoiding redirects and compressing the HTML all help.
Opportunity filters to view:
Render-Blocking Resources – As the HTML is being parsed, any stylesheets or synchronous scripts encountered will block rendering, delaying the LCP. Ideally, you would defer any non-critical CSS and JavaScript to speed up loading of your main content (a sketch of deferring a non-critical script follows at the end of this section).
Opportunity filters to view:
Avoid Chaining Critical Requests – Reduce the length of critical request chains, reduce the download size of resources, or defer the download of unnecessary resources to improve page load.
Opportunity filters to view:
Optimise Your Resources – Alongside blocking the render, files such as JavaScript, CSS, and fonts each take time to download and process. Try minifying them and removing any unnecessary files from your page load.
Opportunity filters to view:
- Minify CSS
- Minify JavaScript
- Reduce Unused CSS
- Reduce Unused JavaScript
- Use Efficient Cache Lifetimes
- Legacy JavaScript
- Duplicated JavaScript
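As an example of the deferral mentioned under Render-Blocking Resources, a non-critical script can be loaded once the browser is idle, so it doesn’t compete with the main content. A minimal sketch – ‘./analytics’ and its init function are hypothetical:

```ts
// Run a callback when the browser is idle, falling back to setTimeout
// where requestIdleCallback is unsupported (e.g. Safari).
const whenIdle = (callback: () => void): void => {
  if ('requestIdleCallback' in window) {
    requestIdleCallback(callback);
  } else {
    setTimeout(callback, 1);
  }
};

// Defer a hypothetical non-critical module until the page is idle.
whenIdle(() => {
  import('./analytics').then((mod) => mod.init());
});
```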
How to Optimise Interaction to Next Paint
Interaction to Next Paint can be affected by the long execution times of large scripts, so try to minimise these wherever possible. Some areas to examine initially include:
Optimise Your JavaScript – By removing unnecessary parts of scripts (or entire scripts), you can reduce the time spent executing and rendering these files. This also includes any duplicate modules within a bundle and legacy JavaScript that’s no longer required in modern browsers.
Opportunity filters to view:
Minimise Main Thread Work – If the browser’s main thread is busy, user interactions will be delayed until its current work is completed. If you reduce main thread execution time, user interactions will feel more responsive and immediate (a sketch of this pattern follows below).
Opportunity filters to view:
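A common pattern behind both of the areas above is breaking long tasks into smaller ones by yielding back to the main thread between units of work, so user input can be handled sooner. A minimal sketch – processItem and items are hypothetical, and scheduler.yield() is only available in newer Chromium browsers, hence the setTimeout fallback:

```ts
// Yield control back to the main thread so pending user input can run.
const yieldToMain = (): Promise<void> => {
  const scheduler = (window as any).scheduler;
  if (scheduler?.yield) {
    return scheduler.yield();
  }
  return new Promise((resolve) => setTimeout(resolve, 0));
};

// Process a hypothetical work queue one item at a time, yielding
// between items instead of blocking the thread in one long task.
async function processAll<T>(items: T[], processItem: (item: T) => void) {
  for (const item of items) {
    processItem(item);
    await yieldToMain();
  }
}
```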
How to Optimise Cumulative Layout Shift
Most improvements to CLS are made by ensuring a page is rendered in the appropriate order and with defined spacing for each asset. Some initial areas to examine include:
Use Height & Width Attributes on Resources – All images, ads, videos, and iframes should have defined height and width attributes. This ensures that pages load with the appropriate space reserved for these resources, rather than shifting other content as they are added.
Load Content Downwards – Prioritise page load from top to bottom so content isn’t inserted above existing elements, pushing them downwards – this is commonly seen with cookie banners.
Ensure Custom Fonts Aren’t Causing FOIT/FOUT – If custom fonts are applied late in a page load, this can cause text to flash as it’s replaced (FOUT), or invisible text to be displayed until the custom font is rendered (FOIT). Ideally, any custom fonts would be preloaded so they are applied to text as it’s added to the page (a sketch using the FontFace API follows at the end of this section).
Avoid Non-Composited Animations – Non-composited animations can appear janky on lower-end devices or when intensive tasks are running at the same time as the page load, causing layout shifts. To avoid this, use high-performance, compositor-only animations.
Opportunity filters to view:
The ‘Missing Size Attributes’ filter from the ‘Images’ tab can also be used to identify possible problem images.
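For the custom font issue above, one option is to load fonts explicitly up front with the FontFace API, so text is styled before it’s rendered. A minimal sketch – ‘MyFont’ and the font URL are placeholders; in practice a preload link tag in the HTML head achieves much the same thing:

```ts
// Load a custom font before applying it, avoiding FOIT/FOUT.
const font = new FontFace('MyFont', "url(/fonts/myfont.woff2) format('woff2')");

font.load().then((loadedFont) => {
  document.fonts.add(loadedFont);
  document.body.style.fontFamily = "'MyFont', sans-serif";
});
```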
Useful Tips
- The ranking benefit from Web Vitals is not binary: a page within the ‘Needs Improvement’ classification will see more ranking benefit than a worse-performing page within the same classification. However, all pages classified as ‘Good’ receive the same ranking benefit.
- If your URLs aren’t showing any CrUX data and the PSI status lists ‘success’, it is likely the URL doesn’t receive enough real user visits to generate sufficient speed data for the CrUX report. You can verify this by running the URL in PageSpeed Insights, where it will display a message noting that real-world data is unavailable. In this case, you should use a similar page (in layout) that does have CrUX data, or use simulated lab data instead. Lab data can be found under the ‘Lighthouse’ dropdown of the Metrics tab (this is enabled by default).
- You may also receive partial field data when some metrics are available but others are not.
- When using lab data, the only metric unavailable is INP (as this requires ‘timespan’ mode and user interaction). Instead, you can substitute it with Total Blocking Time (TBT), as a gauge of how long the main thread is blocked, which would increase the INP.
- Origin data is also available, which indicates the average value of a web vital across the domain. However, it’s our understanding that Google will treat pages individually, or use vitals from similar pages if no data is available.
- If analysing a large site, you may hit the PSI quota limit. In these instances, you may need to wait for it to reset, or analyse a smaller sample of pages. You can view your current quota limit here.
- CrUX data is based on the last 28 days of user interactions, so any changes to the site will take 28 days to be reflected within CrUX. You can still use lab data as a rough measure until that time.
- To see a visual indication of a page’s LCP, CLS and the long tasks affecting the INP, you can record and view a loading timeline within the ‘Performance’ tab of Chrome Developer Tools, selecting the ‘Web Vitals’ checkbox:
Summary
This tutorial will hopefully help you use the SEO Spider to analyse and improve your Core Web Vitals and, more importantly, the experience of your users.
For further reading we highly recommend –
- Core Web Vitals: How to measure and improve your site’s UX – Nichola Stott
- Learn Core Web Vitals – Google Developers
If you experience any issues after following the guidance above, check out the following FAQs –
- Why Is My PSI Key Invalid?
- Why Do I Receive An Error Status For The PSI API?
- Why Do PSI Scores Differ To A Browser?
- Does The PageSpeed Insights API Affect Analytics?
Alternatively, please contact us via support and we can help.