How to Audit Core Web Vitals Using the SEO Spider

You can use the Screaming Frog SEO Spider and PageSpeed Insights API to easily measure Core Web Vitals (CWVs) for each page on your website.

Below is a brief introduction to Core Web Vitals and how to integrate them into your crawl results.


What Are Core Web Vitals?

Core Web Vitals are a set of real-world metrics Google uses (as ranking signals since June 2021) to measure key aspects of user experience when loading a webpage.

In addition to being small signals in Google's scoring, Google has also explored labelling search results with symbols to indicate pages with good or bad CWV scores.

The three metrics Google currently uses are:

Largest Contentful Paint

Largest Contentful Paint (LCP) is a measure of the overall loading speed of a page – the faster the page loads, the better. It's marked at the point in the load timeline when the largest content element in the viewport has rendered.
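
If you'd like to see which element is chosen as the LCP on a page you have open, a minimal TypeScript sketch using the browser's PerformanceObserver API (pasted into the DevTools console) might look like this:

  // Log each LCP candidate as the browser reports it; the last entry
  // emitted before any user input is the final LCP for the page load.
  new PerformanceObserver((list) => {
    for (const entry of list.getEntries()) {
      // startTime is when the candidate element rendered, in milliseconds
      console.log('LCP candidate:', entry.startTime, (entry as any).element);
    }
  }).observe({ type: 'largest-contentful-paint', buffered: true });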

Below you can see an illustration of a page load, with the First Contentful Paint (FCP) and Largest Contentful Paint (LCP):

Interaction to Next Paint

Interaction to Next Paint (INP) measures how quickly your website reacts to user interactions using data from the Event Timing API.

INP assesses the latency of all click, tap, and keyboard interactions with a page throughout its lifespan, and reports the longest duration, ignoring outliers.

The goal of INP is to minimise the time it takes for the next frame to be painted after a user interaction. A low INP means the page is consistently able to respond quickly to user interactions.
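
As a rough sketch of where this data comes from, the Event Timing API can be observed directly in the browser. The snippet below simply logs slow interactions; a full INP calculation (grouping entries by interaction and ignoring outliers) is handled for you by libraries such as Google's web-vitals:

  // Log any interaction taking longer than 40ms from input to next paint.
  new PerformanceObserver((list) => {
    for (const entry of list.getEntries()) {
      console.log(entry.name, 'took', Math.round(entry.duration), 'ms');
    }
  }).observe({
    type: 'event',
    buffered: true,
    durationThreshold: 40, // only report slower interactions
  } as PerformanceObserverInit);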

Cumulative Layout Shift

The Cumulative Layout Shift (CLS) of a page measures its visual stability as it loads. We've all had that frustrating experience of reading a news site and having the article text jump lower as the navigation loads in – CLS is a measure of that across the whole page.

Below you can see an illustration of this as the page loads. At each step content is added, shifting the previous elements around the page:
Cumulative Layout Shift Timeline
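
As a sketch, layout shifts can be watched in the browser with a PerformanceObserver. Note this keeps a simplified running total; the final CLS metric groups shifts into 'session windows', which libraries such as web-vitals handle for you:

  // Sum layout-shift scores as they occur during the page's lifetime.
  let clsTotal = 0;
  new PerformanceObserver((list) => {
    for (const entry of list.getEntries() as any[]) {
      // Shifts within 500ms of user input are excluded from CLS.
      if (!entry.hadRecentInput) {
        clsTotal += entry.value;
        console.log('Running CLS total:', clsTotal.toFixed(3));
      }
    }
  }).observe({ type: 'layout-shift', buffered: true });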


Core Web Vitals Assessment

Core Web Vitals are based on real-world user metrics supplied by the Chrome User Experience Report (CrUX), though it is possible to use simulated lab data to gauge performance when real user data is unavailable. CrUX data is based on the last 28 days of user visits to a page.

Initially, these Web Vitals only formed part of Google's algorithm for mobile pages, with desktop rankings following later.

Each Web Vital is benchmarked against three classifications: Good, Needs Improvement, and Poor.

For a page to 'pass' the Core Web Vitals assessment it must be rated 'Good' in all three metrics.

Passing the assessment provides a URL with the maximum ranking benefit.


Google has provided a handy illustration of the thresholds for each:

Core Web Vitals Thresholds
  • Largest Contentful Paint (LCP): measures loading performance. To provide a good user experience, LCP must occur within 2.5 seconds of when the page first starts loading.
  • Interaction to Next Paint (INP): measures interactivity. To provide a good user experience, pages must have an INP of 200 milliseconds or less.
  • Cumulative Layout Shift (CLS): measures visual stability. To provide a good user experience, pages should maintain a CLS of 0.1 or less.
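
The pass/fail logic is simple to express. Below is a minimal TypeScript sketch using the 'Good' thresholds above and Google's published 'Poor' boundaries (the example values are illustrative):

  type Rating = 'Good' | 'Needs Improvement' | 'Poor';

  // Classify a metric value against Google's published thresholds.
  function rate(value: number, good: number, poor: number): Rating {
    if (value <= good) return 'Good';
    return value <= poor ? 'Needs Improvement' : 'Poor';
  }

  const lcp = rate(2.1, 2.5, 4.0);  // LCP in seconds  -> 'Good'
  const inp = rate(350, 200, 500);  // INP in ms       -> 'Needs Improvement'
  const cls = rate(0.3, 0.1, 0.25); // CLS (unitless)  -> 'Poor'

  // A page only passes the assessment if all three metrics are 'Good'.
  const passes = [lcp, inp, cls].every((r) => r === 'Good');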

How to View a Page's Core Web Vitals

Your site's Core Web Vitals are recorded and stored within the Chrome User Experience Report. There are various APIs to connect with this data, but you can most commonly see it in two places.

For an individual URL you can use PageSpeed Insights (PSI), which flags whether or not the page has passed the Core Web Vitals assessment:

Core Web Vitals Assessment in PSI

Alternatively, you can see a sample of pages within Google Search Console (GSC), which will show performance over time:

Google Search Console Core Web Vitals

However, in PSI you are limited to one URL at a time, and in GSC you can only see whether a page is marked Good, Needs Improvement or Poor, not the individual scores.


How To Analyse In The SEO Spider

Instead, using the Screaming Frog SEO Spider and the PageSpeed Insights API, we can collect Core Web Vital data en masse for each page on the site (both CrUX and lab data). To set this up, just follow the steps below:

1. Connect to the PageSpeed Insights API

Start the Screaming Frog SEO Spider, go to 'Configuration > API Access > PageSpeed Insights' and enter a free PageSpeed Insights API key:
Enter PageSpeed Insights API Key
You can generate a free API key by logging into your Google account and heading to their get started page.

For step-by-step instructions on generating an API key, see our separate guide.
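
If you'd like to query the same API directly outside of the SEO Spider, a minimal TypeScript sketch might look like the below (the target URL and key are placeholders, and the exact fields returned can vary by URL):

  // Query the PageSpeed Insights v5 API for a single URL.
  const API_KEY = 'YOUR_API_KEY'; // placeholder - use your own key
  const endpoint = new URL('https://www.googleapis.com/pagespeedonline/v5/runPagespeed');
  endpoint.searchParams.set('url', 'https://www.example.com/');
  endpoint.searchParams.set('strategy', 'mobile');
  endpoint.searchParams.set('key', API_KEY);

  const response = await fetch(endpoint);
  const data = await response.json();

  // CrUX field data (when available) sits under loadingExperience.metrics;
  // Lighthouse lab data sits under lighthouseResult.
  console.log(data.loadingExperience?.metrics?.LARGEST_CONTENTFUL_PAINT_MS?.percentile);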

2. Select Your Metrics

Once connected to the PSI API, click on the 'Metrics' tab. First, select whether you would like mobile or desktop data (remember that CWVs initially only applied to mobile data).

Next, select exactly what data from PageSpeed Insights you would like reported for each URL. We recommend using the default selection as this will include data from the CrUX report, lab data from Google Lighthouse, and opportunity data for areas to improve. But you can select or remove each metric to suit your needs.

Note that some metrics (such as the Core Web Vitals Assessment) are only available from version 14.2 onwards.

If selecting manually, Core Web Vitals can be found under CrUX Metrics, and lab data can be found under Lighthouse Metrics:

Core Web Vitals Metrics

3. Crawl the Website

Open up the SEO Spider, type or copy in the website you wish to crawl in the ‘Enter URL to spider’ box and hit ‘Start’.

Alternatively, upload a set of URLs to analyse in List mode (Mode > List):
Crawl the site for pagespeed metrics
The website will be crawled, and CWV data will be incorporated via the PSI API. So simply wait until both the crawl and API progress bars have reached 100%.


4. View the PageSpeed Tab

Click on the PageSpeed tab, which will show all discovered URLs that have speed data reported from the API:

PageSpeed tab

The PSI Error and PSI Status columns will indicate if the API failed to fetch any data for particular URLs. See our FAQs for more information.


5. Analysing the Results

Once in the PageSpeed tab, if you scroll to the right, you’ll see the CrUX metrics collected from the API:

Core Web Vitals Assessment

The first column you will find (if enabled) is the 'Core Web Vitals Assessment'. This will be marked as either a Pass or a Fail, depending on whether the URL is rated 'Good' in all three Web Vitals.

In the screenshot above we can see the top URL has an LCP of 2.1s (under 2.5s), a CLS of 0.01 (under 0.10), and an INP of 181ms (under 200ms). This URL is therefore rated 'Good' in each metric and passes the CWV assessment.

Here you should examine which of your pages are failing and on which metric, then attempt to improve the scores with site enhancements.

If data isn’t populated against URLs for CWV, then it will be because there isn’t sufficient real-world speed data as the page doesn’t have enough visitors. Only the most popular pages generally have sufficient real-world field data in the Chrome UX report.


How To Improve Core Web Vitals

There are various areas you can examine to improve your web vital performance, many of which you can view directly within the SEO Spider by enabling the opportunities within the PageSpeed Insights API Settings.

If enabled, you can view various filters for each opportunity in the PageSpeed tab:

Selecting one of these filters will show all pages which could be improved by the enhancement in question, alongside the time savings it might bring. If you then highlight a URL and select the lower PageSpeed Details tab, you'll see more information on the page – including all opportunities for that page, and the individual resources for each opportunity:
PageSpeed Details in the SEO Spider
Lastly, you can bulk export opportunity data across the site using the top menu ‘Reports > PageSpeed’ dropdown.


How to Optimise Largest Contentful Paint

Many factors can affect a page's Largest Contentful Paint, and generally anything that improves overall loading times will reduce the LCP. Some areas to examine include:

Image Optimisation – Large files such as images can greatly increase the overall size of a page and the time taken to download all assets. Ensure each image is properly sized, and use lazy loading where appropriate (see the sketch after the list below). You may also want to consider newer, more efficient image formats such as WebP, or improve the encoding of existing image files.

You can view many of these opportunities in the appropriate PageSpeed filters within the SEO Spider:

  • Properly Size Images
  • Defer Offscreen Images
  • Efficiently Encode Images
  • Serve Images in Next-Gen Formats
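
As a quick illustration of the 'Defer Offscreen Images' opportunity, below is a minimal sketch that opts images into native browser lazy loading. The 'skip the first three images' heuristic is an assumption for illustration – ideally the loading attribute is set directly in your HTML, and your LCP image is never lazy-loaded:

  // Opt likely below-the-fold images into native lazy loading.
  document.querySelectorAll('img').forEach((img, index) => {
    // Skipping the first few images is a rough stand-in for
    // above-the-fold content; never lazy-load the LCP image itself.
    if (index > 2 && !img.hasAttribute('loading')) {
      img.setAttribute('loading', 'lazy');
    }
  });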

Improve Server Response Times – A slow server will lead to slow responses and increased load times. Try to minimise this response time wherever possible, usually via improved hosting or the use of Content Delivery Networks (CDNs).

Opportunity filters to view:

  • Reduce Server Response Time (TTFB)

Remove Render-Blocking Resources – As the HTML is being parsed, any stylesheets or synchronous scripts encountered will cause the parser to pause, delaying the LCP. Ideally, you would defer any non-critical CSS and JavaScript to speed up loading of your main content (a sketch of one common pattern follows the filter below).

Opportunity filters to view:

  • Eliminate Render-Blocking Resources
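
One common pattern for non-critical CSS is to load the stylesheet asynchronously after the initial render. A minimal sketch (the stylesheet path is a placeholder):

  // Load a non-critical stylesheet without blocking the initial render;
  // critical CSS should be inlined in the document head instead.
  const link = document.createElement('link');
  link.rel = 'stylesheet';
  link.href = '/css/non-critical.css'; // placeholder path
  link.media = 'print'; // a print stylesheet doesn't block rendering
  link.onload = () => { link.media = 'all'; }; // apply it once downloaded
  document.head.appendChild(link);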

Optimise Your Resources – Alongside blocking the render, files such as JavaScript, CSS, and fonts each take time to download and process. Try minifying these files and removing any that are unnecessary from your page load.

Opportunity filters to view:

  • Minify CSS
  • Minify JavaScript
  • Remove Unused CSS
  • Remove Unused JavaScript
  • Enable Text Compression
  • Preconnect to Required Origins
  • Preload Key Requests
  • Use Video Formats for Animated Content

How to Optimise Interaction to Next Paint

Interaction to Next Paint can be affected by long execution times of large scripts, so try to minimise these wherever possible. Some areas to examine initially include:

Optimise Your JavaScript – By removing unnecessary parts of scripts (or entire scripts), you can reduce the time spent executing and rendering these files. This also includes any duplicate modules within a bundle, and legacy JavaScript that's no longer required in modern browsers.

Opportunity filters to view:

  • Minify JavaScript
  • Remove Unused JavaScript

Reduce the Number of Third-Party Scripts Used – By removing any unnecessary plugins or unused scripts you’ll have less data to transfer and fewer assets to render. In some instances, third-party code can be lazy-loaded to reduce the impact on the initial page load.

You can examine the total number of third-party resources used on a page within the Overview section of the lower PageSpeed Details tab.

Minimise Main Thread Work – If the browser's main thread is busy, user interactions will be delayed until its work is completed. If you reduce main-thread execution time, user interactions will feel more responsive and immediate (see the sketch after the filters below).

Opportunity filters to view:

  • Minimise Main-Thread Work
  • Reduce JavaScript Execution Time
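
As mentioned above, a common technique for reducing long main-thread tasks is to break work into chunks and yield between them. A minimal sketch (processItem and items are placeholder names):

  // Break a long task into chunks, yielding back to the main thread
  // between chunks so user input can be handled promptly.
  async function processInChunks<T>(items: T[], processItem: (item: T) => void): Promise<void> {
    let deadline = performance.now() + 50; // ~50ms budget per chunk
    for (const item of items) {
      processItem(item);
      if (performance.now() >= deadline) {
        await new Promise((resolve) => setTimeout(resolve, 0)); // yield
        deadline = performance.now() + 50;
      }
    }
  }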

Lower Resource Counts & Sizes – Each resource needed for a page will take time to be requested, downloaded, and executed. By reducing both the number of requests needed and total download size, you will speed up the execution of the page.

Opportunity filters to view:

  • Minify CSS
  • Minify JavaScript
  • Remove Unused CSS
  • Remove Unused JavaScript
  • Enable Text Compression

How to Optimise Cumulative Layout Shift

Most improvements to CLS are made by ensuring a page is rendered in the appropriate order and with defined spacing for each asset. Some initial areas to examine include:

Use Height & Width Attributes on Resources – All images, ads, videos, and iframes should have defined height and width attributes. This ensures pages load with the appropriate space reserved for these resources, rather than shifting other content as they are added.

Load Content Downwards – Prioritise page load from top to bottom, so new content isn't added above existing elements, pushing them downwards. This is commonly seen with cookie banners.

Ensure Custom Fonts Aren't Causing FOIT/FOUT – If custom fonts are applied late in a page load, this can cause text to flash as it's replaced (FOUT), or invisible text to be displayed until the custom font is rendered (FOIT). Ideally, any custom fonts would be preloaded to ensure they are applied to text as it's added to the page (see the sketch after the filters below).

Avoid Non-Composited Animations – Non-composited animations can appear janky on lower-end devices, or when intensive tasks are running while the page loads, causing layout shifts. To avoid this, use high-performance, compositor-only animations.

Opportunity filters to view:

  • Ensure Text Remains Visible During Webfont Load
  • Image Elements Do Not Have Explicit Width & Height
  • Avoid Large Layout Shifts
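
As a sketch of the font preloading suggestion above, the browser's FontFace API can load a custom font early so text renders with it as soon as it's added (the font name and file path are placeholder values):

  // Load a custom font up front so text doesn't flash when it's applied.
  const brandFont = new FontFace('BrandFont', 'url(/fonts/brand.woff2)', {
    display: 'swap', // show fallback text while loading, avoiding FOIT
  });
  brandFont.load().then((loadedFont) => {
    document.fonts.add(loadedFont);
  });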

Useful Tips

  • The ranking benefit from Web Vitals will not be binary. That is, a better-performing page within the 'Needs Improvement' classification will see more ranking benefit than a worse-performing page in the same classification. However, all pages classified as 'Good' receive the same ranking benefit. https://twitter.com/JohnMu/status/1395798952570724352
  • If your URLs aren't showing any CrUX data and the PSI status lists 'success', it is likely the URL doesn't receive enough real user visits to generate sufficient speed data for the CrUX report. You can verify this by running the URL through PageSpeed Insights, where it will display a message. In this case, you should use a similar page (in layout) that does have CrUX data, or use simulated lab data instead. Lab data can be found under the 'Lighthouse' dropdown of the Metrics tab (enabled by default).
  • You may also receive partial field data when some metrics are available but others are not. https://twitter.com/rick_viscomi/status/1403094734990712843
  • When using lab data, the only metric unavailable is INP (as this requires Lighthouse's 'timespan' mode and real user interaction). Instead, you can substitute it with Total Blocking Time (TBT), as a gauge of how often the main thread is blocked, which would increase the INP.
  • Origin data is also available, which indicates the aggregate value of a Web Vital across the domain. However, it's our understanding that Google will treat pages individually, or use vitals from similar pages if no data is available.
  • If analysing a large site, you may hit the PSI quota limit. In these instances, you may need to wait for it to reset, or analyse a smaller sample of pages. You can view your current quota limit here.
  • CrUX data is based on the last 28 days of user interactions, so any changes to the site will take 28 days to be reflected within CrUX. You can still use lab data as a rough measure until that time.
  • To see a visual indication of a page's LCP, CLS, and the long tasks affecting INP, you can record and view a loading timeline within the 'Performance' tab of Chrome Developer Tools, by selecting the 'Web Vitals' checkbox:

Summary

This tutorial will hopefully help you use the SEO Spider to analyse and improve your Core Web Vitals and, more importantly, the experience of your users.

For further reading we highly recommend –

If you experience any issues after following the guidance above, check out the following FAQs –

Alternatively, please contact us via support and we can help.
