

TL;DR: Want your website to rank higher in search results? Focus on PageSpeed Insights (PSI) and its Core Web Vitals (LCP, INP, and CLS) scores. Looking to boost user experience and conversions? Dive into the diagnostic insights from Lighthouse. Both PSI and Lighthouse offer valuable data for website optimization, but understand their key differences in scoring metrics and data sources.

It is easy to confuse PageSpeed Insights' scoring and metrics with Lighthouse's. They are, however, quite different in what they measure and in the impact each score or metric has on a website, both for loading times and for organic search results. Let's take a look at the scores and metrics of both:
Using PageSpeed Insights via https://pagespeed.web.dev/ makes the distinction between the two clear. The shaded top half of the report is PageSpeed Insights and the unshaded bottom half is Lighthouse.

PageSpeed Insights Scoring

There are a couple of distinctions. PageSpeed Insights scoring is summed up as a pass/fail, as noted by the Core Web Vitals Assessment subheading. Lighthouse is more nuanced than PSI, providing a Performance score from 0 to 100.

PSI's pass/fail assessment is calculated from three primary metrics: largest contentful paint (LCP), interaction to next paint (INP), and cumulative layout shift (CLS). There is a toggle option in the PSI report, "expand/collapse view," that shows what is considered a passing score for each metric.

Here are those thresholds at a glance:

  • Largest contentful paint (LCP): 75% of page loads need an LCP of less than 2.5 s.
  • Interaction to next paint (INP): 75% of page loads need an INP under 200 ms.
  • Cumulative layout shift (CLS): 75% of page loads need to maintain a CLS score under 0.1.
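The pass/fail logic above can be sketched in a few lines of code. This is only an illustration of the thresholds, not Google's implementation; the function name and the input format (a dict of 75th-percentile values) are my own.

```python
# Thresholds a 75th-percentile value must stay under to "pass."
# Values mirror the list above; this sketch is illustrative only.
CWV_THRESHOLDS = {
    "LCP": 2500,  # milliseconds (2.5 s)
    "INP": 200,   # milliseconds
    "CLS": 0.1,   # unitless layout-shift score
}

def cwv_assessment(p75_values):
    """Return 'Passed' only if every Core Web Vital's 75th-percentile
    value is under its threshold, mirroring PSI's pass/fail summary."""
    passed = all(
        p75_values[metric] < limit
        for metric, limit in CWV_THRESHOLDS.items()
    )
    return "Passed" if passed else "Failed"
```

For example, `cwv_assessment({"LCP": 2100, "INP": 180, "CLS": 0.05})` returns "Passed", while a single metric over its threshold flips the whole assessment to "Failed".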

It’s worth pointing out that first contentful paint (FCP), first input delay (FID), and time to first byte (TTFB) are not Core Web Vitals, and therefore are not ranking signals for organic search.

Core Web Vitals Are the Core of PSI Scoring

Core Web Vitals (CWVs) are a set of metrics Google uses to measure crucial aspects of real-world user experience. Introduced in 2020, they are now a significant factor in search rankings. These metrics include:

  • Largest Contentful Paint (LCP): Measures how quickly the main content of a page loads.
  • Interaction to Next Paint (INP): Assesses how responsive a page is to user interactions (clicks, taps, etc.).
  • Cumulative Layout Shift (CLS): Tracks how much a page’s layout unexpectedly shifts or moves, causing frustrating experiences.

Lighthouse Scoring

The bottom half of the website performance audit shows the Lighthouse performance score. There is a lot more nuance in this performance score, and it’s easy to mix up PSI’s pass/fail assessment with Lighthouse’s scoring metrics, but they are distinct.

Three metrics make up the PSI Core Web Vitals pass/fail: LCP, INP, and CLS. Five metrics make up the Lighthouse performance score: first contentful paint (FCP), speed index (SI), largest contentful paint (LCP), total blocking time (TBT), and cumulative layout shift (CLS). Comparing the two, only two of the five Lighthouse metrics are Core Web Vitals.

  • First Contentful Paint (FCP): Measures the time it takes for a browser to render the first piece of content (text, image, etc.) on a page.
  • Speed Index (SI): Measures how quickly content visually populates during page load.
  • Total Blocking Time (TBT): Tracks the total time between FCP and when the page is fully interactive for a user.

Under the performance score there is a link to “see calculator.” Following that link shows how the performance score is calculated. The five metrics are displayed with their weighting and scoring:

Lighthouse weighted scoring metrics

Lighthouse scoring metric         Weight of metric
First contentful paint (FCP)      10%
Speed index (SI)                  10%
Largest contentful paint (LCP)    25%
Total blocking time (TBT)         30%
Cumulative layout shift (CLS)     25%
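The weighting step can be sketched as a simple weighted average. Note this is only the final step: Lighthouse first converts each raw metric into its own 0-100 score (using log-normal scoring curves); the sketch below assumes those per-metric scores are already in hand.

```python
# Weights from the calculator table above. This sketch shows only the
# final weighting step, assuming each metric already has a 0-100 score.
WEIGHTS = {"FCP": 0.10, "SI": 0.10, "LCP": 0.25, "TBT": 0.30, "CLS": 0.25}

def performance_score(metric_scores):
    """Weighted average of the five per-metric scores (each 0-100)."""
    return round(sum(metric_scores[m] * w for m, w in WEIGHTS.items()))

# A page scoring 100 on everything except TBT (50) lands at 85:
# 0.10*100 + 0.10*100 + 0.25*100 + 0.30*50 + 0.25*100 = 85
```

The example in the comment shows why TBT matters so much: at 30%, it is the single heaviest weight in the score.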

Checking that against the PSI information, we can see that only 50% of the Lighthouse performance score’s weighting (LCP at 25% and CLS at 25%) overlaps with the PSI Core Web Vitals.

PageSpeed Insights vs Lighthouse: conclusion for scores and metrics

PSI and Lighthouse use different methods of scoring. PSI uses pass/fail while Lighthouse uses a 0-100 point scale. While both can feel the same, there is only a 50% overlap between the metrics they evaluate.

PSI vs Lighthouse: data sources

Another item to consider when evaluating PSI vs Lighthouse is where their data is being pulled from. It is another common misconception that PSI and Lighthouse use the same data, but this is not the case.

PSI uses CrUX data

PageSpeed Insights (the top half) uses CrUX data. CrUX stands for Chrome User Experience Report – the R is silent. This is the official dataset of the web vitals program (not just the Core Web Vitals program), and it represents real user data for the Core Web Vitals. Chrome, as a browser, collects these individual web vital metrics every time a web page is loaded. In practice, this means that your Chrome browser is constantly evaluating every web page that you visit and recording the data.

On a positive note, Google makes the CrUX data available for everyone to interact with via API, dashboard, or, as we’re using it here, through PageSpeed Insights. A couple of additional points on this data. It is real-world, which means what you’re looking at is exactly what users are experiencing in aggregate. Some users might be fast and others slower; these numbers represent the aggregate. The data also covers many different network connections (4G, 5G, wifi, fiber, broadband, etc.) and devices (old mobile phones, new iPhones, laptops, desktops, TVs, etc.) over the last 28-day period. Only the last 28 days are available in PSI, but if you use the dashboard you can get historical information. Here’s the dashboard: https://developer.chrome.com/docs/crux/dashboard.
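For the API route mentioned above, here is a minimal sketch of querying the CrUX API directly (the same dataset PSI reads). It requires a Google API key; `YOUR_API_KEY` and the `build_request` helper are placeholders of mine, not part of any official client.

```python
# Sketch: an origin-level query against the CrUX API.
# "YOUR_API_KEY" is a placeholder; get a real key from Google Cloud.
import json
import urllib.request

CRUX_ENDPOINT = "https://chromeuxreport.googleapis.com/v1/records:queryRecord"

def build_request(origin, api_key):
    """Build the POST request body for an origin-level CrUX query."""
    payload = json.dumps({"origin": origin}).encode("utf-8")
    return urllib.request.Request(
        f"{CRUX_ENDPOINT}?key={api_key}",
        data=payload,
        headers={"Content-Type": "application/json"},
    )

# req = build_request("https://www.yoursite.com", "YOUR_API_KEY")
# with urllib.request.urlopen(req) as resp:
#     record = json.load(resp)  # histograms and p75 values per metric
```

The response contains the same 75th-percentile values and histograms PSI displays, just in raw JSON form.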

PSI CrUX Considerations

There are some things to keep in mind when dealing with CrUX data. The first is that if you have a low-traffic site or a newer site, the evaluation information might not be there. Because this uses real-user data, it requires a minimum number of users to visit the site before information can be displayed. Google’s documentation isn’t clear on how many people are needed, just that there needs to be enough to display the data. Here’s what the PSI documentation says: “In order to show user experience data for a given page, there must be sufficient data for it to be included in the CrUX dataset. A page might not have sufficient data if it has been recently published or has too few samples from real users. When this happens, PSI will fall back to origin-level granularity, which encompasses all user experiences on all pages of the website. Sometimes the origin may also have insufficient data, in which case PSI will be unable to show any real-user experience data.”

This leads into the toggle option of “origin” or “this url.” Sometimes the individual URL being tested doesn’t have enough information for PSI to calculate scores or metrics. If that is the case, PSI will default to using the whole domain, or origin (origin = root domain). For example, if www.yoursite.com/category/product doesn’t have enough information, PSI will fall back to www.yoursite.com/ as the URL to test. This is done because more information is available for the origin, or root domain.
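The “origin = root domain” idea can be made concrete with a tiny helper. This is purely an illustration of the fallback; PSI performs this itself, and the function here is my own.

```python
# Illustration of the "origin" fallback: when a specific URL lacks
# CrUX data, PSI falls back to the origin (scheme + host).
from urllib.parse import urlparse

def origin_of(url):
    """Reduce a full URL to its origin, the way PSI's fallback does."""
    parts = urlparse(url)
    return f"{parts.scheme}://{parts.netloc}/"

# origin_of("https://www.yoursite.com/category/product")
# → "https://www.yoursite.com/"
```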

PSI CrUX Summary

To sum it up, PSI information is real-world data taken from Chrome as people visit the website. The data shown here is the best information on how people are experiencing your site. The data is calculated over a trailing 28-day window, so you can’t expect changes to your website to be reflected there immediately.

Lighthouse Data

Lighthouse data is quite different from PSI data. As a reminder, the metrics have only a 50% overlap: only largest contentful paint (LCP) and cumulative layout shift (CLS) are represented in both Lighthouse and PSI. Additionally, Lighthouse data is what is called lab data; that is, it’s a simulation. What this means in practice is that we can get a lot more information, like the diagnostic information below the scores and metrics.

The long list of diagnostics often will look like this:

It is because the page is loaded in a lab environment that each metric and issue can be evaluated precisely. If you toggle open one of the diagnostic tabs, “Reduce JavaScript execution time” in this example, you can see a list of JavaScript resources that are loading, how long each script takes to be evaluated, how long parsing takes, and ultimately how much CPU time each resource takes to load. This information is extremely detailed and useful if we know what to do with it. It isn’t available in the CrUX dataset we get from PSI in the field. This level of detail is one of the distinguishing features of lab data: very detailed, actionable information.

The detailed and actionable information does come at a cost: it isn’t likely what most of your users are experiencing. Lab data comes with a specific group of settings. You can see exactly what those settings are in the Lighthouse diagnostic section, in light gray:

You’re able to see exactly what time the diagnostic was run, what network it ran on, what device was used, and even what version of Chrome was simulated. Comparing that to the PSI CrUX information is difficult because it’s one session versus all traffic sessions.
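If you want to reproduce a lab run yourself rather than through the PSI page, the Lighthouse CLI (installed via Node’s `lighthouse` package) exposes those same simulation settings as flags. The sketch below only assembles the command; the specific flag values are illustrative defaults, not a recommendation.

```python
# Sketch: assembling a Lighthouse CLI invocation with simulated
# throttling. Requires the Node `lighthouse` package to actually run.
import subprocess  # used only in the commented invocation below

def lighthouse_command(url, output_path="report.json"):
    """Build a Lighthouse CLI command for a simulated mobile lab run."""
    return [
        "lighthouse", url,
        "--output=json",
        f"--output-path={output_path}",
        "--throttling-method=simulate",  # simulated, not real, network
        "--form-factor=mobile",
        "--only-categories=performance",
    ]

# subprocess.run(lighthouse_command("https://www.yoursite.com"), check=True)
```

Because the settings are pinned, two runs of this command are comparable with each other in a way field data never is, which is exactly what makes lab data useful for before/after testing.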

Lighthouse data considerations

There are some downsides to using lab data, primarily that it’s in a lab. Lab data is highly controlled and rarely translates directly into the real world. Simulating a 4G connection on an iPhone SE will only ever be a simulation and not the real thing. Here are a few factors a simulation cannot account for:

  • What battery mode is the phone in? Power saving mode isn’t going to have the same quality of connection that a performance mode will.
  • Are there updates being downloaded in the background? If bandwidth is being taken up for an update 4G will feel like 3G.
  • How many applications are currently running? The more applications that are open, the slower the connection will be.
  • Is the physical area densely populated? Being downtown means a lot more people will be fighting for the connection compared to the suburbs, where there are not as many devices trying to utilize the network. The rise of budget providers also plays a role. Companies like Mint Mobile or Spectrum Mobile utilize other providers’ towers, but at a secondary tier of preference. This means that a Verizon user will get preference over a Spectrum user. If you’re a budget mobile user (like I am), your experience will likely be different from a Verizon user’s.

These are just a few considerations that are not evaluated in lab data. There are others like caching, or geographic location that are not considered, but play a massive role in performance. Additionally, lab settings don’t account for different user behaviors like quickly closing a tab or scrolling rapidly, which can impact real-world performance.

The sum total of things not evaluated in lab data makes it a poor choice for reporting on the overall user experience of live websites. I do not recommend using Lighthouse, or any lab data, as a reporting metric on live websites; there are too many variables not considered. Lab data is, however, extremely useful in identifying ways to make a website faster.

When to use Lighthouse data

Lighthouse brings a very different data source from PSI, and with it a different set of benefits. If you’re building a website and are looking for ways to make it faster, you’ll want to utilize the diagnostic features because of the detailed information. Even if you’re looking to improve a live website, Lighthouse data is a great place to find resources you can remove that are causing slow load times.

I would not recommend using Lighthouse data for standard reporting; use PSI for that. The CrUX data is better for reporting because it’s focused on users, real people, while Lighthouse data is one computer evaluating another computer.

PageSpeed Insights vs Lighthouse: Conclusion

PageSpeed Insights and Lighthouse are both essential tools in your website optimization arsenal. For improving search rankings and the overall user experience, prioritize the Core Web Vitals scores in PageSpeed Insights. To uncover specific performance bottlenecks impacting speed, delve into Lighthouse’s detailed diagnostics. By strategically using both, you’ll create a website that’s both fast and enjoyable for your visitors.