
In May, Google introduced the Web Vitals as a uniform basis for evaluating websites. Now Google employee Martin Splitt explains how they are measured.
In the past, Google used a number of different tools and metrics to evaluate websites. To create a uniform standard at last, Google introduced the Web Vitals in May. The three core metrics indicate a website's visual stability (Cumulative Layout Shift), loading time (Largest Contentful Paint) and interactivity (First Input Delay). On Twitter, Google employee Martin Splitt has now provided more detail on how these metrics are determined:
Here’s my lengthy answer 🙂 Grab a seat, make yourself at home! Let’s dissect this question into:
– Googlebot measuring things
– Dynamic rendering and performance
– Performance and UX
– RUM vs. lab data
– GSC Web Vitals report
Off we go!
– Splitti is on vacation (@g33konaut) August 4, 2020
To collect Web Vitals, Google draws on real user data (for example from the Chrome User Experience Report) as well as laboratory data recorded during rendering. A Google user agent is always used. If a website uses dynamic rendering, generating the static HTML can introduce slight delays. This can be avoided by caching the rendered pages and warming up the cache beforehand. Splitt points out that the performance experienced by users and by Googlebot may differ.
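The caching idea behind this advice can be sketched in a few lines. The sketch below is illustrative only: `render_static_html` is a hypothetical stand-in for whatever headless-browser step a dynamic-rendering setup uses, and the timings are made up to show the effect of warming the cache before crawlers arrive.

```python
import time

def render_static_html(url):
    """Hypothetical stand-in for a slow server-side rendering step."""
    time.sleep(0.05)  # simulate 50 ms of rendering work
    return f"<html><body>Rendered snapshot of {url}</body></html>"

class RenderCache:
    """Serve pre-rendered HTML so crawlers never wait for rendering."""

    def __init__(self):
        self._cache = {}

    def warm(self, urls):
        # Warm the cache ahead of time (e.g. after each deploy),
        # so the first crawler request is not delayed by rendering.
        for url in urls:
            self._cache[url] = render_static_html(url)

    def serve(self, url):
        # Cache hit: instant response. Miss: render on demand (slow).
        if url not in self._cache:
            self._cache[url] = render_static_html(url)
        return self._cache[url]

cache = RenderCache()
cache.warm(["/home", "/products"])

start = time.perf_counter()
html = cache.serve("/home")  # pre-warmed, so no rendering delay
warm_ms = (time.perf_counter() - start) * 1000
```

A warmed request returns in well under the simulated 50 ms render time, which is exactly the delay Splitt says caching avoids for Googlebot.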
3/? So that’s something to be aware of:
1.) We can’t just rely on the data Googlebot measures for your performance
2.) Your users (those you should really care about, btw) might suffer a slow experience and you wanna fix that – because slow websites aren’t great for users.
– Splitti is on vacation (@g33konaut) August 4, 2020
Site operators should therefore not rely solely on the data from Googlebot. Nor is it enough to rely on tools like PageSpeed Insights or Lighthouse: these work with laboratory data, which reflects hypothetical performance in an idealized environment. They therefore do not represent the actual user experience, but they do provide starting points for more detailed analysis.
6/6 And that’s the cool thing about the GSC web vitals report – it shows you limited data (b/c not every URL might have enough RUM data) but it’s real user metrics! So if something is “poor” there, it means real users suffered. You wanna fix that for sure.
– Splitti is on vacation (@g33konaut) August 4, 2020
Splitt recommends that site operators use the Web Vitals report in the Google Search Console. Even though not every URL has enough user data, URLs flagged there as too slow indicate problems real users actually experienced.
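The “poor” rating in these reports comes from fixed, published thresholds for the three metrics: LCP is good up to 2.5 seconds and poor above 4 seconds, FID is good up to 100 ms and poor above 300 ms, and CLS is good up to 0.1 and poor above 0.25. A minimal sketch of that classification logic (the function itself is illustrative, not Google’s code):

```python
# Published Core Web Vitals thresholds as (good_limit, poor_limit).
# LCP is in seconds, FID in milliseconds, CLS is unitless.
THRESHOLDS = {
    "LCP": (2.5, 4.0),
    "FID": (100, 300),
    "CLS": (0.1, 0.25),
}

def classify(metric, value):
    """Map a measured value to good / needs improvement / poor."""
    good_limit, poor_limit = THRESHOLDS[metric]
    if value <= good_limit:
        return "good"
    if value <= poor_limit:
        return "needs improvement"
    return "poor"
```

So a page with an LCP of 2.0 s is rated good, while a CLS of 0.3 lands in the poor bucket that, per Splitt, means real users suffered.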