
Core Web Vitals: Developer Interview + Benchmarks

Kirill Lavryshev


We recently delivered a project that involved working with the new Google initiative known as Core Web Vitals. 

Most of the information available at the time was the standard SEO copywriting fare. Stack Overflow wasn't much help either, because the Web Vitals standard was still so new.

We talked with real developers from the rgbcode team and spiced it up with some research, graphics, and numbers.

Project Manager Dmitrii B. spoke with Nikolay Trotsky, a full-stack developer from the rgbcode team, on the subject.

Dmitrii: In your own words – what is Core Web Vitals?

Basically, it's a measure of how well a website meets modern performance requirements and how user-friendly it is.

This means a user can get information quickly and click without the website's content jumping and shifting all over the screen, and the page doesn't take half an hour to load, etc.

How is it different from the usual speed optimization exactly?

Sometimes the page can load in seconds, but Core Web Vitals still doesn’t pass. 

That is, the page has loaded completely, but then the content jumps all over the place: an ad banner or something else shifts and hides the menu button. The user, instead of clicking the button, clicks the ad banner and is redirected to another website.

Here is what Google has to say about defining Core Web Vitals: 

Core Web Vitals is a subset of Web Vitals that applies to all web pages and started rolling out in May 2021. It provides unified guidance on the factors used to assess the quality of a website. It can be measured both in the lab (a controlled environment), using tools that simulate a page load, and in the field, when real users load and interact with the page. Currently, there are three main metrics:

  • Largest Contentful Paint (LCP). Evaluates loading speed. It should take less than 2.5 seconds to offer a pleasant user experience.
  • First Input Delay (FID). Evaluates interactivity. Pages with a FID of 100 milliseconds or less can be called user-friendly.
  • Cumulative Layout Shift (CLS). A visual stability metric. The CLS should be kept at 0.1 or less.

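For the field side of that definition, Google publishes the open-source web-vitals JavaScript library, which reports each metric from real user sessions. Here is a minimal sketch, assuming the v3-style onLCP/onFID/onCLS API and a hypothetical /analytics collection endpoint:

```ts
// Field (real-user) measurement sketch using the open-source `web-vitals` package.
// Assumes the v3-style API (onLCP / onFID / onCLS); earlier versions expose
// getLCP / getFID / getCLS instead, and newer versions replace onFID with onINP.
// The /analytics endpoint is hypothetical.
import { onLCP, onFID, onCLS, type Metric } from 'web-vitals';

function sendToAnalytics(metric: Metric): void {
  // sendBeacon survives page unloads, which is when CLS is usually finalized.
  const body = JSON.stringify({
    name: metric.name,   // 'LCP' | 'FID' | 'CLS'
    value: metric.value, // milliseconds for LCP/FID, unitless score for CLS
    id: metric.id,       // unique per page load, useful for deduplication
  });
  navigator.sendBeacon('/analytics', body);
}

onLCP(sendToAnalytics);
onFID(sendToAnalytics);
onCLS(sendToAnalytics);
```
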
Google provides the PageSpeed Insights tool to run manual tests. 

As an agency, do we optimize for page speed alone, or are we switching to the broader Core Web Vitals standard? How does it work?

Personally, I now mostly work on the Webpals project that has 10 websites in development. 

A dedicated rgbcode team has worked on it for over a year, and we try to code everything we build so that it passes.

Webpals is a pretty important client. The company has many divisions and the specific one that we work with is related to online betting. Well, not exactly involved with bets themselves, but we mainly work with websites about sports, which have links that redirect to bookmakers’ websites. So, we are engaged in developing sports content websites that a lot of people visit. 

One of these websites has traffic of more than 100,000 visitors per month, and about 5,000 people per day. That’s why web vitals and loading speed in general are so important.

Tools

The conversation then moved to discuss tools used for checking Core Web Vitals.

Manual checks

Is there a single solution, or are there only general recommendations and tools like PageSpeed Insights for checking this information?

The majority (90%) of the indicators can be derived via PageSpeed Insights. It shows where you should pay more attention and specifically which element, script, or style doesn't meet the requirements.

PageSpeed Insights suggests what exactly the problem is, and the developer must find a solution to this issue.
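
For scripted checks, PageSpeed Insights also exposes a public REST API (v5). Below is a rough sketch; the response field paths follow the documented shape but should be verified against a live response, and an API key is recommended for more than occasional requests:

```ts
// Query the PageSpeed Insights v5 API for a single URL.
// Treat the response field paths as assumptions and verify them against a real response.
const PSI_ENDPOINT = 'https://www.googleapis.com/pagespeedonline/v5/runPagespeed';

async function checkPage(url: string, strategy: 'mobile' | 'desktop' = 'mobile') {
  const params = new URLSearchParams({ url, strategy, category: 'performance' });
  const res = await fetch(`${PSI_ENDPOINT}?${params}`);
  if (!res.ok) throw new Error(`PSI request failed: ${res.status}`);

  const data = await res.json();
  return {
    // Lab score, 0..1 — multiply by 100 for the familiar 0..100 number.
    performanceScore: data.lighthouseResult?.categories?.performance?.score,
    // Field data from CrUX, available only if the page has enough real-user traffic.
    fieldLcpMs: data.loadingExperience?.metrics?.LARGEST_CONTENTFUL_PAINT_MS?.percentile,
    // CLS is reported as an integer here (typically the CLS value × 100).
    fieldCls: data.loadingExperience?.metrics?.CUMULATIVE_LAYOUT_SHIFT_SCORE?.percentile,
  };
}

checkPage('https://example.com').then(console.log).catch(console.error);
```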

As I understand it, there are other tools besides PageSpeed Insights, like Lighthouse?

Well, Lighthouse is the same as PageSpeed Insights, it's just an extension for Google Chrome.

I don't like Lighthouse. I rarely use it, and only in specific situations, because it doesn't always give correct results: it sometimes skips additional caching, or vice versa. It doesn't show the real results that PageSpeed Insights can show.

Continuous Live Monitoring

For example: a client has a website with over 200,000 pages and wants to monitor them continuously. Are there any solutions for that?

For starters, there is Google Search Console. It's kind of slow to reflect changes, but it's free and built by Google.

There are also commercial tools like Sentry. But in our case, we built our own tool that saves the data on each deployment to production, before and after the deploy.

Monitoring During Build Time

Is it possible to monitor Core Web Vitals at build time, i.e. when we push code to the server?

Funny that you ask: a couple of months ago our team implemented a script that does just that. The script monitors .git commits and checks the list of pages from a special file.

The script automatically traverses the entire list and gives results for each page, showing approximately how many points it scores. There is a problem with the Google tools themselves, though: each run gives slightly different results, varying by up to 10 points, so a single run doesn't always guarantee that the results are correct.
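
A rough sketch of what such a check could look like, assuming the page list lives in a plain pages.txt file and reusing the public PageSpeed Insights API; the file name, run count, and output format are illustrative, and averaging several runs is there precisely because a single run can swing by up to 10 points:

```ts
// Build-time Core Web Vitals check: read a list of URLs, run PageSpeed Insights
// several times per page, and report the average performance score.
// Requires Node 18+ for the built-in fetch. pages.txt and RUNS_PER_PAGE are illustrative.
import { readFileSync } from 'node:fs';

const PSI = 'https://www.googleapis.com/pagespeedonline/v5/runPagespeed';
const RUNS_PER_PAGE = 3; // single runs can vary by ~10 points, so average a few

async function score(url: string): Promise<number> {
  const params = new URLSearchParams({ url, strategy: 'mobile', category: 'performance' });
  const res = await fetch(`${PSI}?${params}`);
  const data = await res.json();
  return Math.round((data.lighthouseResult?.categories?.performance?.score ?? 0) * 100);
}

async function main(): Promise<void> {
  const pages = readFileSync('pages.txt', 'utf8').split('\n').filter(Boolean);
  for (const url of pages) {
    const runs: number[] = [];
    for (let i = 0; i < RUNS_PER_PAGE; i++) runs.push(await score(url));
    const avg = Math.round(runs.reduce((a, b) => a + b, 0) / runs.length);
    console.log(`${url}: ${avg} (runs: ${runs.join(', ')})`);
  }
}

main().catch((err) => { console.error(err); process.exit(1); });
```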

Improvement

As an agency, we mainly work with WordPress-powered websites – does it make any difference in terms of Core Web Vitals?  

No. Core Web Vitals apply to all websites in the same way. The tools are pretty much the same too; it's just that WordPress often has plugins for this that work out of the box.

What are these tools?

I won't go into much detail about things like script/CSS concatenation and minification. There is plenty of info about that already.

Other than that, it's just a matter of caching and, for bigger projects, CDN implementation.
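
One quick way to verify that caching and a CDN are actually kicking in is to inspect response headers. Here is a small sketch; header names like x-cache and cf-cache-status are CDN-specific examples rather than universal standards:

```ts
// Check caching-related response headers for a few URLs.
// 'x-cache' and 'cf-cache-status' are CDN-specific examples, not universal headers.
async function inspectCaching(urls: string[]): Promise<void> {
  for (const url of urls) {
    const res = await fetch(url, { method: 'HEAD' });
    console.log(url, {
      cacheControl: res.headers.get('cache-control'),
      cdnCache: res.headers.get('x-cache') ?? res.headers.get('cf-cache-status'),
      contentEncoding: res.headers.get('content-encoding'), // gzip / br expected
    });
  }
}

inspectCaching(['https://example.com/']).catch(console.error);
```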

Any handy WordPress plugins?

We tried different plugins for optimization: WP Rocket, Swift, and Total Cache.

So far, we've settled on WP Rocket, which gives the best results. It is also the most convenient to work with, so we switched to it completely.

What about a CDN?

It supports that as well; there's a setting for it. But we don't even use it, because the results are excellent even without it. Everything we need, all the scripts and styles, is loaded from our own server, already optimized.

Benchmarks 

There is research on Web Vitals from Impression UK, gathered in mid-2020.

They analyzed data from the 50 most popular news sites worldwide. The data was collected through the Chrome User Experience Report (CrUX) and covers the average scores for LCP, FID, and CLS.

Result | LCP | FID | CLS
Good | under 2.5 sec. | under 100 ms | less than 0.1
Bad | more than 4 sec. | more than 300 ms | more than 0.25
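
Expressed in code, those thresholds amount to a simple classifier (a sketch; the band between the "good" and "bad" cut-offs is what Google labels "needs improvement"):

```ts
// Classify measured values against the thresholds from the table above.
type Rating = 'good' | 'needs improvement' | 'poor';

const THRESHOLDS = {
  LCP: { good: 2500, poor: 4000 }, // milliseconds
  FID: { good: 100, poor: 300 },   // milliseconds
  CLS: { good: 0.1, poor: 0.25 },  // unitless layout-shift score
} as const;

function rate(metric: keyof typeof THRESHOLDS, value: number): Rating {
  const t = THRESHOLDS[metric];
  if (value <= t.good) return 'good';
  if (value <= t.poor) return 'needs improvement';
  return 'poor';
}

console.log(rate('LCP', 2300)); // 'good'
console.log(rate('CLS', 0.18)); // 'needs improvement'
console.log(rate('FID', 350));  // 'poor'
```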

Score 100/100

Is it possible to squeeze 100 points out of 100, both on mobile and desktop, and get into the green zone of Web Vitals?

Yes, it's possible, but it might cost a lot, especially for established projects with a history (read: legacy code bases).

What I mean is that tackling the low-hanging fruit gets you to a 75-90 score. Getting to 100/100 often means rebuilding everything from scratch, which is a different kind of effort.

Infamous 20/80 rule, I guess.

Can you give some kind of development cost estimate? Let's say a client has a website built with some Premium High Luxury Deluxe WordPress theme from Envato, and the client is happy in general but would like to squeeze out a 100/100.

As an agency, we use in-house boilerplates instead of, say, buying ready-made themes. We do the same with plugins, so everything is much easier to work with later.

Developers of ready-made themes are very fond of embedding other developers' plugins right into the core of the theme itself, and changing anything there is very problematic, pretty much impossible.

It is better to spend 10 hours creating something from scratch than to spend 100 hours later fixing ready-made "spaghetti code."

How much time does it usually take though?

I recently worked on a website that was pretty much optimized, but stayed in the yellow zone with a score of about 50-60. It took me about 20-30 hours to optimize.

Summary 

There’s no one quick fix for getting Core Web Vitals in line. But it’s not an impossible task, either. By following Google guidelines and building websites for users with love and care, you’ll get pretty far. And the rest is down to consistent monitoring. The numbers don’t lie.   

And skip the spaghetti code-filled themes, too. 

One More Thing…

Are there any other topics that would be cool to talk about?

It would be cool to talk about how we started using Gutenberg blocks at rgbcode.

We’ve pretty much ditched ACF (Advanced Custom Fields) and now build custom Gutenberg blocks with React.

A content manager can see what is located where, and can handle content management and modifications with ease.

My teammate Vitaly knows about it more than anyone else, because he kickstarted the whole thing.
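
For the curious, the skeleton of a custom block built on the official Gutenberg APIs looks roughly like this; the block name and its single attribute are made up for illustration:

```tsx
// Minimal custom Gutenberg block sketch. The block name and its attribute are
// illustrative; registerBlockType, useBlockProps, and RichText come from the
// official @wordpress/* packages.
import { registerBlockType } from '@wordpress/blocks';
import { useBlockProps, RichText } from '@wordpress/block-editor';

type Attrs = { content: string };

registerBlockType('rgbcode/example-heading', {
  title: 'Example Heading',
  category: 'text',
  attributes: {
    content: { type: 'string', default: '' },
  },
  // The edit component renders inside the block editor.
  edit: ({ attributes, setAttributes }: { attributes: Attrs; setAttributes: (a: Partial<Attrs>) => void }) => (
    <RichText
      {...useBlockProps()}
      tagName="h2"
      value={attributes.content}
      onChange={(content: string) => setAttributes({ content })}
    />
  ),
  // The save component defines the markup stored in the post content.
  save: ({ attributes }: { attributes: Attrs }) => (
    <RichText.Content {...useBlockProps.save()} tagName="h2" value={attributes.content} />
  ),
});
```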

Until next time! 
