
Google favours websites in its search engine results that it considers technically high-quality.
A website serves no purpose without appropriate content. Hence, search engine optimisation has traditionally focused on the textual content of a site.
However, in 2020 Google announced it would begin to emphasise the user experience of a website in its search results. It justified this decision by stating that users prefer sites that load quickly and work well on mobile devices. Consequently, Google has started measuring the performance efficiency of a website when assessing its relevance as a search result.
The update focusing on the technical quality of websites has been in production since August. In the midst of this reform, one thing remains unchanged: Google is a monopoly, and the opinions of other search engines have no practical significance for optimisation.
Therefore, today at the very latest, it is advisable to complete the homework that King Google has assigned.
Google openly explains how it measures the technical quality of your website
It is in Google's interest that the internet is full of high-quality content that users of the search engine can find appealing. That's why Google openly explains how it evaluates the technical quality of your content and provides personalised suggestions for improvements via the PageSpeed Insights test, which is part of the broader Lighthouse test.
The general practical rule of thumb is that a technically simpler user interface implementation is better than a complex one.
How can you test Google's perception of your website's technical quality?
Before delving into the metrics used by Google, it's a good moment to have PageSpeed Insights or the broader Lighthouse test running in the background.
If you wish to run a Lighthouse test in your Chrome browser, open the developer tools on your webpage, select the “Lighthouse” tab, and press “Generate report”. Note that the results of a Lighthouse test run in your own browser depend on your workstation's performance. Also disable ad blockers so you see the unvarnished truth.
The core of technical quality assessment lies in Core Web Vitals
Core Web Vitals is Google's term for three measured properties of a site that visibly affect the user's experience.
First, I list the conventional methods and tricks that I presume are already in use by default:
The server is not a potato dug out of a Kyrgyz root cellar, nor is that potato kept busy with unindexed database queries in loops or other deadly programming sins.
Sensible caching solutions are utilised for serving the document and the media files it contains.
CSS and JavaScript are minified.
Animations are executed with CSS and not with jQuery or similar inventions, as was necessary two decades ago.
Your site does not mine cryptocurrencies on the end user's device.
Next, I attempt to explain as clearly as possible what the Core Web Vitals metrics are and what usually ruins them. Even though PageSpeed Insights provides direct suggestions such as “reduce unused JavaScript”, “remove render-blocking resources”, and “minimise main thread work”, understanding what these mean and how to address them can sometimes be elusive.
LCP: Largest Contentful Paint
LCP is a metric measuring the loading speed of a site. It identifies the largest content element visible in the end user's viewport before any scrolling, and measures the time from when the page starts loading until that element has been rendered. The target time is 2.5 seconds or less.
LCP considers the following elements:
<img>
<image> within an <svg> element
<video>
Elements with a background image
Any block-level element containing text directly or in inline elements.
The most common pitfalls:
A carousel of Vantaa-sized proportions with 26 images that are not lazy loaded.
A Titanic and Ben Hur mash-up as the site’s hero video.
A stock photo of 9001 x 9001 pixels featuring joyous hand clappers in a meeting room displayed on a 400 x 800 screen.
How to fix LCP?
Use lazy loading for images that are not visible first.
Load video content only after user interaction and carefully consider whether a multi-thousand-euro drone-shot video of your premises is truly a sensible investment.
Serve an image suitable for the screen size, for instance, using a <picture> element.
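The image-related fixes above can be sketched in markup along these lines. The file names, dimensions, and breakpoint are illustrative placeholders, not values from any real site:

```html
<!-- Serve a size-appropriate image with a <picture> element.
     File names, sizes, and the breakpoint are placeholders. -->
<picture>
  <!-- Smaller crop for narrow screens -->
  <source media="(max-width: 800px)" srcset="meeting-800.jpg">
  <!-- Default image for larger screens -->
  <img src="meeting-1600.jpg" alt="People in a meeting room"
       width="1600" height="900">
</picture>

<!-- Images below the fold: let the browser defer loading them -->
<img src="carousel-slide-2.jpg" alt="Second slide"
     width="1200" height="675" loading="lazy">
```

Note that the hero image itself, the likely LCP element, should not be lazy loaded: deferring it only pushes LCP further out.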
FID: First Input Delay
FID is a metric measuring the site's interactivity. The input delay refers to the time from the user's interaction until the browser is able to start responding. In practical terms, input delay occurs because the browser has rendered an element but is not ready to process interaction with it. The target time is 0.1 seconds or less.
In almost every case, this is due to the browser being busy processing megabytes of JavaScript, thereby not promptly reacting to user input.
The most common pitfalls:
JavaScript used on the site is given to the browser for processing regardless of whether it’s needed for that particular page load.
Native browser features are reinvented with custom JavaScript.
A third-party chat service is loaded on the site without any user interaction or other indication that it is needed.
Third-party bloated scripts are loaded synchronously, with your own site patiently waiting its turn.
How to fix FID?
Load JavaScript dynamically, either with native dynamic import() or via a bundler such as Webpack.
This is almost always the single action that most improves the site's performance efficiency.
Consider whether your site's JavaScript should generally be put on a slimming regime.
Use native browser features, and when you wrap them in new interfaces, keep the wrapping as lightweight as possible.
Carefully consider whether the site's chat is an essential sales tool needing automatic inclusion on every page load. Endeavour to have the end user make their own choice about chat usage and only then load the chat script.
If you need third-party scripts, include them early in the HTML file but use the async or defer attribute.
Note that this is also an easy way to accidentally break your site's analytics: load analytics and other user-tracking scripts synchronously at the beginning of the HTML file.
If using Tag Manager or a similar product, include the script immediately, leveraging the aforementioned attributes.
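The script-loading advice above can be sketched like this. The script URLs and the button are hypothetical placeholders; real chat vendors usually document their own snippet:

```html
<!-- Third-party script: include early, but async so parsing
     is not blocked. The URL is a placeholder. -->
<script src="https://tagmanager.example.com/gtm.js" async></script>

<!-- Chat: load the vendor script only when the user asks for it -->
<button id="open-chat">Chat with us</button>
<script>
  document.getElementById('open-chat').addEventListener('click', () => {
    const s = document.createElement('script');
    s.src = 'https://chat.example.com/widget.js'; // placeholder URL
    s.async = true;
    document.head.appendChild(s);
  }, { once: true }); // only inject the script once
</script>
```

The same on-demand pattern works for any heavy third-party widget: the browser does no work for it until the user signals they actually want it.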
CLS: Cumulative Layout Shift
CLS is a metric that measures the visual stability of the site. A layout shift occurs when elements like text, images, and buttons move from their original place as the site loads. The more elements move, the higher the CLS score. The goal is 0.1 or less. Because the calculation itself is mechanical and well documented, I will not repeat here how the CLS score is computed, but you might want to read Google's description of it. The real challenge is getting into the habit of considering CLS before Google gives you a hard time about it.
The most common pitfalls:
No space is reserved in advance for the area an image requires.
Elements are only added to the site once their content has been fetched asynchronously from elsewhere.
How to fix CLS?
Reserve space in advance for images, either by stating the display size of the image directly in the element or using CSS.
Include elements reliant on asynchronous content already server-side, and reserve the appropriate space for the fetched data.
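Both fixes amount to telling the browser the final size of a box before its content arrives. A minimal sketch, with placeholder file names and a hypothetical banner class:

```html
<!-- width and height let the browser compute the aspect ratio
     and reserve the image's space before it loads -->
<img src="hero.jpg" alt="Hero image" width="1200" height="600">

<style>
  /* Reserve space in CSS for an asynchronously filled element
     whose final height is known, here assumed to be 90px */
  .chat-banner { min-height: 90px; }
</style>
<div class="chat-banner"><!-- content fetched later --></div>
```

With the space reserved, the content around these elements stays put when they finally render, which is exactly what CLS rewards.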
Debug CLS with Chrome’s developer tools.
When you enable layout shift highlighting in Chrome's developer tools, shifting elements flash blue during page loading.
Google uses other metrics besides Core Web Vitals to judge your site's performance efficiency
FCP: First Contentful Paint
FCP measures how long it takes to paint the first piece of content on the screen. The target time is less than 1.8 seconds. The most common pitfall here is the slow loading of fonts from a third-party service. By using the CSS rule font-display: swap; you tell the browser that it can use a system font while waiting for the final font to load.
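The font-display rule mentioned above lives in the @font-face declaration. A minimal sketch, with a placeholder font name and file path:

```html
<style>
  /* The font family name and file path are placeholders.
     font-display: swap shows text immediately in a fallback
     font and swaps in the web font once it has loaded. */
  @font-face {
    font-family: "BrandSans";
    src: url("/fonts/brandsans.woff2") format("woff2");
    font-display: swap;
  }
  body { font-family: "BrandSans", sans-serif; }
</style>
```

The trade-off is a brief flash of differently styled text, which most sites accept in exchange for content being readable right away.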
Serving fonts efficiently is generally not easy; it is worth seeking out tips for optimising the load speed of fonts served from Google Fonts.
TTI: Time to Interactive
TTI measures how quickly the page reaches a state where the input delay stays under 0.05 seconds. The target time is less than 3.8 seconds. The less JavaScript the browser has to execute, the better the TTI.
Speed Index
The Speed Index measures how quickly content becomes visible. By optimising FCP and TTI, you have optimised the Speed Index. The target time is less than 3.4 seconds.
TBT: Total Blocking Time
TBT measures how long, in total, the site is unable to receive user input. It is calculated by summing, for each main-thread task that runs longer than 0.05 seconds, the portion exceeding that limit. The goal is less than 0.2 seconds. TBT is another metric you improve by optimising and reducing JavaScript.
PageSpeed Insights is not the only tool Google uses for quality assessment
As I mentioned earlier, PageSpeed Insights is just one part of the broader Lighthouse testing. Lighthouse assesses not only a site's performance efficiency but also its accessibility, best practices – in practice security and reliability – and search engine friendliness. Optimising a site's performance efficiency can naturally be bundled into broader search-engine-oriented improvement work.
Optimising site performance efficiency might come at a cost to other things
So why don't all sites built by Crasman score full marks on Google's performance assessments? Simply because it is not what our clients purchase from us.
The platform your site uses might influence the results
Common website management systems like WordPress, Magento, and HubSpot include JavaScript on the site that we as a supplier can't do anything about. Crasman Stage does not itself include JavaScript on the website; it is entirely under the developer’s control – except for possible Tag Manager additions.
Ultimately, it’s about priorities, a golden mean, and efficient use of money
The inherent faults related to site performance efficiency in modern web development have not arisen spontaneously, but from efforts to meet the needs of website commissioners and end users.
The budget for developing a comprehensive site project remains within the limits of the human mind's understanding and tolerance only by utilising pre-existing open source code. Chat tools have been seen as beneficial as support for customer service and sales work. The benefits have been perceived to outweigh the drawbacks.
The visual impact of websites is a way to emphasise one's brand and make a favourable difference to competitors. Advertising done on social media is extremely effective, but requires various tracking scripts and pixels to function on your site.
It is thus good to remember that search engine visibility is only a small part of the whole that constitutes a website, and performance efficiency is in turn only a small part of that.
Ultimately, the most important thing in all online business is to serve the customer as well as possible by offering them content they want. It is also not meaningful to conduct business online if the costs become so high that the commerce it brings does not cover them.
Crasman Ltd
25 Nov 2021


