All web optimization roads lead to Google’s Core Web Vitals: Q&A with Trainline

Almost everyone agrees on the theory of web performance optimization, but what does it take to put the theory into practice?

With Google’s new Core Web Vitals update expected to roll out in the next month, using it to evaluate real-world web user experience will be a huge benefit when the time comes to track your website’s SEO performance.

Core Web Vitals reports the performance of your pages based on real-world usage data, monitoring a range of metrics related to speed, responsiveness, and visual stability to help website owners measure user experience on the web.

Just before Core Web Vitals goes live as a ranking signal, Carl Anderson, Director of Engineering at Trainline, sat down virtually with TechRadar Pro to discuss how his team has been working towards the launch.

Why is it important for customers that web performance factors into ranking?

Of course, relevance remains the most important factor in page ranking. However, now that your website’s performance is also taken into account via load time, interactivity, and visual stability, it’s worth tuning these factors to increase your ranking potential, reach more customers, and provide a better user experience for more people.

How did your team prepare for Core Web Vitals becoming a ranking signal, and why is that preparation important?

We have always focused on optimizing our website for the user experience. However, the introduction of Core Web Vitals has given us the opportunity to leverage our investments to ensure we are always one step ahead when it comes into effect. I lead the front-end teams at Trainline, but it’s been a cross-functional effort – working closely with the back-end to make sure we’re optimizing performance at all levels.

We started by establishing a baseline, as this is the most important step. From there we built upwards, layering the back end, then the APIs, and then the website itself on top; the user experience is the sum of all of these layers. A baseline lets you analyze your performance, and on that basis you can form your hypotheses.

For example, we found that we could optimize communication with our data platform by pooling HTTP connections in our web application, which shaved seconds off the entire booking flow. Measurement was then key to ensuring that we could learn from our approach and iterate on what we had built.
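Connection pooling avoids paying the TCP/TLS handshake cost on every request to a backend. Below is a minimal sketch of the idea using Python’s widely used `requests` library; the host name, pool sizes, and retry policy are illustrative assumptions, not Trainline’s actual configuration.

```python
# A minimal sketch of HTTP connection pooling with the `requests` library.
# Hosts and pool sizes here are invented for illustration.
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

def make_pooled_session(pool_size: int = 20) -> requests.Session:
    """Build a Session that reuses TCP/TLS connections across requests,
    avoiding a fresh handshake for every call to a backend service."""
    session = requests.Session()
    adapter = HTTPAdapter(
        pool_connections=pool_size,  # number of per-host pools to cache
        pool_maxsize=pool_size,      # connections kept alive per host
        max_retries=Retry(total=3, backoff_factor=0.2),
    )
    session.mount("https://", adapter)
    session.mount("http://", adapter)
    return session

session = make_pooled_session()
# Every call made through `session` now reuses pooled connections, e.g.:
# session.get("https://api.example.com/journeys")
```

Because the savings multiply across every backend call in a multi-step flow such as a booking, even a small per-request saving can add up to seconds end to end.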

Creating the best user experience depends on several factors, including perceived load time, interactivity, and visual stability, and on balancing them so that the site loads and behaves as the user expects. The bottom line is that there isn’t a single speed metric a team should align itself with because, in our experience, optimizing one often means compromising another, which ultimately affects the user experience.
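The three signals mentioned here map directly onto the original Core Web Vitals: Largest Contentful Paint (load), First Input Delay (interactivity), and Cumulative Layout Shift (visual stability). A toy sketch of judging a page on all three at once, using Google’s published “good” thresholds; the sample measurements are made up:

```python
# Google's published "good" thresholds for the original Core Web Vitals.
# The sample page measurements below are invented for illustration.
GOOD_THRESHOLDS = {
    "lcp_s": 2.5,    # Largest Contentful Paint (perceived load time)
    "fid_ms": 100,   # First Input Delay (interactivity)
    "cls": 0.1,      # Cumulative Layout Shift (visual stability)
}

def assess(page_metrics: dict) -> dict:
    """Return a per-metric verdict; a page only passes Core Web Vitals
    when all three values sit within the good band."""
    return {name: page_metrics[name] <= limit
            for name, limit in GOOD_THRESHOLDS.items()}

sample = {"lcp_s": 2.1, "fid_ms": 180, "cls": 0.05}
print(assess(sample))
# Fast load and stable layout, but poor interactivity drags the page down:
# {'lcp_s': True, 'fid_ms': False, 'cls': True}
```

The point the sketch makes is the one in the paragraph above: a page can look fast on one axis while failing on another, so no single number tells the whole story.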

We have aligned our approach with the priorities of the user. For example, on our home page, we’ve focused on making the Travel Search widget as interactive as possible so customers don’t have to wait for other items to load before they can start entering their query. Search is a key component of the Trainline user experience, which is why we focused on it.

SEO analysis (Image credit: Pixabay)

How has your web performance optimization journey gone so far?

We have been on the road to optimizing our web performance for the customer experience for a long time, but it was great to see our approach validated by the introduction of Core Web Vitals.

It’s an ongoing process: the more we build, the more we need to re-evaluate and adjust to keep performance optimized. We are constantly building on the product, with over 300 releases per week, while ensuring this doesn’t hurt performance and, ideally, makes it even faster. Measurement has been a key factor throughout the journey. It ensures we continuously collect data in a consistent and reliable manner, so we can see how performance is trending and where we can improve.

How did your team learn to separate actionable insights from reporting noise?

It’s about getting the right metrics in place. In the beginning we measured everything, but the breakthrough came when we started correlating our metrics with both our business metrics and our web deployments: we could see that some releases had negligible impact while others slowed performance down or sped it up, which let us tune our optimizations for the user experience. Linking this to our business metrics then allowed us to prioritize our actions.
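The correlation step can be sketched very simply: group real-user load-time samples by the release that served them, then flag releases whose median shifts noticeably against the previous one. This is a hedged illustration, not Trainline’s tooling; the data, release names, and threshold are invented.

```python
# Sketch: flag releases whose median load time moves by more than
# `threshold` relative to the previous release. All data is invented.
from statistics import median

def flag_regressions(samples_by_release: dict, threshold: float = 0.10):
    """Return (release, relative_change) pairs for releases whose median
    load time changed by more than `threshold` vs. the prior release."""
    releases = list(samples_by_release)
    medians = [median(samples_by_release[r]) for r in releases]
    flags = []
    for cur, prev_m, cur_m in zip(releases[1:], medians, medians[1:]):
        change = (cur_m - prev_m) / prev_m
        if abs(change) > threshold:
            flags.append((cur, round(change, 3)))
    return flags

samples = {
    "r101": [1.8, 2.0, 1.9],   # seconds to load, per page view
    "r102": [1.9, 2.0, 1.8],   # negligible impact
    "r103": [2.6, 2.8, 2.7],   # clear slowdown -> worth investigating
}
print(flag_regressions(samples))  # [('r103', 0.421)]
```

At hundreds of releases per week, automating this kind of check is what separates actionable signal from reporting noise.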

Second, it’s about focusing on the right areas. For example, it is instinctive to solve the problems that affect the customers who are getting the slowest experience. However, to get the most impact, you need to research what affects the majority of your users. That way, you can make a more significant difference to a larger number of your customers.

Finally, I want to point out that web performance averages can be very misleading: they don’t give you a clear picture and can mask the extent of a problem affecting your speed. Using percentiles to focus on specific groups of users instead proved instrumental on this journey.
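A small numeric sketch, with invented load times, shows why averages mislead: when most users are fast but a slow tail exists, the mean looks healthy while the 95th percentile exposes the problem. (Core Web Vitals itself assesses pages at the 75th percentile of real-user measurements.)

```python
# Invented data: 90% of users load in 1.2s, 10% suffer a 9s load.
# The mean hides the slow tail; percentiles surface it.
from statistics import mean, quantiles

load_times = [1.2] * 90 + [9.0] * 10   # seconds per page view

avg = mean(load_times)
cuts = quantiles(load_times, n=100)    # 99 cut points: cuts[74] is p75
p75, p95 = cuts[74], cuts[94]

print(f"mean={avg:.2f}s p75={p75:.2f}s p95={p95:.2f}s")
# mean=1.98s p75=1.20s p95=9.00s
```

The mean of 1.98s suggests everything is fine, yet one user in ten is waiting 9 seconds, which is exactly the kind of masking the percentile view avoids.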

How can you move the needle on web performance?

We can only optimize what we measure, and the combination of synthetic measurement and Real User Monitoring (RUM) is key. Synthetic measurements allow us to compare improvements in exactly the same test environment under the same conditions, which means we can compare the performance of any two versions of our code.

We then test these potential improvements in the field and record user data that provides insight into their real experience. This is really important, as gains in a lab setting don’t always translate into gains in the field.

Ultimately, we focused on where we can make the most difference for the majority of customers in order to have the greatest overall impact. Moving the needle comes down to a combination of smart measurements that will guide you to improve the user experience.
