Before you start: if you're unfamiliar with the principles of statistical SEO split-testing and how SplitSignal works, we suggest you start here or request a demo of SplitSignal.
This time, the majority of our followers were WRONG! Read on to find out why.
But first, let’s see what SEO professionals had to say about the result of this test:
Follow us on LinkedIn to share your thoughts on the next test.
One of the main ways people determine which search results may be relevant to their query is by looking at the titles of the web pages displayed in the search results. In addition to being an important place to include relevant keywords to facilitate ranking, page titles can also persuade users to click on your search result rather than the other results you compete with. Finding the optimal page title often requires extensive testing and trade-offs between including relevant keywords and writing titles that grab users' attention to improve click-through rate (CTR).
The benefit of fresh content varies by search query, as different searches have different freshness needs. Looking at the Google Search Quality Rating Guidelines (October 19, 2021), we know that for product- or service-related queries, users usually want up-to-date content; someone searching for ferry tickets, for example, is looking for information about future departures, not past ones.
Over the past year, we ran several tests involving freshness signals in the page title, all of which had different results. For a furniture retailer, for example, we added "Updated 2022" to the beginning of their page titles.
The website in question is not the one illustrated in the examples below!
This test resulted in a 4.9% increase in organic clicks to the tested pages.
We tried the same test setup for a large e-commerce website.
This test had no significant impact on the pages tested.
From running hundreds of split tests, we know that something that works for one website may not work for another. Therefore, we wanted to validate whether content freshness signals were a factor (for Google and/or users) on the detail pages of a major travel website.
The website in question wanted to test if adding the current month and year to the page titles of their detail pages would have a significant impact on organic traffic.
They hypothesized that this change would make their results stand out more from the other search results. By signaling fresh content and up-to-date listings, they hoped to convince users to click through to the website.
They added the current month and year to the end of the page title with the expectation that this would have a positive effect on the organic traffic to the tested pages.
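To make the change concrete, here is a minimal sketch of the kind of title templating involved. The function name and example title are hypothetical; the site's actual implementation isn't public.

```python
# Hypothetical sketch of the tested change: append the current
# month and year to an existing page title.
from datetime import date

def freshness_title(base_title: str, today: date) -> str:
    """E.g. 'Ferry Tickets Amsterdam' -> 'Ferry Tickets Amsterdam - June 2023'."""
    return f"{base_title} - {today.strftime('%B %Y')}"

print(freshness_title("Ferry Tickets Amsterdam", date(2023, 6, 1)))
# Ferry Tickets Amsterdam - June 2023
```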
We used SplitSignal to make the page title change for the tested pages. In total, about 900 detail pages were selected as either control or variant. We started the test and ran it for 16 days. We found that Googlebot visited 73% of the pages tested.
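SplitSignal handles the page selection itself, but conceptually the split looks something like the sketch below. The path pattern, seed, and 50/50 ratio are assumptions for illustration, not the tool's actual selection logic.

```python
# Illustrative control/variant split over ~900 detail pages.
# Not SplitSignal's actual selection logic; paths are made up.
import random

random.seed(7)
pages = [f"/detail/{i}" for i in range(900)]
random.shuffle(pages)
midpoint = len(pages) // 2
control, variant = pages[:midpoint], pages[midpoint:]
print(len(control), len(variant))  # 450 450
```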
For this particular website, adding the current month and year to the page title resulted in a 6.6% decrease in organic clicks to the tested pages.
After 10 days, as the test progressed and more and more test pages were picked up by Google, we could see that the test was heading for a negative result.
After 14 days, we were able to determine that the effect we saw was significant. When the blue shaded area (the confidence interval) falls entirely below or above the x = 0 axis, the test is statistically significant at the 95% level. This means we can be confident that the effect we see is due to the change we made and not to other (external) factors.
Note that we are not comparing the actual control group pages to our tested pages, but rather a forecast based on historical data. The model predicts the counterfactual response that would have occurred had no intervention taken place, and we compare this with the actual data. We use a set of control pages to give the model context for trends and external influences. If something else changes during our test (e.g., seasonality), the model detects it and takes it into account. By filtering out these external factors, we gain insight into the true impact of an SEO change.
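As a rough illustration of this counterfactual approach, the sketch below fits a simple pre-test relationship between control and variant clicks, forecasts what the variant pages would have done without the change, and checks whether the 95% interval excludes zero. This is a minimal stand-in with simulated numbers, not SplitSignal's actual model; production tools typically rest on more sophisticated structural time-series methods (e.g., CausalImpact).

```python
# Minimal counterfactual sketch with simulated data.
# All numbers are invented; this is not SplitSignal's model.
import numpy as np

rng = np.random.default_rng(42)

# Simulated daily organic clicks: 28 pre-test days, 16 test days.
control_pre = rng.poisson(1000, 28).astype(float)
variant_pre = 0.98 * control_pre + rng.normal(0, 15, 28)

# Pre-test relationship: variant clicks as a linear function of control clicks.
slope, intercept = np.polyfit(control_pre, variant_pre, 1)

# Counterfactual: predicted variant clicks during the test, had nothing changed.
control_test = rng.poisson(1000, 16).astype(float)
counterfactual = slope * control_test + intercept

# Observed variant clicks, simulated here with a -6.6% effect baked in.
variant_test = 0.934 * counterfactual + rng.normal(0, 15, 16)

# Relative effect of the change versus the counterfactual forecast.
effect = (variant_test.sum() - counterfactual.sum()) / counterfactual.sum()

# Crude 95% interval from the pre-test residual spread.
residuals = variant_pre - (slope * control_pre + intercept)
se = residuals.std() * np.sqrt(len(variant_test)) / counterfactual.sum()
lo, hi = effect - 1.96 * se, effect + 1.96 * se

print(f"estimated effect: {effect:+.1%} (95% CI {lo:+.1%} to {hi:+.1%})")
# If the whole interval lies below zero (the shaded band never touches the
# zero line), the negative effect is statistically significant at 95%.
```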
As mentioned at the beginning, we have seen content freshness tests with different results. This is a test that influences user behavior. In this case, the change made the result stand out, but users clicked on the search results less often than expected.
Data analysis shows that this test affected the click-through rate (CTR) of the tested pages. Rankings and impressions remained stable compared to our modeled control group, so the decrease in clicks appears to be driven purely by the behavior of Google users.
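Since impressions and rankings held steady, the entire drop shows up in CTR. The arithmetic below uses made-up numbers purely to illustrate that relationship.

```python
# Illustrative only: with impressions stable, a drop in clicks is exactly
# a drop in CTR (CTR = clicks / impressions). Numbers are invented.
impressions = 100_000
expected_clicks = 3_000                          # counterfactual clicks
observed_clicks = expected_clicks * (1 - 0.066)  # the measured -6.6% effect

expected_ctr = expected_clicks / impressions
observed_ctr = observed_clicks / impressions
print(f"CTR: {expected_ctr:.2%} -> {observed_ctr:.2%}")  # 3.00% -> 2.80%
```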
Knowing what users expect and what they find appealing can make a big difference in how they interact with your search result snippets. For this test, we now know that this particular content freshness signal does not match user expectations. Optimizing page titles isn't just about including your main keyword(s); there is much more to writing appealing and effective page titles.
With split testing, you want to ensure that the changes you make will yield positive results. We want to prevent organizations from wasting time, money, and energy on changes that ultimately make little or no positive contribution. Testing allows you to move quickly and implement proven positive changes.
Finding ways to stand out and be the most relevant answer to a search query is essential for optimal organic performance. Keep in mind that something that works for one website may not work for another. The only way to know for sure is to test what works for you!