How Often Should You Perform Technical Website Crawls for SEO?

Any seasoned SEO professional will tell you how important website crawls are for maintaining strong technical SEO.

But here lie the better questions: how frequently should you perform website crawls?

And how often are SEO pros actually performing them?

In this post, we’ll discuss what SEO publications suggest as a “best practice” web crawling cadence, and the rate at which SEO pros are actually performing crawls.

Then, I’ll demonstrate the benefits of a ramped-up web crawling cadence by sharing a case study from FOX.com.

Using specialized tools such as Screaming Frog or DeepCrawl, you can take a look “under the hood” of a website – much like a mechanic does when inspecting cars.

But instead of inspecting the mechanical parts of a car, you are inspecting the optimizable elements of a website – including the quality of its metadata, XML sitemaps, response codes, and more.
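To make the “under the hood” idea concrete, here is a minimal sketch in Python of the most basic check a crawler performs: fetching the XML sitemap and verifying that every listed page returns a healthy response code. The sitemap URL is a hypothetical placeholder, and real tools like Screaming Frog layer on metadata checks, rate limiting, and robots.txt handling that this sketch skips.

```python
# Minimal sketch: fetch an XML sitemap and flag any URL that doesn't
# return HTTP 200. The sitemap URL below is a hypothetical placeholder.
import requests
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://example.com/sitemap.xml"
SITEMAP_NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def check_sitemap_urls(sitemap_url: str) -> None:
    """Report every sitemap URL whose status code isn't 200."""
    sitemap = requests.get(sitemap_url, timeout=10)
    sitemap.raise_for_status()
    root = ET.fromstring(sitemap.content)
    for loc in root.findall(".//sm:loc", SITEMAP_NS):
        url = loc.text.strip()
        status = requests.get(url, timeout=10).status_code
        if status != 200:
            print(f"{status}  {url}")

if __name__ == "__main__":
    check_sitemap_urls(SITEMAP_URL)
```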

When something isn’t working as expected in SEO, it’s up to you to diagnose the problem and find the solution to fix it.

Industry publications seem to be in agreement that “mini” technical audits should be conducted on a monthly basis and “in-depth” technical audits on a quarterly or semiannual basis.

However, there is little chatter specifically discussing the “optimal frequency” for performing website crawls.

A website crawl and technical audit are not the same thing (thank you for the clear separation of the two, Barry Adams).

However, it’s fair to assume that these publications would recommend running a website crawl at least as frequently as their mini-audits: monthly.

Best practice is one thing.

But how often do SEO pros run website crawls for client sites in actual practice?

To get an idea, I took to Twitter. (Yes, Twitter polls do have their obvious limitations – but it’s one of the simplest means to get some tangible data.)

Three days and nearly 2,000 votes later, the results were in:

Approximately 57% of SEO pros who participated in my poll fell into the “monthly or longer” bucket, while 43% fell into the “weekly or shorter” bucket.

In other words, we were all over the map.

You may not be too surprised by these poll results.

After all, both ends of the spectrum could make complete sense – depending on the type and size of the websites you are managing.

That said, I had an experience at FOX two months ago that made me thankful that we run weekly website crawls across our major domains.

I’d like to share it with you all here – in case it encourages you to increase the cadence of your technical website crawls.

In late July 2020, the FOX SEO team ran a routine weekly website crawl of FOX.com and discovered that 100% of our TV episode pages were unexpectedly serving error status codes (due to a bug with the allRoutes.json file).

Although the pages displayed fine for users, they were throwing 404s to Googlebot, making them ineligible to appear in Google search results.
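What made this bug sneaky is that the rendered page and the HTTP status line disagreed. A quick way to catch that class of problem, shown here as a sketch against a hypothetical episode URL (not the actual debugging steps FOX used), is to check the status code directly instead of trusting what the browser displays:

```python
# The page body can render fine for users while the HTTP status line
# still says 404, which is exactly what makes this class of bug easy
# to miss. The episode URL below is a hypothetical placeholder.
import requests

URL = "https://www.fox.com/watch/some-episode/"

response = requests.get(URL, timeout=10)
print(f"Status code: {response.status_code}")      # e.g. 404, despite...
print(f"Body length: {len(response.text)} chars")  # ...a full HTML document
```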

Not only is “watch time” a major KPI for the site, but these pages also generate significant ad revenue for the company.

Needless to say, this crawl surfaced a serious problem.

Thanks to it, we made the diagnosis quickly.

But the solution was complex.

Over the course of three weeks (July 23 – August 13), we homed in on this specific problem and confirmed a steep decline in SEO clicks and impressions from this set of pages.

The bug was fixed by mid-August, and we saw clicks and impressions trend back up toward normal levels.

If we had waited several weeks to a month before running the crawl (which is largely considered best practice), the improper response codes would have done incremental damage to the site’s SEO traffic and ad revenue.

Not only from the problem itself, but also from the added time needed to create and execute a solution.

While it would have been entirely possible to diagnose this bug without a full website crawl, doing so would have been unnecessarily difficult.

The screenshot below illustrates that, without the right Google Search Console (GSC) filters in place, this particular issue stayed hidden within the Search Results report.

The GSC Coverage report was also 10 days behind in flagging errors on our watch pages, and it offered only a sample of affected URLs.
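For what it’s worth, that kind of path-filtered view can also be scripted rather than configured by hand in the UI. The sketch below uses the Search Console API to pull daily clicks and impressions for just one page template; the service-account key file and the “/watch/” path filter are illustrative assumptions, not FOX’s actual setup.

```python
# Sketch: pull daily clicks/impressions for a single page-path section
# via the Search Console API, so a drop in one template stands out.
# The key file path and the "/watch/" filter are illustrative.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES)
service = build("searchconsole", "v1", credentials=creds)

body = {
    "startDate": "2020-07-01",
    "endDate": "2020-08-31",
    "dimensions": ["date"],
    "dimensionFilterGroups": [{
        "filters": [{
            "dimension": "page",
            "operator": "contains",
            "expression": "/watch/",  # isolate the TV episode pages
        }]
    }],
}

response = service.searchanalytics().query(
    siteUrl="https://www.fox.com/", body=body).execute()

for row in response.get("rows", []):
    print(row["keys"][0], row["clicks"], row["impressions"])
```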

As mentioned earlier, this response code issue did not affect the UX of the page.

Sure, you can get by with simple Google Search Console “eyeballing.”

But routine crawls can be highly beneficial.

We implemented weekly crawls and haven’t regretted it since – especially after a three-week break-fix.

While it’s not necessary for every SEO team to perform weekly website crawls, you can only benefit from making a monthly crawl commitment – rather than a more relaxed, “as needed” approach.

Featured Image: Adobe Stock

All screenshots taken by author, September 2020
