Major infrastructure changes may reduce indexing performance

Google’s search index is where candidate pages are evaluated and from which search results are drawn, which is why a site’s indexing rate is so crucial. On that account, John Mueller of Google describes why and how big changes to a website’s underlying infrastructure may cause Google to significantly slow down web page crawling. He was responding to a query about decreased crawl rates following major changes to a website’s architecture, such as switching to a new content delivery network (CDN).

Indexing is the process by which Googlebot, Google’s web crawler, visits a web page and downloads it into Google’s search index. The pages eligible for ranking are found in this index, and the index supplies the search results. That is why Google’s indexing rate is so essential: Google must crawl and discover new pages in order to rank them, so anything that prevents Google from crawling a web page is a serious issue.

Google’s Search Advocate Mueller explains why indexing may appear to stutter following such infrastructural adjustments. The site owner who raised the query stated that they had set up URL redirects and switched to a new CDN, and that indexing rates began to decline right after the changes.

The website owner asked: “After redirecting and changing the CDN we’re seeing a drastic drop in crawl rate.”

“Yes! That’s very reasonable. If you change your website’s infrastructure then we will change our crawling. On the one hand, first, to be a little bit conservative and make sure we don’t cause problems and then later on we automatically ramp up again. 

So if you change to a different CDN that’s a significant change in the infrastructure and we recognize that change and we hold off crawling for a while and then we ramp up again if we think everything is fast,” John Mueller explained.
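A site owner who suspects this pattern can verify it in their own server access logs by counting how many requests per day identify themselves as Googlebot, then watching whether the count dips after the migration and ramps back up. Below is a minimal sketch of that check in Python; the sample log lines and the `googlebot_hits_per_day` helper are hypothetical, and a real analysis would read your actual access log and, ideally, also verify that the requests genuinely come from Google’s IP ranges.

```python
import re
from collections import Counter

# Hypothetical access-log lines in Combined Log Format; in practice you
# would read these from your web server's access log file.
LOG_LINES = [
    '66.249.66.1 - - [10/May/2023:06:12:01 +0000] "GET /page-a HTTP/1.1" 200 5123 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.1 - - [10/May/2023:07:40:22 +0000] "GET /page-b HTTP/1.1" 200 4890 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.9 - - [10/May/2023:08:02:10 +0000] "GET /page-a HTTP/1.1" 200 5123 "-" "Mozilla/5.0"',
    '66.249.66.1 - - [11/May/2023:05:55:43 +0000] "GET /page-c HTTP/1.1" 200 3011 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
]

# Extracts the day portion, e.g. "10/May/2023", from the timestamp field.
DATE_RE = re.compile(r"\[(\d{2}/\w{3}/\d{4})")

def googlebot_hits_per_day(lines):
    """Count requests per day whose user agent claims to be Googlebot."""
    counts = Counter()
    for line in lines:
        if "Googlebot" not in line:
            continue  # skip ordinary visitors and other bots
        match = DATE_RE.search(line)
        if match:
            counts[match.group(1)] += 1
    return dict(counts)

print(googlebot_hits_per_day(LOG_LINES))
# {'10/May/2023': 2, '11/May/2023': 1}
```

A sustained drop in these daily counts right after a CDN switch, followed by a gradual recovery, is exactly the conservative hold-off-then-ramp-up behavior Mueller describes.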

A content delivery network (CDN) is a geographically distributed network of servers. The goal is to deliver website content faster by serving it from a server close to the person requesting the page. Mueller goes on to explain how Google discovers pages in the first place.

“We use software known as web crawlers to discover publicly available web pages. Crawlers look at web pages and follow links on those pages, much like you would if you were browsing content on the web. They go from link to link and bring data about those webpages back to Google’s servers.…We take note of key signals — from keywords to website freshness — and we keep track of it all in the Search index,” Mueller concluded. 
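The link-to-link behavior Mueller describes is essentially a graph traversal: visit a page, record it, then queue every link not seen before. The sketch below illustrates that idea with a tiny in-memory stand-in for the web (the `FAKE_WEB` mapping and `crawl` function are illustrative assumptions, not Google's actual implementation, which would fetch pages over HTTP and parse links from HTML).

```python
from collections import deque

# A tiny in-memory stand-in for the web: page URL -> links found on it.
FAKE_WEB = {
    "https://example.com/":  ["https://example.com/a", "https://example.com/b"],
    "https://example.com/a": ["https://example.com/b"],
    "https://example.com/b": ["https://example.com/c"],
    "https://example.com/c": [],
}

def crawl(start_url, get_links):
    """Breadth-first crawl: visit a page, then follow its links,
    keeping a 'seen' set so each URL is processed only once."""
    seen = {start_url}
    queue = deque([start_url])
    visited_order = []
    while queue:
        url = queue.popleft()
        visited_order.append(url)   # a real crawler would download + index here
        for link in get_links(url):
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return visited_order

print(crawl("https://example.com/", FAKE_WEB.get))
# ['https://example.com/', 'https://example.com/a',
#  'https://example.com/b', 'https://example.com/c']
```

The `seen` set is what makes crawling tractable: without it, pages that link to each other would be fetched over and over.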
