Google has your site on a budget. This is not the budget you set for your paid search ads; it is a budget Google controls for your organic search. Unless you are mindful of how Google manages its resources and how that affects your site, you may be squandering the organic search budget Google allots you. If you depend on search traffic from Google, whether organic or paid, you need to consider how you might get more out of what is allotted to you. This may seem like a cynical view, but it is a reality.
Before any traffic can flow to your site from search, your site's pages must be found in Google's index. It is Google's stated goal to index the entire world's content. This means the search giant must continuously crawl the Web to locate new pages and revisit existing pages to keep the index up to date. With billions of pages already indexed and more being created every day, the task is gigantic. Although the crawl is automated and Google's bots are very efficient, they must be supported by extensive computing resources, so Google has had to develop ways to manage them. The result is that every site has a crawl budget: the amount of resources Google will allocate to crawling your site. It is up to you to use that budget efficiently, and there are a number of things you may be doing that waste it. By the way, don't ever expect to know precisely what your actual budget is; it is determined by an algorithm built from a series of complex mathematical formulas.
A small site that seldom changes poses fewer crawling challenges than a very large site with thousands of frequently changing pages. Unfortunately, very large sites often sabotage their crawling efficiency and squander their crawl budget. This can have a substantial economic impact for the site owner. For a large ecommerce site, if areas are not crawled and indexed in a timely fashion, it is as if the site owner turned off the lights and signage for part of the store.
You can obviously squander your budget with an SEO-unfriendly product filtering system that creates duplicate content, or through a clumsy implementation of a new technology such as endless scroll pages. There are other less obvious, but equally insidious, ways. Several years ago, Google introduced Sitemaps in Webmaster Tools, whereby site owners could indicate to Google which pages they wanted crawled. Today, most sites have automated the submission; however, many have taken a "set it and forget it" approach. If this has been your approach, then put a mark on your search task list to revisit your sitemaps and their performance.
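One way to revisit a "set it and forget it" sitemap is to audit it for stale entries. The sketch below, in Python, parses a sitemap and flags URLs whose `<lastmod>` date has not been touched in over a year; the sitemap content, domain, and one-year threshold are illustrative assumptions, not part of any Google tooling.

```python
# A minimal sketch: flag sitemap URLs with stale <lastmod> dates.
# The sitemap XML, example.com URLs, and 365-day cutoff are all
# illustrative assumptions for this audit idea.
import xml.etree.ElementTree as ET
from datetime import date, timedelta

SITEMAP_XML = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.example.com/</loc><lastmod>2024-01-15</lastmod></url>
  <url><loc>https://www.example.com/old-page</loc><lastmod>2019-06-01</lastmod></url>
</urlset>"""

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def stale_urls(xml_text, today, max_age_days=365):
    """Return URLs whose <lastmod> is older than max_age_days."""
    root = ET.fromstring(xml_text)
    cutoff = today - timedelta(days=max_age_days)
    stale = []
    for url in root.findall("sm:url", NS):
        loc = url.findtext("sm:loc", namespaces=NS)
        lastmod = url.findtext("sm:lastmod", namespaces=NS)
        if lastmod and date.fromisoformat(lastmod) < cutoff:
            stale.append(loc)
    return stale

print(stale_urls(SITEMAP_XML, date(2024, 6, 1)))
# → ['https://www.example.com/old-page']
```

A report like this will not tell you why a page went stale, but it gives you a short list of sitemap entries to either refresh or remove so they stop consuming crawl attention.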
Several years ago, Google announced that site speed would figure into its algorithms. It is a simple logical jump to realize that this calculation would include not only how fast you deliver your site to a user's browser, but also how fast Google's crawlers can traverse your site. If you focused on this briefly and then set it aside as finished, revisit it now. Just how fast is your site? If you use a CDN to speed your site to users, do not assume that you have optimized delivery for robots; robots such as Googlebot must be handled as a separate type of user. Any change to your technology or architecture should trigger a review of site speed performance for both users and robots. If you optimize performance so that you do not waste Google's crawling resources, you may well find that your site is fully indexed, and it will most probably rank higher in the search results.
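Treating robots as a separate type of user starts with measuring them separately. The Python sketch below splits response times by user agent so you can compare what crawlers experience against what browsers experience; the log entries and millisecond timings are made-up sample data, and matching on the "Googlebot" substring is a simplification (verifying a real Googlebot requires a reverse DNS lookup, which is omitted here).

```python
# A minimal sketch: compare average response times served to Googlebot
# versus ordinary browsers. The (user_agent, response_time_ms) tuples
# below are illustrative sample data, as if parsed from an access log.
from statistics import mean

LOG = [
    ("Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)", 840),
    ("Mozilla/5.0 (Windows NT 10.0) Chrome/120.0", 310),
    ("Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)", 910),
    ("Mozilla/5.0 (Macintosh) Safari/605.1.15", 290),
]

def avg_time_ms(entries, bot=True):
    """Average response time for crawler traffic (bot=True) or browsers."""
    # Substring match is a stand-in; real verification uses reverse DNS.
    times = [t for ua, t in entries if ("Googlebot" in ua) == bot]
    return mean(times)

print(f"Googlebot avg: {avg_time_ms(LOG)} ms")            # → 875 ms
print(f"Browser avg:   {avg_time_ms(LOG, bot=False)} ms")  # → 300 ms
```

A gap like the one in this sample, where the crawler is served far more slowly than browsers, is exactly the kind of signal a CDN-focused speed review can miss.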