    Google Offers Advice on Optimizing Sites for Web Crawling, Indexing

    Written by Jaikumar Vijayan
    Published January 20, 2017


      Google offered a tutorial of sorts for webmasters trying to figure out how to optimize their sites for web crawling and indexing by Google’s search engine bot.

      Web crawling, as Gary Illyes, a member of Google’s crawling and indexing teams, explained in a blog post this week, is the entry point for sites into Google’s search results.

      “Efficient crawling of a website helps with its indexing in Google Search,” he noted.

      Googlebot, the technology that Google uses to crawl websites for new and updated content to add to its search index, relies on a sophisticated set of algorithms to determine which sites to crawl, how frequently and how many pages to look at during each visit.

      The number of URLs on a website that Googlebot can and wants to crawl, or the crawl budget for a website, depends on two factors: what Google calls the crawl rate limit and crawl demand.

      Googlebot sets so-called crawl rate limits for each site to ensure that its crawling activity does not degrade site performance. The crawl rate limits depend on factors like how quickly a website responds to Googlebot requests.

      For example, if a site slows down or responds with server errors, Googlebot will crawl the site less, Illyes said. Webmasters also have the ability to limit the extent to which Googlebot crawls their site.
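
      Google has not published the algorithm behind those limits, but the feedback loop Illyes describes can be illustrated with a small sketch: a server that answers with HTTP 503 and a Retry-After hint when it is under heavy load, which asks well-behaved crawlers to back off rather than letting responses slow down or fail. The load threshold, handler, and port below are assumptions for illustration only.

        # Hypothetical sketch: under heavy load, answer 503 so crawlers such as
        # Googlebot slow down instead of degrading the site for real users.
        import os
        from http.server import BaseHTTPRequestHandler, HTTPServer

        MAX_LOAD = 4.0  # assumed 1-minute load-average threshold

        class ThrottlingHandler(BaseHTTPRequestHandler):
            def do_GET(self):
                if os.getloadavg()[0] > MAX_LOAD:
                    self.send_response(503)
                    self.send_header("Retry-After", "120")  # suggest when to retry
                    self.end_headers()
                    return
                self.send_response(200)
                self.send_header("Content-Type", "text/html; charset=utf-8")
                self.end_headers()
                self.wfile.write(b"<html><body>Normal page content</body></html>")

        if __name__ == "__main__":
            HTTPServer(("", 8000), ThrottlingHandler).serve_forever()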

      Crawl demand refers to overall Googlebot activity on a site and is determined by the relative popularity of the site and the freshness of its content. Major site-wide events, like site moves, can also trigger an increase in crawl demand so content can be reindexed under the new URLs, Illyes explained.

      For most website owners and webmasters, crawl budgets are not something they need to worry about, Illyes noted. For example, if Googlebot tends to crawl new pages on their website the same day they are published, crawl budget is not something they need to be focusing on, he said. Similarly, owners of sites with fewer than a few thousand URLs have little reason to worry about Googlebot not crawling their sites efficiently.

      “Prioritizing what to crawl, when, and how much resource the server hosting the site can allocate to crawling is more important for bigger sites, or those that auto-generate pages based on URL parameters, for example,” Illyes said.

      Google’s analysis has shown that such sites can benefit from ensuring they don’t have too many low-value-add URLs, he noted.

      For example, sites that offer faceted navigation—like allowing users to filter by price range or color—can often create many combinations of URLs with duplicative content.
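
      To see why faceted navigation inflates a site's URL count, consider a hypothetical listing page with three optional filters. A short sketch (the domain, facets, and values are invented for illustration) shows how quickly the parameter combinations multiply into near-duplicate pages:

        # Illustrative only: a few independent facets multiply into many
        # parameterized URLs that all render essentially the same listing.
        from itertools import product

        BASE = "https://example.com/shoes"  # hypothetical category page
        FACETS = {
            "color": ["", "black", "brown", "red"],      # "" = facet not applied
            "size":  ["", "8", "9", "10", "11"],
            "price": ["", "0-50", "50-100", "100-200"],
        }

        urls = set()
        for values in product(*FACETS.values()):
            params = [f"{k}={v}" for k, v in zip(FACETS, values) if v]
            urls.add(BASE + ("?" + "&".join(params) if params else ""))

        print(len(urls))  # 4 * 5 * 4 = 80 distinct URLs for one category page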

      When search engines crawl identical content through varied URLs, there can be several negative consequences, according to Google. Having multiple URLs pointing to the same content can dilute link popularity and result in the wrong links being presented in search results.
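
      One common way to keep such variants from competing with each other, sketched here as a generic illustration rather than anything Google prescribes, is to normalize parameterized URLs so that every variant maps back to a single canonical address (the parameter names below are assumptions):

        # Hypothetical normalization: drop facet/tracking parameters that only
        # change presentation, so duplicate variants collapse to one URL.
        from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

        NON_CANONICAL_PARAMS = {"color", "size", "price", "sort", "utm_source"}

        def canonicalize(url: str) -> str:
            parts = urlsplit(url)
            kept = sorted((k, v) for k, v in parse_qsl(parts.query)
                          if k not in NON_CANONICAL_PARAMS)
            return urlunsplit((parts.scheme, parts.netloc, parts.path,
                               urlencode(kept), ""))

        variants = [
            "https://example.com/shoes?color=black&size=9",
            "https://example.com/shoes?size=9&color=black&utm_source=mail",
            "https://example.com/shoes",
        ]
        print({canonicalize(u) for u in variants})  # collapses to a single URL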

      Similarly, websites that do a poor job of reporting non-existent pages and other page errors can degrade Googlebot’s crawl coverage and can sometimes result in the best content being overlooked.
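
      A common form of that problem is the soft error: a missing page that answers with a 200 status and an error message in the body instead of a real 404 or 410. A quick, hypothetical self-check (the probe URL is made up) is to request a page that cannot exist and confirm the server reports a genuine error status:

        # Hypothetical check: a URL that should not exist ought to return a
        # real 404/410 status, not a 200 "page not found" page.
        import urllib.error
        import urllib.request

        def status_for(url: str) -> int:
            try:
                with urllib.request.urlopen(url, timeout=10) as resp:
                    return resp.status
            except urllib.error.HTTPError as err:
                return err.code

        probe = "https://example.com/this-page-should-not-exist-xyz123"
        code = status_for(probe)
        if code == 200:
            print(f"Soft error suspected: {probe} returned 200 for a missing page")
        else:
            print(f"{probe} returned {code}")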

      Other factors that can impact Googlebot’s crawl coverage are hacked pages, low-quality and spam content, and so-called infinite spaces, in which very large numbers of links point to little or no new content worth indexing, Illyes said.

      “Wasting server resources on pages like these will drain crawl activity from pages that do actually have value, which may cause a significant delay in discovering great content on a site,” he noted.

      Jaikumar Vijayan
      Vijayan is an award-winning independent journalist and tech content creation specialist covering data security and privacy, business intelligence, big data and data analytics.
