Google Advising Web Developers on JavaScript Page Indexing Issues

Web developers should double-check that JavaScript code in their Web pages isn't negatively affecting page rankings in Google Search results.

Web Developers Need to Check JavaScript Code

Google is advising Web developers to double-check the performance of their Web pages after Google made some recent changes to its page indexing and page rendering systems involving JavaScript.

The issue was raised by Google software engineers Erik Hendriks and Michael Xu and Google Webmaster trends analyst Kazushi Nagayama in a May 23 post on the Google Webmaster Central Blog.

"In 1998 when our servers were running in Susan Wojcicki's garage, we didn't really have to worry about JavaScript or CSS," wrote Hendriks, Xu and Nagayama. "They weren't used much, or, JavaScript was used to make page elements ... blink! A lot has changed since then. The web is full of rich, dynamic, amazing websites that make heavy use of JavaScript."

In the past, Google Search was "only looking at the raw textual content that we'd get in the HTTP response body and didn't really interpret what a typical browser running JavaScript would see," they wrote. "When pages that have valuable content rendered by JavaScript started showing up, we weren't able to let searchers know about it, which is a sad outcome for both searchers and webmasters."
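The gap they describe is easy to see with a hypothetical page whose main content is injected by JavaScript. In this sketch (the page markup and product text are invented for illustration), a crawler that reads only the raw HTTP response body extracts no text at all, while a browser executing the script would see the full content.

```python
import re

# Hypothetical page: the visible content is written into the DOM by
# JavaScript, so the raw HTML body contains only an empty <div>.
raw_html = """
<html>
  <body>
    <div id="content"></div>
    <script>
      document.getElementById('content').textContent =
          'Acme Widget: durable, affordable, in stock now';
    </script>
  </body>
</html>
"""

# Naive text extraction from the raw response body, the way a
# non-rendering crawler would see the page: drop the script, then
# strip the remaining tags.
without_script = re.sub(r'<script>.*?</script>', '', raw_html, flags=re.S)
text_only = re.sub(r'<[^>]+>', '', without_script).strip()

print(repr(text_only))  # -> '' : the JavaScript-rendered content is invisible
```

A crawler that renders the page with JavaScript enabled, as Google describes doing, would instead index the injected product text.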

In response, Google Search began to try to understand pages by executing JavaScript, they wrote. "It's hard to do that at the scale of the current Web, but we decided that it's worth it. We have been gradually improving how we do this for some time. In the past few months, our indexing system has been rendering a substantial number of web pages more like an average user's browser with JavaScript turned on."

That step, however, can affect how pages are then seen by Google's search engine, they wrote.

"Sometimes things don't go perfectly during rendering, which may negatively impact search results for your site," they wrote. Several issues have come up that Webmasters should analyze so that their page rankings are not harmed.

One of the problems, wrote Hendriks, Xu and Nagayama, is that if resources like JavaScript or CSS in separate files are blocked (with robots.txt or similar means) so that Googlebot can't retrieve them, Google's indexing systems won't be able to see the site as an average user would.

"We recommend allowing Googlebot to retrieve JavaScript and CSS so that your content can be indexed better," they wrote. "This is especially important for mobile websites, where external resources like CSS and JavaScript help our algorithms understand that the pages are optimized for mobile."
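In practice, the blocking they describe often comes from a robots.txt rule that disallows the directories holding script and style files. A minimal sketch of a robots.txt that keeps those resources crawlable might look like the following (the directory names `/js/` and `/css/` are hypothetical; substitute the paths your site actually uses):

```
User-agent: Googlebot
# Allow Googlebot to fetch the external resources it needs to render pages
Allow: /js/
Allow: /css/
# Other directories can still be kept out of the index
Disallow: /private/
```

A rule like `Disallow: /js/` would have the opposite effect, preventing Googlebot from fetching the scripts and leaving the rendered page incomplete in Google's indexing systems.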

In addition, if a business Web server "is unable to handle the volume of crawl requests for resources, it may have a negative impact on our capability to render your pages," they wrote. "If you'd like to ensure that your pages can be rendered by Google, make sure your servers are able to handle crawl requests for resources."