Google recently announced that webmasters should allow its bots to crawl their sites' JavaScript, CSS, images and other assets to achieve better search rankings.

In a recent update to the official Google Webmaster Central blog, Google announced that webmasters should allow its bots to access the JavaScript, CSS and image files used by their pages, to improve how those sites are indexed and rendered. Googlebot can now render a webpage much like a modern web browser does, executing JavaScript and applying CSS, so indexing no longer depends on textual content alone. Along with following the other SEO guidelines, you should allow Google's crawlers to render all of your site's content. Disallowing these files in your robots.txt file can hurt search rankings: it prevents Google's algorithms (such as Google Panda) from properly rendering and indexing your pages, so ignoring this addition to Google's technical webmaster guidelines can have a negative effect.
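As a sketch of what this means in practice, the fix is usually just to remove (or avoid adding) Disallow rules for asset directories in your robots.txt. The directory names below are hypothetical examples, not paths from any specific site:

```
# robots.txt — hypothetical example
User-agent: Googlebot
# Do NOT block asset directories like these:
# Disallow: /js/
# Disallow: /css/
# Disallow: /images/
# Explicitly allowing them is also an option:
Allow: /js/
Allow: /css/
Allow: /images/
```

If your robots.txt already has no Disallow rules covering scripts, stylesheets or images, no change is needed.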

What's New In This Update:

Formerly, Googlebot rendered a site much like an old text-only browser, which could not display images or interpret modern web design languages; Lynx is an example of this type of browser. A few months ago, the "Fetch and Render" tool was introduced in Google Webmaster Tools, which renders a site just like a modern web browser such as Firefox or Chrome. These browsers not only display the text content of a webpage but also interpret the JavaScript, CSS and images used in it, which makes pages far more understandable and viewable for humans. Google's crawlers now crawl a site the same way, instead of viewing and indexing text only. This update makes the search bots more effective, because rendering the whole page gives a much more accurate picture of a site than text-only rendering.

How To Check If JavaScript And CSS Indexing Is Allowed On Your Site:

If you are not sure whether the JavaScript, CSS and other assets of your website can be indexed, you can confirm it by using the Fetch and Render option of the "Fetch as Google" tool in Webmaster Tools.


To perform this test, select the Fetch option under the Crawl section, then enter the URL you want to check (leave the field blank to render your homepage). Press the Fetch and Render button and Googlebot will start crawling your webpage. After the crawl is complete you will see a status such as Complete, Partial or Redirected, and you can then click Submit to index. If the status is Complete, there is nothing more to do, because all of the page's content could be fetched and rendered. A Partial response means some of the content was blocked from crawling; this can be confusing at times, but it is often caused by third-party scripts over which you have no control. Google's documentation provides a complete list and description of the Fetch as Google responses.

You can inspect blocked scripts, files and stylesheets by clicking on a result. After a short wait, it displays the actual rendering along with a list of blocked resources, which helps you determine what you are blocking intentionally and what is out of your hands.


What Has Been Changed:

Previously, webmasters were advised to check their websites in a text-only browser such as Lynx to verify their on-page SEO, because the focus was more on text than on other formatting.

After this update, the advice is to check all of the site's content, not just the text.

Tips To Optimize Indexing Of Website:

Google advises following the tips below for optimal indexing of your site.
  • Make sure your pages use common, widely supported technologies so they work in a broad range of popular browsers.
  • Pages that render quickly are indexed more efficiently, so improve your site's speed by eliminating unnecessary downloads and applying other performance optimizations.
  • Optimize how your CSS and JavaScript files are served by merging separate files, minifying them, and configuring your web server to serve them compressed (usually with gzip compression).
  • Rendering full scripts and CSS can be demanding on your servers under certain conditions, so make sure they can handle the additional load generated by Googlebot rendering JavaScript and CSS.
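As an illustration of the compression tip above, here is a minimal configuration sketch for enabling gzip on CSS and JavaScript, assuming nginx is your web server; the values shown are example settings, not site-specific recommendations:

```
# Hypothetical nginx snippet: serve CSS and JS compressed
gzip on;
gzip_types text/css application/javascript;
gzip_min_length 1024;   # skip compressing very small files
gzip_comp_level 5;      # balance CPU cost against file size
```

Apache users can achieve the same effect with mod_deflate; either way, the goal is simply that browsers and Googlebot receive smaller, faster-loading assets.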



About Author

Jari Ullah is a professional blogger, freelance writer and owner of the "HelpITx" blog. He is fond of blogging and loves to learn and share blogging tactics.
