Originally seen at: https://varvy.com/spiderview.html
Varvy is now down; this content is preserved for archival purposes.

Help Google understand your site


“To help Google fully understand your site’s contents, allow all of your site’s assets, such as CSS and JavaScript files, to be crawled.”

– from the Google Webmaster Guidelines

If Google cannot understand your page, it cannot rank it

Google needs a complete picture of your webpages in order to understand them fully.

Test your site for blocked resources using the Google guidelines tool.


Google uses a web crawler named Googlebot to gather information about your website.

Every webmaster should know that a search engine crawler like Googlebot must be able to “crawl” your site in order for it to be included in search engine results.

The way search engine crawlers visit your webpages is determined by a file called robots.txt.
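A robots.txt file lives at the root of your domain and groups its rules by user agent. The directory names below are hypothetical, but a minimal file that allows crawlers to fetch asset folders while blocking a private area might look like this:

```text
# Hypothetical robots.txt at https://example.com/robots.txt
User-agent: Googlebot
Allow: /css/
Allow: /js/

# All other crawlers
User-agent: *
Disallow: /private/
```

A crawler reads the group matching its own user-agent string and obeys only those rules, so a broad `Disallow` under `User-agent: *` can unintentionally hide CSS and JavaScript from every crawler at once.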

Page Resources

Most webpages use CSS and/or JavaScript. These are often external files that are linked to from your HTML.

Google must have access to these resources in order to fully understand your webpage, but often these files are blocked by the robots.txt file.

How to check if your site is following this guideline

Use the Google guidelines tool to see what files (if any) are blocked from Googlebot.
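Since the tool referenced above is no longer online, a rough local check is possible with Python's standard-library robots.txt parser, which reports whether a given user agent may fetch a given URL. The rules and URLs below are hypothetical examples, not Google's actual behavior:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules that block a CSS directory from all crawlers.
rules = """\
User-agent: *
Disallow: /css/
Allow: /js/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Googlebot falls under the wildcard (*) group here, so /css/ assets are blocked.
print(parser.can_fetch("Googlebot", "https://example.com/css/site.css"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/js/app.js"))     # True
```

In practice you would download your live robots.txt (e.g. with `parser.set_url(...)` and `parser.read()`) and test the URLs of the CSS and JavaScript files your pages actually link to.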

Key Concepts:

Make sure that search engine spiders can fetch all of your site's assets, including CSS and JavaScript files.

Check your robots.txt regularly, since a single overly broad rule can hide critical resources from Googlebot.
