One of Google’s most useful webmaster tools is the Sitemap. A Google Sitemap is a simple XML (eXtensible Markup Language) file that tells Google how important each of your webpages is relative to the others, how often each page changes, and when it was last updated. Google uses this information to learn about pages on your site that may not be getting indexed, or that it should crawl more or less frequently. Google’s indexing robot is a busy machine, and if you have pages that only update once a year, it would love to know about it!
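For reference, a minimal Sitemap file might look something like the sketch below. The URLs are placeholders and the exact XML namespace depends on which version of the protocol Google currently accepts, so treat this as an illustration rather than a copy-and-paste template:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want Google to know about -->
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2006-02-01</lastmod>      <!-- most recent update -->
    <changefreq>daily</changefreq>     <!-- how often the page changes -->
    <priority>1.0</priority>           <!-- relative importance, 0.0 to 1.0 -->
  </url>
  <url>
    <loc>http://www.example.com/annual-report.html</loc>
    <changefreq>yearly</changefreq>    <!-- the "only updates once a year" case -->
    <priority>0.3</priority>
  </url>
</urlset>
```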
In addition to this basic use of the Sitemaps tool, Google provides an interface so that webmasters (or site owners, if they choose) can see how the search engine is currently viewing their site. The Sitemaps interface reports the errors its robots encounter while crawling your site (invalid URLs (Uniform Resource Locators), missing pages, and so on), along with basic page analysis that gives you a sense of what proportion of your pages have high PageRank, and other small pieces of information.
Today, the Inside Google Sitemaps blog announced some great new features for Google Sitemaps.
The first of these shows you Google’s view of your robots.txt file. Don’t know what that is? Put simply, a robots.txt file is a plain-text command file that lives on your server and should be the first thing a search engine robot requests. It tells the robot which pages and directories it is allowed to examine. This is a great tool, because as a general rule there are files you really don’t want robots browsing. The trouble has always been knowing for sure that you’ve constructed the file correctly; the only test was whether the robot got through! With this new feature, you can easily confirm that the pages you want indexed are allowed and that the pages you’ve forbidden are truly blocked, and you can also test individual URLs to see whether they are permitted under your current settings.
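To make that concrete, here is a sketch of a simple robots.txt file. The directory paths are placeholders for whatever you want to keep robots out of on your own site:

```
# Let every robot crawl the site, except for a few private areas
User-agent: *
Disallow: /cgi-bin/
Disallow: /private/

# Give Google's crawler its own rule set (the more specific record applies to it)
User-agent: Googlebot
Disallow: /drafts/
```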
That’s a great new development, of course! But the other two additions are even more directly useful for good SEO. Google Sitemaps now provides a list of the most common terms in your site’s content and in the external links pointing to it, which is great information for your keyword analysis. The last new feature is very simple: Google tells you which single page on your site has had the highest PageRank over the last three months. This is a great starting point for analyzing which of the content you’ve created really works to bring in traffic.
These developments in Google Sitemaps are really great news – and I for one am looking forward to what we at SEO Literacy Consultants can do for your site using this service.