
Guide to Google Webmaster Tools

Valuable search-engine-optimization tools that provide unique data tend to be expensive. Tools with limited data sets or limited capabilities tend to be free. Google Webmaster Tools bucks that trend by offering — for free — a unique data set and features that can’t be found in any other tool.

I’ve previously addressed reasons to register for Google Webmaster Tools, in “Top 5 Reasons to Use Google Webmaster Tools.” The purpose of this article is to explain how to use this amazing free tool set to improve your site’s search engine optimization.

Getting Set Up

Adding Sites to Google Webmaster Tools.

Before Google opens the doors to its treasure chest, you have to prove you own the site by going through a verification process. If you don’t already have a Google account to access Webmaster Tools, create one. Then simply click on “Add a Site” on the Webmaster Tools home page to start the process. Every domain, subdomain and TLD — top level domain, the part after the period, such as .com, .net, .org — needs to be added and verified individually. For example, practicalecommerce.com, www.practicalecommerce.com, and developer.practicalecommerce.com are all separate sites in Google Webmaster Tools’ eyes, regardless of whether they contain separate content.

Once added, each site can be verified in several different ways. Click the “Verify” button next to each site, choose a verification method, and follow the directions. Meta tag and file upload are the two most common methods, and both require access to post or modify files at the root of the site.
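
For reference, the meta tag method comes down to pasting a short tag into the head section of the site’s home page. The snippet below is only a template; the content value is a placeholder for the unique token Google generates on the verification screen.

    <meta name="google-site-verification" content="YOUR-UNIQUE-TOKEN-FROM-GOOGLE" />

The file upload method works the same way, except that Google supplies a specially named HTML file to place at the root of the site instead of a tag.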

Note that just “adding a site” gives zero access to data. The verification step is critical to actually using Webmaster Tools.

Tips and Tricks

The first thing to remember is that everything you see is temporary and most data is shown on a rolling 30-day basis. You can change date ranges on some charts, but the range only allows about 35 days. Consequently, you cannot compare data month over month or year over year — or compare data from before and after an algorithm change, for example — unless you’re regularly saving the data in Google Webmaster Tools to your own hard drive. For my SEO clients, I do this monthly in tandem with my regular monthly reporting.
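
If you want to build that history without much effort, even a small script can handle the filing. The Python sketch below simply copies whatever CSV exports you have already downloaded by hand into a folder named for the current month; the folder names and paths are hypothetical, so adjust them to your own setup.

    # Minimal sketch: file manually downloaded Webmaster Tools CSV exports
    # into a month-stamped archive folder so month-over-month and
    # year-over-year comparisons stay possible. Paths are hypothetical.
    import shutil
    from datetime import date
    from pathlib import Path

    downloads = Path("downloads")              # where the exports were saved
    archive = Path("gwt-archive") / date.today().strftime("%Y-%m")
    archive.mkdir(parents=True, exist_ok=True)

    for csv_file in downloads.glob("*.csv"):
        shutil.copy2(csv_file, archive / csv_file.name)
        print("Archived", csv_file.name, "to", archive)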

Most charts have a download button, but some charts have two. Buttons that say “Download this Table” will download all of the available data for that report in a CSV format. Buttons that say “Download Chart Data” will download only the data shown in the chart. Take a screengrab of charts that don’t allow downloads, so at least the visual trends will be available for comparison even if the data can’t be manipulated.
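
Once the exports are archived, comparing them outside of Webmaster Tools is straightforward. The sketch below assumes two saved “Search Queries” exports with “Query” and “Clicks” columns; the column names and file paths are assumptions, so check them against your own files. It prints the queries whose click counts changed the most between the two exports.

    # Sketch: compare two archived "Search Queries" exports month over month.
    # Column names ("Query", "Clicks") and file paths are assumptions.
    import csv

    def to_int(value):
        digits = "".join(ch for ch in value if ch.isdigit())  # handles "1,234" or "<10"
        return int(digits) if digits else 0

    def clicks_by_query(path):
        with open(path, newline="", encoding="utf-8") as f:
            return {row["Query"]: to_int(row["Clicks"]) for row in csv.DictReader(f)}

    last_month = clicks_by_query("gwt-archive/2012-07/search_queries.csv")
    this_month = clicks_by_query("gwt-archive/2012-08/search_queries.csv")

    changes = {q: this_month.get(q, 0) - last_month.get(q, 0)
               for q in set(last_month) | set(this_month)}

    for query, delta in sorted(changes.items(), key=lambda kv: abs(kv[1]), reverse=True)[:20]:
        print(f"{delta:+6d}  {query}")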

Google developed its Webmaster Tools to use the same navigational cues as a regular website. If something is blue, you can click it and get more detail. Mouse over charts to see individual data points. Click the question marks for definitions and help.

Google Webmaster Tools Dashboard

Google Webmaster Tools’ dashboard for Practical Ecommerce.

Once verified and logged in, Webmaster Tools’ home page features a dashboard of messages, crawl, traffic, and sitemap data. Pay careful attention to the messages section. If Google has detected malware, suspicious activity, a lot of new redirects or other types of behavior consistent with a site that has been hacked or is violating Webmaster guidelines, Google may warn you in the messages section. If the warning is expected, such as an alert about high numbers of redirects when you just changed URLs on a section of the site, then don’t worry about the message. If it’s unexpected, however, definitely look into the root cause of the message.

Google also places “Crawl Errors” on the dashboard. Google reports five different kinds of URL errors and three types of site errors. Don’t get too hung up on a few errors — there are often good reasons for pages to give a 404 error, and occasional 500 errors are commonplace. However, if error numbers spike unexpectedly, investigate the cause with your IT team.
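
When a spike does appear, a quick spot check can confirm whether the reported URLs still fail before anyone escalates to IT. The Python sketch below assumes you have pasted the error URLs into a plain text file, one per line; the file name is hypothetical.

    # Minimal sketch: re-check the HTTP status of URLs copied out of the
    # "Crawl Errors" report. The input file name is hypothetical.
    import urllib.request
    import urllib.error

    with open("crawl_error_urls.txt") as f:
        urls = [line.strip() for line in f if line.strip()]

    for url in urls:
        try:
            status = urllib.request.urlopen(url, timeout=10).getcode()
        except urllib.error.HTTPError as e:
            status = e.code
        except urllib.error.URLError as e:
            status = f"unreachable ({e.reason})"
        print(status, url)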

The “Search Queries” report is one of the biggest reasons to register for Google Webmaster Tools. The dashboard offers a snapshot of the report, showing organic queries, rankings, impressions and clicks over the last 30 days. Clicking through the snapshot will reveal a more detailed report. This is a fantastic report and will likely be your first stop on a regular basis.

Lastly, the dashboard displays some basic stats about XML sitemaps, including the number of URLs submitted via the XML sitemaps, the number of errors found at those URLs and the number of indexed pages. This section will be blank if you have not submitted an XML sitemap.
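
If you have not submitted one yet, the sitemap format itself is simple. A minimal sitemap.xml looks roughly like the example below, with placeholder URLs, and is submitted to Google on the “Sitemaps” page.

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2012-08-01</lastmod>
      </url>
      <url>
        <loc>http://www.example.com/products/widget</loc>
      </url>
    </urlset>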

Configuration Tools Menu

The “Configuration” menu offers some basic site management settings, such as URL settings and user privileges. The “Settings” page lets users set the site’s geographic target, preferred domain and crawl rate. The “Sitelinks” page allows users to request removal of sitelinks (the links displayed in the search results under a site’s primary result) from certain pages. The “URL Parameters” page contains an interface to request that specific parameters, such as tracking parameters, be ignored by search engines. If a site is moving to a new domain, the “Change of Address” page offers information on how best to accomplish this and a handy form to indicate where the site has moved.

Site owners can also manage user privileges from the “Configuration” menu. The “Users” page allows site owners to add new users to Google Webmaster Tools and designate users as having restricted or full access. Think of restricted access as read-only access: restricted users can see the data but can’t change settings, so they can’t accidentally harm the site. Site owners can also assign “Associates,” users with permission to access other Google properties like YouTube, from Google Webmaster Tools.

Health Tools Menu

Google’s statistics on your site’s health can be found in the “Health” menu. The crawl error and malware reports were already discussed in the dashboard section. The “Health” menu also contains a “Crawl Stats” page that shows charts of the number of pages crawled and kilobytes of data downloaded per day, as well as the amount of time spent downloading pages. Watch these charts for unexpected spikes or valleys that may indicate server load issues or the discovery of new sections of content.

The “Blocked URLs” page is a wealth of information regarding the site’s robots.txt file and the pages it allows and forbids crawlers to access. The first section shows the number of pages blocked by the robots.txt file; watch it for sudden increases or decreases. Below that, however, is a great tool for editing and testing the robots.txt file in a sandbox state before pushing it live. Make any edits desired in the first box to disallow or allow files or folders from being crawled, and then enter URLs to test against the robots.txt file in the second box. The tool will check the URLs entered against the commands in the robots.txt file and report whether each was allowed or disallowed, and which line in the robots.txt file triggered the allow or disallow. Before posting any changes to a robots.txt file, it’s an excellent idea to test the file with this tool first.
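
The same pre-flight check can be scripted if you prefer to test outside the browser. The Python sketch below uses the standard library’s robots.txt parser to run placeholder URLs against a draft set of rules. It will not tell you which line matched, and its handling of edge cases can differ slightly from Googlebot’s, so treat the in-tool tester as the final word.

    # Minimal sketch: test example URLs against a draft robots.txt before
    # posting it live. The rules and URLs below are placeholders.
    import urllib.robotparser

    draft_rules = [
        "User-agent: *",
        "Disallow: /checkout/",
        "Disallow: /search",
    ]

    parser = urllib.robotparser.RobotFileParser()
    parser.parse(draft_rules)

    for url in ["http://www.example.com/products/widget",
                "http://www.example.com/checkout/step1",
                "http://www.example.com/search?q=widgets"]:
        verdict = "allowed" if parser.can_fetch("Googlebot", url) else "blocked"
        print(verdict, url)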

Google also allows users to “Fetch as Googlebot,” meaning that any URL pasted into the form will be crawled by the specified bot and displayed in this tool. This tool is useful when checking whether the site is treating users differently than search engine crawlers, and determining what the crawlers can index. It’s also useful for triggering a crawl or requesting indexation of individual pages, such as when a critical new page or redirect has just gone live and you want Google to discover it immediately.
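
A rough, do-it-yourself version of that comparison is to request a page once with a normal browser user-agent and once with a Googlebot-style user-agent, then compare the responses. The Python sketch below does that against a placeholder URL; the real “Fetch as Googlebot” tool remains authoritative because it crawls from Google’s own infrastructure.

    # Minimal sketch: fetch one page with a browser-like and a Googlebot-like
    # user-agent and report whether the responses differ. The URL is a placeholder.
    import urllib.request

    URL = "http://www.example.com/some-page"
    AGENTS = {
        "browser": "Mozilla/5.0",
        "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
    }

    bodies = {}
    for name, agent in AGENTS.items():
        request = urllib.request.Request(URL, headers={"User-Agent": agent})
        bodies[name] = urllib.request.urlopen(request, timeout=10).read()

    if bodies["browser"] == bodies["googlebot"]:
        print("Responses are identical.")
    else:
        print("Responses differ; worth a closer look at how the site treats crawlers.")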

Google Webmaster Tools “Index Status” page.

The “Index Status” page charts a year’s worth of indexation data. The blue line indicates the total number of indexed pages, while the red line shows the number of pages crawled. The green line shows pages that redirect or that the site owner has indicated should not be indexed. The orange line indicates the number of pages blocked by the robots.txt file. Logically, most sites will see a gradual upward trend as more pages of content or products are added. However, it may be desirable to see a decrease in pages indexed at times as well, especially when the site owner is actively focusing on removing duplicate content from the index.

Traffic Tools Menu

The “Traffic” menu contains the most valuable collection of Webmaster Tools that Google offers, including data on rankings, impressions, visits, internal and external links, and +1 activity. I addressed the “Search Queries” report in the dashboard section above.

Google Webmaster Tools “Links to Your Site” report.

Google Webmaster Tools’ linking reports are the largest source of accurate, free linking information available. Google will provide up to 1,000 rows of data for verified sites. Other tools like Majestic SEO and SEOmoz’s Open Site Explorer will report more rows of data, but do not report the links Googlebot or Bingbot have discovered. Instead, they rely on their own crawlers to independently index the Internet and report links.

Google’s “Links to Your Site” report shows the 1,000 domains that have the most links to your verified site. The “Your Most Linked Content” report shows which pages on your site have attracted the most links. And the “How Your Data Is Linked” report shows anchor text used to link to your site, though it neglects to include any numerical data to indicate which anchor text is used most frequently.

The “Internal Links” page identifies which pages on your site are most frequently linked from your own site. This can be a very interesting report because internal links give search engines clues to which pages the site’s owners consider most important. If many internal links are going to pages that are low priority and few are going to pages that are higher priority, the navigation may deserve another look.

The last page in the traffic menu is “+1 Reports.” Unless your site shares the same heavily male and tech-savvy audience as Google+, this report will likely be fairly sparse. As Google+ usage ramps up — see “Google+: The Beginning of a Revolution?,” my article on that topic — and when you decide to market actively on Google+, however, this report will likely become more and more useful. Separate reports detail the impact that +1 shares have had on search results for your site, the number of +1s on each page of the site, and the characteristics of the audience affected by these social search results.

Optimization Tools Menu

The “Optimization” menu houses tools and reports to help webmasters optimize their sites. I discussed the “Sitemaps” report in the dashboard section above.

When used with caution, the “Remove URLs” tool can be very effective at removing sensitive content, the kind that never should have been posted in the first place, from Google’s index. For less urgent URL removal, use the robots.txt file or another method to deindex pages instead.
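
For those less urgent cases, the usual options are a robots.txt rule, which stops crawling, or a robots meta tag, which asks engines to drop a page that is already in the index. Placeholder examples of each:

    # In robots.txt, to block crawling of a folder:
    User-agent: *
    Disallow: /old-promo/

    <!-- In the page's head section, to request removal from the index: -->
    <meta name="robots" content="noindex" />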

The “HTML Improvements” page lists the basic updates that should be made to a site’s title tags and meta descriptions. For ecommerce sites, this page may be frustrating because the content management system may not have the ability to optimize some pages on the site. Do what you can, note the ones you can’t resolve due to technology constraints, and move on.

The “Content Keywords” report shows the relative frequency and significance of keyword usage across the site. Frankly, other non-Google tools — like the Ranks Page Analyzer — give a clearer picture of keyword density and prominence, but the “Content Keywords” report is an interesting high-level approach. There’s also an “Other Resources” page in the “Optimization” menu with links to other tools webmasters will find useful, such as the Rich Snippet Testing Tool, Google Places for local search optimization, and the Google Merchant Center.

Labs Tools Menu

Lastly, we come to the “Labs” menu. These are experimental or beta tools that Google is testing before releasing them as part of the standard toolset. For example, if your site contains article or blog content by a verified author coded with rel=author markup, Google shows “Author Stats” such as rankings, impressions and clicks for those pages. Other labs tools include Custom Search, Google Instant Previews, and Site Performance.
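
The authorship markup itself is just a link from the article to the author’s Google+ profile, with the profile linking back to the site as a contributor. The profile URL below is a placeholder.

    <a href="https://plus.google.com/YOUR-PROFILE-ID" rel="author">Author Name</a>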

The best way to learn more about Google Webmaster Tools is to dive in and play with it. It’s free, relatively easy to verify, and packed with help topics and tips to speed you on your way.


Jill Kocher
