“Ask an Expert” is an occasional feature where we ask ecommerce experts questions from online merchants. For this installment, we address a question about getting pages from an ecommerce site fully indexed by Google. It comes from Robert Mobberley, CEO of Performance Motorcare Products, a U.K.-based automotive parts and accessories retailer.
For the answer, we turn to Stephan Spencer. He is vice president of SEO strategies at Covario, an SEO technology firm, and a long-time contributor to Practical eCommerce on the subject of search engines and search engine optimization.
If you’d like to submit a question, email Kate Monteith, staff writer, at firstname.lastname@example.org and we’ll attempt to address it.
Robert Mobberley: “We have spent a lot of time and effort on our organic SEO and are getting good results on the keywords that have been indexed. However, we are aware that a large number of our pages have not yet been indexed by Google. What, in order of priority, are the top five (or more) key actions our SEO manager should take to ensure these pages are indexed by Google as soon as possible?”
Stephan Spencer: “There isn’t a one-size-fits-all answer to your problem. Incomplete indexation is a very complex issue that requires further investigation and analysis before a course of action can be recommended. How much PageRank have you garnered from other sites linking to you? And how big is your site (i.e. number of pages)? The bigger the site, the more PageRank you need, and the more efficient you need to be in spreading that PageRank around.
“A page that’s 10 clicks deep into the site probably won’t fare well in Google. If some of your pages have an overly complex URL structure (too many dynamic parameters, or too long), that could be impeding their inclusion in Google’s index. If there are too many redirects in a row, that could be preventing a page from being indexed. Or if you’re spamming. Or if you have an inordinate amount of duplicate content. This isn’t as simple as making a comprehensive list of URLs and dropping it into an XML Sitemap file. That’s treating the symptom instead of curing what ails your site.
“In your specific case I believe it’s a combination of factors. I wonder if the keyword-stuffing in your meta tags could be earning you a penalty (see https://www.performancemotorcare.com/acatalog/info3PMC00030.html). You only have 75 domains linking to your site (according to the SEOmoz Linkscape tool), so your site, in my opinion, doesn’t look terribly important.
“You have canonicalization issues. Specifically, some pages in Google’s index are from www.performancemotorcare.com, some from performancemotorcare.com, and some from https://www.performancemotorcare.com. This creates duplicate content and dilutes your PageRank across the variants. It can be solved with 301 redirects or with the canonical tag. None of these issues on its own is the ‘smoking gun,’ but they could all be contributing factors.
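As a rough illustration of the 301-redirect approach Spencer mentions, the following is a hypothetical .htaccess snippet (assuming an Apache server with mod_rewrite enabled, which the article does not confirm) that would consolidate the bare domain onto the www hostname:

```apache
# Hypothetical example: redirect performancemotorcare.com
# to www.performancemotorcare.com with a permanent (301) redirect,
# so search engines index only one hostname variant.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^performancemotorcare\.com$ [NC]
RewriteRule ^(.*)$ http://www.performancemotorcare.com/$1 [R=301,L]
```

Alternatively, each page could declare its preferred URL with a canonical link element in its head section, e.g. `<link rel="canonical" href="http://www.performancemotorcare.com/example-page.html">` (the path here is illustrative, not an actual page on the site).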
“Could this possibly be as simple as a mistake in your robots.txt file? I found a disallow directive (‘Disallow: /cgi-bin/’) in your robots.txt. Yet you feature a number of ‘best selling’ and ‘new’ products on the home page; these URLs all contain ‘cgi-bin’ so they are all being disallowed. I found that very curious. Is that intentional?”
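Based on Spencer's description, the relevant portion of the site's robots.txt would look something like this (a reconstruction, not the verbatim file):

```text
# Blocks every URL whose path begins with /cgi-bin/ --
# including, per the article, the 'best selling' and 'new'
# product URLs featured on the home page.
User-agent: *
Disallow: /cgi-bin/
```

If the blocking is unintentional, removing the Disallow line (or narrowing it to only the non-public paths that genuinely need blocking) would allow Googlebot to crawl those product URLs again.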