SEO: Managing Faceted Search

Faceted site search can help ecommerce shoppers quickly find the products they are looking for. But faceted search can shatter your search engine optimization efforts. What can you do, then, to manage faceted search so that it benefits your visitors without harming your SEO?

Nonexistent Navigation, Pages, or Optimization

If pages don’t exist or aren’t optimal for search engine crawlers, the solution is to create something new that can be crawled and ranked for critical keyword phrases. Because redesigning templates, architecture, and navigation has so many ramifications for user experience, design, and development, the fastest solution is usually to create new work-around content.

  • If the page doesn’t exist or can’t be optimized. Let’s say you sell beauty products and you’ve discovered that “oily hair shampoo” is a search phrase you’d like to win rankings, traffic, and sales for. Your site’s architecture doesn’t have a category page featuring shampoos for oily hair. Instead, the site breaks into gender-based categories and then hair types. There simply is no page that can be optimized for shampoo for oily hair regardless of gender. In other cases, the page might exist but can’t be optimized separately from its parent category page.

Either way, the solution is to create a page that can be optimized. Develop a content page, write useful copy that speaks to the topic, and merchandise it with the relevant products. These content elements are necessary to give the page relevance and value for SEO, as well as to convert customers who land there.

  • If the navigation doesn’t exist. If the faceted navigation is not crawlable, or if creating new pages as described above has left you with orphaned content that can’t be indexed, you need to create a crawl path — links — to enable the crawl and pass link authority into the new pages. The quick solution here is to include a series of “frequently searched” links in the footer or on an HTML sitemap page, as in the example below. The keywords would link to the pages that lack crawlable links, or even to internal search pages if your internal search is strong and can be optimized.

The way the links are designed and implemented is important because it’s easy to cross the line into link spam and keyword stuffing, which can trigger algorithmic penalties and lead to poorer SEO.
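To make that concrete, here is a rough sketch of what such a footer block might look like. The labels and URLs are hypothetical placeholders, not a prescription; the point is plain, crawlable anchor links to a handful of genuinely popular searches.

    <!-- Hypothetical "frequently searched" footer block; URLs and labels are illustrative only -->
    <div class="footer-popular-searches">
      <h4>Frequently Searched</h4>
      <ul>
        <li><a href="/hair-care/oily-hair-shampoo/">Shampoo for Oily Hair</a></li>
        <li><a href="/hair-care/color-safe-shampoo/">Color-Safe Shampoo</a></li>
        <li><a href="/skin-care/fragrance-free-moisturizer/">Fragrance-Free Moisturizer</a></li>
      </ul>
    </div>

A short, stable list like this reads as navigation; dozens of keyword-stuffed links would cross the line described above.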

  • If the pages exist but can’t be crawled. You can also be sure to include them in the XML sitemap to prompt the indexation required for ranking. However, if the XML sitemap is the only path to discovering these URLs, they are effectively orphaned pages and won’t have enough link authority to rank once they are indexed. Use this tactic in addition to the other solutions rather than relying on XML sitemaps alone to solve the problem.
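If you do add the new pages to the XML sitemap, each one is simply listed as a <url> entry in the sitemap file. A minimal sketch, with a placeholder domain and path:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- One <url> entry per page you want discovered; the URL below is hypothetical -->
      <url>
        <loc>https://www.example.com/hair-care/oily-hair-shampoo/</loc>
      </url>
    </urlset>

Remember that listing a URL here only invites the crawl; it does not confer the link authority the page still needs from real internal links.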

The ideal longer-term solution is to work with your developers and platform support representatives to resolve the technical barriers that prevent the navigation from being crawled. The short-term work-arounds above are manual and limited in scale, so they will not be able to address the large number of keyword ranking opportunities that your navigation cuts off. In some cases it may make sense to consider a big data content solution such as BloomReach, or migrating to a new platform that can be implemented with SEO benefits baked in.

Duplicate Content

When faceted search produces piles of duplicate content, it splits link authority between multiple versions of the same page of content, creates self-competition for rankings, and may result in parts of the site not being crawled or indexed. I’ve addressed this here previously, at “SEO: When Product Facets and Filters Fail.”

In the short term, a robots.txt disallow or a meta robots noindex tag will take care of duplicate indexation. A disallow tells crawlers not to crawl the specified pages or folders at all; a meta robots noindex tag lets them crawl the page but tells them not to include it in the index.
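As a rough illustration, a disallow rule and a noindex tag might look like the following. The /*?filter= pattern and the page it sits on are hypothetical; match the rule to however your platform actually builds faceted URLs.

    # robots.txt: block crawling of faceted URLs built with a hypothetical "filter" parameter
    User-agent: *
    Disallow: /*?filter=

    <!-- Meta robots tag in the <head> of a duplicate faceted page: crawlable, but kept out of the index -->
    <meta name="robots" content="noindex, follow">

Use one or the other for a given set of URLs; a page blocked in robots.txt may never be recrawled, so a noindex tag on it can go unseen.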

Disallows and noindex tags only take care of the over-indexation issue, though; they do not address the larger issue of split link authority. In fact, implementing them will render the fixes below less effective, because canonical tags and redirects only work when search engines can crawl the duplicate URLs and are not receiving conflicting signals about them.

Longer term, to mend split link authority, you need to consolidate the authority from the duplicate pages into a single canonical URL. Ideally you’d deal with all duplicate content issues with 301 redirects, but resources and customer experience needs can make that course of action undesirable. If the URL variants produce display differences that are useful to customers, a canonical tag can be used to request that Google pass the link authority from the duplicate URL to the canonical URL. If a URL is a pure duplicate and doesn’t need to exist for the customer experience, 301 redirect it to the canonical URL.
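For example, each filtered or sorted variant of a category page would reference the main page from its <head>. The URLs below are placeholders for whatever your platform generates:

    <!-- On a hypothetical variant such as https://www.example.com/hair-care/oily-hair-shampoo/?sort=price -->
    <link rel="canonical" href="https://www.example.com/hair-care/oily-hair-shampoo/">

A 301 redirect, by contrast, is configured on the server rather than in the page, and sends both shoppers and crawlers straight to the canonical URL.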

Timing Is Everything

SEO takes time. Start today and implement as soon as possible. Planning and design are critical for established brands, but remember that the longer you work on a solution, the longer it will be before your SEO improves.

Before rankings can change to drive increased traffic, crawlers have to crawl the changed content, evaluate it algorithmically against the rest of their indices, and adjust your rankings accordingly. For major brands with content that changes frequently, the crawlers may visit multiple times a day. Static sites may wait a week or more for the next crawl. Make sure to factor the search engines’ timelines into your SEO launch plans.

For longer term SEO projects, start planning today so that your next big selling season finds your site in fighting shape.

Jill Kocher Brown