Google’s so-called “Farmer/Panda” algorithm update rewards original, high-quality site content. Aggregators suffer under Panda, as do ecommerce merchants who rely on generic product descriptions that are not unique to their sites, or who offer poor content generally. We’ve addressed the Farmer/Panda update here previously, at “Google’s ‘Farmer’ Algorithm and What It Means for Ecommerce SEO,” and with my three-part case study on one store’s struggles with that algorithm change.
The importance of the Farmer/Panda update has spawned discussions on structuring a site for search engine bots, versus making it accessible for human consumers. That discussion is the focus of a session — “Panda vs. Human: Advance eCommerce SEO & UX” — at the upcoming Search Engine Strategies conference in Chicago on Nov. 14 – 18. The session is potentially interesting, as it reminds us that products are purchased by humans, not search engine robots. But without the search engines, humans likely wouldn’t find a retailer’s products to begin with.
With that dilemma in mind, I corresponded recently with the two moderators of that SES session. Jaimie Sirovich is president of SEO Egghead, Inc., a web development firm focused on search-engine-optimized sites. Greg Nudelman is CEO at Design Caffeine Inc., a web design and user-interface firm.
SEO vs. User Experience
Jill Kocher: Google’s series of Panda algorithm updates focused primarily on site quality signals, such as unique, high-quality content. How do user experience strategies help send higher site quality signals to the search engines?
Jaimie Sirovich: “One of the most important, yet oft-ignored UX [user experience] factors when viewing a web page is its trustworthiness. Having links only for the sake of internal linking structure ruins trustworthiness. Even though one might attain more traffic, the conversion percentage and long-term loyalty both take a dive, because customers exhibit less trust and become wary.
“Since the algorithm is based on machine learning, there is no simple checklist of Panda factors. [Machine-learning algorithms are forms of artificial intelligence that enable the search engines to generate ranking rules or predictions from observable data, such as click-through rates, amount of content, linking factors, and keyword relevance.] From our experience browsing patents, it is apparent that Google prefers machine-learning techniques to simpler algorithms like PageRank. They [machine-learning factors] are more difficult to game.”
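To illustrate what “no simple checklist” means, here is a deliberately toy sketch — every feature name and weight below is invented for illustration and has nothing to do with Google’s actual system — of how a machine-learned score blends many weak signals instead of applying fixed rules:

```typescript
// Hypothetical page-quality features; all names and weights are
// invented for illustration, not Google's actual signals.
interface PageSignals {
  uniqueContentRatio: number; // 0..1, share of text not found elsewhere
  clickThroughRate: number;   // 0..1, observed CTR from search results
  inboundLinkScore: number;   // 0..1, normalized link authority
  keywordRelevance: number;   // 0..1, query/content match
}

// Weights like these would be learned from labeled examples. No single
// factor decides the outcome, and the weights shift as training data
// changes -- which is why there is no stable checklist to game.
const weights = {
  uniqueContentRatio: 2.1,
  clickThroughRate: 1.4,
  inboundLinkScore: 0.9,
  keywordRelevance: 0.7,
};
const bias = -2.5;

function qualityScore(page: PageSignals): number {
  const z =
    bias +
    weights.uniqueContentRatio * page.uniqueContentRatio +
    weights.clickThroughRate * page.clickThroughRate +
    weights.inboundLinkScore * page.inboundLinkScore +
    weights.keywordRelevance * page.keywordRelevance;
  return 1 / (1 + Math.exp(-z)); // probability the page is "high quality"
}
```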
Kocher: User-friendly faceted site navigation — which filters variables like color, size and price — can generate thousands of repetitive product listing pages. Even when URLs are appropriately controlled with facet ordering and breadcrumb tactics, millions of pages can be spawned instantly. Won’t these faceted navigation landing pages result in just more thin content, which could trigger the Panda effect and hurt search rankings?
Greg Nudelman: “Absolutely. However, a certain amount of this is tolerated by the various search engines. Google has also introduced a new feature in its ‘URL Parameters’ tool within Webmaster Tools that lets webmasters indicate which parameters merely filter results. One of the biggest mistakes we see web programmers make is to rewrite the URLs for all faceted search parameters, as if rewriting URLs magically solves all the spidering problems. Nothing could be further from the truth. The Parameters tool helps the robots understand what they should spider.
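As a concrete illustration of the “facet ordering” tactic from my question, here is a minimal sketch — my own example, with invented facet names — that normalizes facet parameters into one canonical order, so equivalent filter combinations always resolve to a single URL:

```typescript
// Minimal sketch: sort facet query parameters into a fixed canonical
// order so the same filtered result set shares one URL. The facet
// names ("gender", "color", "size") are examples only.
const FACET_ORDER = ["gender", "color", "size"];

function canonicalFacetUrl(
  path: string,
  params: Record<string, string>
): string {
  const query = FACET_ORDER
    .filter((facet) => facet in params)
    .map((facet) => `${facet}=${encodeURIComponent(params[facet])}`)
    .join("&");
  return query ? `${path}?${query}` : path;
}

// Both parameter orderings resolve to the same canonical URL:
canonicalFacetUrl("/boots", { size: "9", color: "red" }); // "/boots?color=red&size=9"
canonicalFacetUrl("/boots", { color: "red", size: "9" }); // "/boots?color=red&size=9"
```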
“I’ve actually seen a number of search marketers say that perhaps faceted navigation pages shouldn’t be indexed at all. This is nonsense. Websites with faceted search tend to phase out deeply nested category trees in favor of shallower trees for the fundamental decisions, with facets handling the details. This works very well for humans in every usability study I’ve read. The unfortunate result, though — if you don’t let the ‘bots spider at least some of the facets — is a major step backwards in your organic search marketing capabilities.
“By default, we recommend not letting robots spider more than one facet at a time — unless the page is a designated landing page — to prevent spider traps. But that’s exactly the behavior the URL Parameters tool might produce anyway, if the webmaster indicates that a particular parameter filters results.”
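For illustration, a site could enforce the default Nudelman describes roughly as follows — a sketch of mine with a hypothetical landing-page whitelist, not code from either firm:

```typescript
// Sketch of the "one facet at a time" default: pages filtering on two
// or more facets receive a noindex,nofollow robots meta tag unless
// they are whitelisted landing pages. The whitelist is hypothetical.
const LANDING_PAGES = new Set(["/boots?color=red&gender=womens"]);

function robotsMetaFor(canonicalUrl: string, activeFacets: string[]): string {
  if (activeFacets.length <= 1 || LANDING_PAGES.has(canonicalUrl)) {
    return '<meta name="robots" content="index, follow">';
  }
  return '<meta name="robots" content="noindex, nofollow">';
}
```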
Kocher: Interesting, but I disagree. For ecommerce sites with very large catalogs, preventing the spidering of more than one facet at a time would essentially hack off the long tail and relegate most products to a crawl path made up of low-relevance pagination anchor text rather than keyword-rich facet anchor text. For example, allowing crawlers to access only single-facet pages like “red,” “womens” and “boots” prevents the ‘bots from accessing a page dedicated to “womens red boots,” unless you specifically create or optimize a page for that phrase and allow the bots to crawl it. Introducing the manual element of creating specific landing pages removes the scalable benefit that faceted navigation brings to the long tail of search. The page exists anyway, and users visit it willingly via the faceted navigation. So from an SEO standpoint it makes little sense to cut off those lower-volume but higher-converting long-tail organic visits.
In any case, since we agree that many faceted navigation pages without unique content beyond title tags and headings could easily represent thin, low-quality content that may fall prey to the Farmer/Panda update, what scalable methods do you recommend to create unique, relevant body content?
Sirovich: “Our recommendation is to choose certain combinations of facets that represent high value to the business and the customer, and optimize those specifically as landing pages. The pages take little time to create and may be used both for organic search marketing and pay-per-click campaigns.
“Regardless, these pages need to be indexed, and one must help the robots crawl the right content. Even a landing page created without any added content benefits when the site has an auto-updating XML sitemap, which helps the robots find it.”
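To make that last point concrete, here is a minimal sketch — my illustration, not SEO Egghead’s implementation — of generating an auto-updating XML sitemap from the designated facet landing pages, regenerated whenever pages are added or removed:

```typescript
// Sketch: build an XML sitemap from designated facet landing pages so
// the robots can find them even before added body content exists.
interface LandingPage {
  url: string;
  lastModified: string; // e.g. "2011-11-01"
}

// XML requires "&" in URLs to be escaped as "&amp;".
const escapeXml = (s: string) => s.replace(/&/g, "&amp;");

function buildSitemap(pages: LandingPage[]): string {
  const entries = pages
    .map(
      (p) =>
        `  <url>\n    <loc>${escapeXml(p.url)}</loc>\n` +
        `    <lastmod>${p.lastModified}</lastmod>\n  </url>`
    )
    .join("\n");
  return (
    '<?xml version="1.0" encoding="UTF-8"?>\n' +
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n' +
    `${entries}\n</urlset>`
  );
}

buildSitemap([
  {
    url: "https://example.com/boots?color=red&gender=womens",
    lastModified: "2011-11-01",
  },
]);
```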
How to Build Inbound Links?
Kocher: Link building is another way to increase quality signals to faceted navigation landing pages. What are your recommendations for building links ethically?
Sirovich: “Anyone who promises ‘ethical’ link-building at scale is probably lying to you. The downside of getting caught for ‘unethical’ activity exceeds the benefits. Google succeeded in making automated methods of building links risky and short-lived. Those people we know who are doing it are sophisticated, expensive, and do it in coordination with more ‘ethical’ methods as well.
“When we code less and blog more, we get our share of Diggs and links from various other blogs. It does work. What we worry about is that there are some areas of ecommerce where blogging or link building is pretty much impossible. If you’re selling something boring, what, exactly, can you blog about that is worthwhile or interesting? That’s a discussion for another day.”
Kocher: Search engine optimization and usability professionals tend to have a love/hate relationship. What tips can you give to ease the tension?
Nudelman: “When we speak to UX folks, we describe the robots as just another demographic with special needs. That’s a bit of a stretch, but it fits. Robots have no intent. The goals of search engine optimization and marketing really do parallel those of usability. Both aim to ensure that a particular ‘demographic’ is able to view and understand the website. Both go hand-in-hand with holistic marketing, aiming to connect with users and ensure they tell their friends.
“Interestingly enough, many accessibility recommendations also tend to ease issues for search engine robots. For example, when it comes to introducing sliders, Flex and other fancy user interface controls, accessibility suffers at the same time as SEO. And we find that simply throwing fancy elements at the user interface often creates more problems than it solves, even for folks who do not require improved accessibility. For that reason, many etailers like Amazon will occasionally flirt with fancy user interface controls, but almost always revert to standard HTML controls — like links and checkboxes. At the very least, when creating fancy user interface elements, good designers also provide ‘back door’ accessibility features, which help people with disabilities — as well as robots — access the pages and improve search engine rankings. Some outdated design practices, such as pop-ups and iframes, are considered harmful across the board because they hurt both usability and SEO.
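To illustrate the “back door” pattern, a fancy control can be layered over plain HTML links so that robots and assistive technology still see standard anchors. This sketch is my own example of progressive enhancement, not a description of Amazon’s code:

```typescript
// Progressive-enhancement sketch: the facet list is served as plain
// HTML links (crawlable and accessible); script, when available,
// adds a checkbox-style control without removing the underlying
// href that robots and non-scripted browsers follow.
function enhanceFacetList(list: HTMLElement): void {
  list.querySelectorAll<HTMLAnchorElement>("a[href]").forEach((link) => {
    const checkbox = document.createElement("input");
    checkbox.type = "checkbox";
    checkbox.addEventListener("change", () => {
      window.location.assign(link.href); // same destination as the link
    });
    link.before(checkbox);
  });
}

// If this script never runs, nothing breaks: the links remain
// ordinary anchors that users and search robots can follow.
document
  .querySelectorAll(".facet-list")
  .forEach((el) => enhanceFacetList(el as HTMLElement));
```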
“Lastly, we’d add that the Panda ranking factor itself arms a usability professional with the argument that a website should not totally compromise its usability for the latest fad in ranking factors — not only because it stinks for users, but also because various aspects of usability are judged indirectly as part of Panda. Pointing that out might not ease the tension immediately, but it’s certainly a valid argument.”