
SEO: Avoiding Penguins and Pandas

Google’s recent penchant for naming major algorithmic updates after animals has the world of search engine optimization sounding more like a zookeeper’s dilemma. But with rumors of an impending Penguin update — see this article from Search Engine Roundtable — ecommerce marketers need to know their Penguins from their Pandas, and how to avoid the ire of both.

Penguin and Panda are both algorithmic updates: each represents changes to the hundreds of signals that Google uses to analyze and rank web pages for its search results, and each is primarily associated with a negative impact on organic search traffic. Algorithmic updates differ from a manual penalty, in which human members of Google’s Web Spam team identify violations of Google’s webmaster guidelines and assess penalties on the offending pages. Because Penguin and Panda act algorithmically, a demoted site that identifies and removes the offending issue should be able to rebound algorithmically as well.

Most of Google’s algorithms are constantly tweaked and updated within the main index. As a result, the impact of these continual updates isn’t felt strongly or suddenly as the algorithms evolve. Penguin and Panda, however, are processed outside of the main index. Consequently, the ranking changes these two algorithms produce arrive in sudden bursts of change to rankings and traffic, lending Penguin and Panda their fearsome reputation.

Penguin: Over-optimization

Google’s Penguin algorithm update — first launched on April 24, 2012, and updated on May 26 — targets sites that have been over-optimized to the point that they violate Google’s webmaster guidelines: sites with large quantities of low-quality links, unusually high proportions of keyword-optimized anchor text, or more familiar tactics such as keyword stuffing and duplicate content. Google has found that these poor quality signals indicate sites that focus too heavily on the wrong kind of search engine optimization, practices intended to manipulate rankings. Rather than actively penalizing the sites that the Penguin algorithm identifies, the update simply removes any positive ranking benefit that the tactics had been supplying to the offending site. As a result, the site loses the power of those tactics, drops suddenly in the rankings, and sees its organic search traffic decrease in turn.

For example, say a site has 100 naturally earned links and works very hard over the next year to boost that number to 500. The 400 new links come from easier-to-acquire methods: reciprocal requests, SEO directories, article sites, comments on blogs that may not be topically relevant, and so on. And since everyone knows that anchor text is powerful, it only makes sense to use strong keywords as the anchor text for those 400 new links, right?

Unfortunately, Penguin can detect this sort of activity as manipulation rather than naturally earned linking. So the site that worked so hard to build those 400 links suddenly loses their power overnight with the next Penguin update. It’s not a penalty; the algorithm has simply caught up with the site. But the impact is the same. That site is now back to relying on its original 100 naturally earned links to pass it authority and popularity, and its rankings and traffic drop.
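To make that anchor-text pattern concrete, here is a minimal, hypothetical audit sketch. It is not anything Google has published; the anchor_text_report function, the data format, and the 30 percent threshold are all illustrative assumptions. It simply flags any single anchor phrase that accounts for an implausibly large share of a backlink profile, the kind of distribution that looks manufactured rather than naturally earned.

```python
from collections import Counter

def anchor_text_report(backlinks, max_exact_match_share=0.3):
    """backlinks: list of (source_url, anchor_text) pairs.

    Flags any single anchor phrase that accounts for more than
    max_exact_match_share of all links. The threshold is an
    illustrative assumption, not a known Google value.
    """
    anchors = Counter(anchor.strip().lower() for _, anchor in backlinks)
    total = sum(anchors.values())
    report = []
    for anchor, count in anchors.most_common():
        share = count / total
        report.append((anchor, count, share, share > max_exact_match_share))
    return report

# The 100-plus-400 scenario from the article: 400 acquired links that all
# reuse the same keyword-rich anchor text, plus 100 varied natural links.
links = [(f"http://directory{i}.example.com", "cheap blue widgets") for i in range(400)]
links += [(f"http://blog{i}.example.com", "Acme Widgets") for i in range(100)]

for anchor, count, share, flagged in anchor_text_report(links):
    marker = "  <-- suspicious" if flagged else ""
    print(f"{anchor!r}: {count} links ({share:.0%}){marker}")
```

Run on the example above, the report immediately flags the repeated keyword anchor at 80 percent of all links. A naturally earned profile tends to spread across brand names, bare URLs, and varied phrases instead.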

Panda: Poor Content

Long before Penguin, however, Google released Panda — here’s the Wired interview — on February 23, 2011. Focused on dampening the rankings of sites that contain shallow content, or content found on other sites, Panda was actually named for the engineer who created it. Panda has since undergone more than a dozen updates, and the impact of each successive refresh has become a mere ripple in the SEO world.

Panda was created because Google’s web spam team felt confident in its ability to detect familiar forms of spam such as keyword stuffing and random gibberish pages laced with keywords. Panda’s purpose was to spot content that wasn’t adding any real value to the Internet and remove its power to rank. Google is tight-lipped about its criteria for judging quality. It claims, however, that it can algorithmically identify, with high accuracy, the sorts of sites that searchers tend to block manually from their own search results using the Chrome Site Blocker.
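As an illustration of the kinds of signals involved, here is a minimal, hypothetical thin-content audit. It is a sketch of two simple heuristics (low word count and near-duplicate pages), not a reconstruction of Panda itself; the audit_pages function, the 3-word shingles, the word-count floor, and the similarity threshold are all illustrative assumptions.

```python
def shingles(text, n=3):
    """Break a page's text into overlapping n-word phrases."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(max(len(words) - n + 1, 0))}

def audit_pages(pages, min_words=200, max_similarity=0.8):
    """pages: dict mapping URL -> body text. Returns audit warnings."""
    warnings = []
    urls = list(pages)
    # Heuristic 1: pages with very little text ("shallow content").
    for url in urls:
        word_count = len(pages[url].split())
        if word_count < min_words:
            warnings.append(f"{url}: thin content ({word_count} words)")
    # Heuristic 2: pairs of pages whose shingle sets overlap heavily
    # (Jaccard similarity), i.e. near-duplicate content.
    for i, a in enumerate(urls):
        for b in urls[i + 1:]:
            sa, sb = shingles(pages[a]), shingles(pages[b])
            if sa and sb:
                jaccard = len(sa & sb) / len(sa | sb)
                if jaccard > max_similarity:
                    warnings.append(f"{a} and {b}: near-duplicates ({jaccard:.0%} overlap)")
    return warnings
```

A real quality system would weigh many more signals, but even these two heuristics surface the shallow and copied pages described above.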

The most important step toward appeasing Penguin and Panda is to focus on higher-value link building and content creation techniques. Pursue long-term content marketing strategies rather than easy link building and content generation tactics designed only to boost your SEO. Yes, it’s harder. Yes, it takes longer. And yes, it looks an awful lot like creating great content and a great experience for real customers. Increasingly, it’s also the only way to truly succeed long term.

Jill Kocher Brown