Google Plans SEO Over-Optimization Penalty

Matt Cutts

Google’s head spam cop Matt Cutts announced the impending launch of a new over-optimization penalty to “level the playing ground.” The disclosure came earlier this month at the South By Southwest (SXSW) conference in Austin, Texas during an open panel — entitled “Dear Google & Bing: Help Me Rank Better!” — with Google’s and Bing’s webmaster and web spam representatives. Google’s goal for the penalty is to give sites that have produced great content a better chance to rank and drive organic search traffic and conversions.

Pretty much every site owner can look at the search results for a cherished trophy phrase and point out at least one site that just shouldn’t be allowed to rank. Competitive ire aside, some sites have poor content but focus extra hard on their search engine optimization. These sites are easy to spot. They usually have a keyword domain, lots of keyword-rich internal linking, and heavily optimized title tags and body content. Their link portfolios are heavily optimized as well. But their content is weak and their value proposition is low; to human observers, they are obviously ranking only because of their SEO. The upcoming over-optimization penalty would, in theory, change the playing field so that sites with great content and higher user value rank above sites with excessive SEO.

What Qualifies as Over-Optimization?

No one but Google knows what, exactly, constitutes “over-optimization.” However, Cutts did mention that Google is looking at sites by “people who sort of abuse it whether they throw too many keywords on the page, or whether they exchange way too many links, or whatever they’re doing to sort of go beyond what a normal person would expect in a particular area.” Keyword stuffing and link exchanges are widely believed to be spam signals in Google’s algorithm already, so either Google intends to ratchet up the penalty or dampening those signals trigger algorithmically, or it has new over-optimization signals in mind as well.
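To make the keyword-stuffing signal concrete, here is a minimal sketch in Python of the textbook form of the check: measure how much of a page’s body copy is consumed by repetitions of a single phrase. It is purely illustrative. The 6 percent threshold and the sample copy are my own assumptions; Google has never disclosed how, or at what level, it measures stuffing.

import re

def keyword_density(text, phrase):
    """Fraction of the words in `text` consumed by occurrences of `phrase`."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    target = phrase.lower().split()
    n = len(target)
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == target)
    return hits * n / len(words)

body = "Flannel shirts! Buy flannel shirts. Our flannel shirts beat all flannel shirts."
if keyword_density(body, "flannel shirts") > 0.06:  # assumed threshold, not Google's
    print("page looks stuffed for this phrase")

A real system would weigh far more than raw density (placement, markup, synonyms, intent), but a density cutoff is the classic starting point for this kind of signal.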

5 Signals that Should Qualify as Over-Optimization

Because I can’t believe that the bits Cutts referenced are all there is to the over-optimization update, I’ve been daydreaming about what I would classify as over-optimization. Keep in mind that I have no inside knowledge of what Google is planning. In other words, don’t run out and change all of these things just because you read this article. But these tactics are on my list because they leave a bad taste in my mouth when I come across them, and I sure hope they’re on Cutts’ list as well. A rough sketch of how a couple of them might be computed appears below the list.

  • Linking to a page from that same page with optimized anchor text. If the page is www.jillsfakesite.com/flannel-shirts, and in the body copy of that page I link the words “flannel shirts” to the same page the words are on (i.e., www.jillsfakesite.com/flannel-shirts), that should count as over-optimization.
  • Linking repeatedly from body copy to a handful of key pages with optimized anchor text. If 33 of my 100 pages link to www.jillsfakesite.com from the body copy with the anchor text “Jills Fake Site,” that should count as over-optimization.
  • Changing the “Home” anchor text to your most valuable keyword. Usually the home link is the site’s logo. But in the cases where the home link is textual and has been optimized with the juiciest keyword, that should count as over-optimization.
  • Overly consistent and highly optimized anchor text on backlinks. If 10 of the 100 links to a page contain the same highly optimized anchor text, such as “Jill’s Fake Site, the Fakest Site Selling Flannel Shirts on the Web,” that should count as over-optimization.
  • Generic keyword domain names. They have way too much impact on rankings and need to be demoted in importance. I’m sure it’s difficult to determine which words are generic and which are brands, but Google seems to have cracked that nut at least partially with its related-brands results. Surely it must be close to understanding the difference between the non-branded domain littleblackdress.com and the brand whitehouseblackmarket.com.

Google’s related stores and brands search results for “little black dress”
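As promised above, here is a minimal Python sketch of how the first two bullets might be computed over a crawled set of pages, with the same idea extending to the fourth bullet if you feed it backlink data instead of internal pages. Everything here is an assumption for illustration — the toy URLs, the anchor text, and the 25 percent concentration threshold; Google has not said how it scores any of this.

from collections import Counter

def self_link_flags(pages):
    """Yield (url, anchor) for body-copy links that point back at the page they sit on."""
    for url, links in pages.items():  # links: list of (href, anchor_text) pairs
        for href, anchor in links:
            if href.rstrip("/") == url.rstrip("/"):
                yield url, anchor

def anchor_concentration(pages, target):
    """Most common anchor text pointing at `target`, and the share of pages that use it."""
    anchors = Counter()
    for url, links in pages.items():
        page_anchors = {a.lower() for href, a in links
                        if href.rstrip("/") == target.rstrip("/")}
        for a in page_anchors:  # count each page at most once per anchor
            anchors[a] += 1
    if not anchors:
        return None, 0.0
    anchor, count = anchors.most_common(1)[0]
    return anchor, count / len(pages)

# Toy two-page crawl: one page links to itself with optimized anchor text,
# and both pages link home with identical anchors.
pages = {
    "www.jillsfakesite.com/flannel-shirts": [
        ("www.jillsfakesite.com/flannel-shirts", "flannel shirts"),
        ("www.jillsfakesite.com", "Jills Fake Site"),
    ],
    "www.jillsfakesite.com/about": [
        ("www.jillsfakesite.com", "Jills Fake Site"),
    ],
}

for url, anchor in self_link_flags(pages):
    print("self-link with anchor %r on %s" % (anchor, url))

anchor, share = anchor_concentration(pages, "www.jillsfakesite.com")
if share > 0.25:  # assumed threshold; Jill's example was 33 of 100 pages
    print("%.0f%% of pages link home with the anchor %r" % (share * 100, anchor))

Run on the toy crawl, this flags the flannel-shirts page for linking to itself and reports that 100 percent of pages use the identical home-page anchor, the sort of uniformity a human would never produce naturally.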

So there you have it, my five least favorite over-optimization tactics, all of which I hope become algorithmic spam signals. Cutts’ transcribed comments on the penalty are below, but it’s worth going to the “Dear Google & Bing: Help Me Rank Better!” session page to listen to the entire recording. You’ll find the transcribed tidbit 16:09 into the hour-long audio clip.

According to Matt Cutts, “Normally we don’t preannounce changes, but there is something we’ve been working on in the last few months and hopefully in the next couple of months or so, you know, in the coming weeks, we hope to release it. And the idea basically is to try to level the playing ground a little bit. So all of those people who have sort of been doing, for lack of a better word, over-optimization or overly doing their SEO, compared to the people who are just making great content and trying to make a fantastic site, we want to sort of make that playing field a little bit more level. And so that’s the sort of thing where we try to make the website, the Googlebot smarter, we try to make our relevance more adaptive so that if people don’t do SEO we handle that, and then we also start to look at the people who sort of abuse it whether they throw too many keywords on the page, or whether they exchange way too many links, or whatever they’re doing to sort of go beyond what a normal person would expect in a particular area. And so that is something where we continue to pay attention and we continue to work on it, and it is an active area where we’ve got several engineers on my team working on that right now.”


Jill Kocher



Comments (10)

  1. Ben Acheson March 22, 2012

    It’s about time Google cracked down on cheap and dirty SEO. Have you checked up on what your agency is doing?

    How to Spot a Bad SEO Agency:
    http://www.digivate.com/blog/seo/14-ways-to-spot-a-bad-seo-agency-infographic/

  2. Sudeep Farmvilla March 22, 2012

    Every six months or so, Google’s new Panda update arrives. I no longer care about Panda updates hitting my site; I just stick to the basics of link building.
    Read here: http://asvinstech.in/898/the-basics-of-link-building/

  3. Tom Pick March 22, 2012

    Bullet points #2 and #5 on your list could be problematic. If I have a page about a product on a website and I link to that page using specific text anywhere else on the site where the product is mentioned, that’s just good UX, not necessarily over-optimization or any attempt at manipulation.

    And as for generic keyword URLs – sometimes this is done for valid informational or educational reasons, and other times for search manipulation. Search algorithms will need to take other factors into account in order to rank each type appropriately.

  4. Mike March 22, 2012

    I have to admit, I’ve used some of these tactics before. For example, when optimizing an Arizona Realtor’s website, I named the home page "AZ Home", or at least used that as the title attribute. I can’t quite remember. What I do remember is that it helped my optimization efforts, but I did feel a little dirty inside.

    As far as deep-linking goes, I couldn’t disagree more. Where Jill writes that linking repeatedly to a handful of key pages should be considered over-optimized, I really don’t understand why.

    Let’s say I’m a plastic surgeon and I do six things, or at least overwhelmingly so.

    So my main top menu reads something like [home] [breast augmentation] [lips] [eyes] [etc]

    Because I am a good and responsible blogger and site owner, I write a new post about plastic surgery every single day. It’s my own original thought and original content, with citation links included when I reference someone else’s work.

    All of my pages, whether they are posts or pages, are structured in the same way. To my knowledge, Google can’t tell the difference.

    But if somebody is searching for Beverly Hills breast augmentation (can you imagine nailing that phrase?), I don’t want to confuse Google with other pages, or pages that Google thinks are more important. For example, having a particular blog post on breast augmentation, for whatever reason, rank better than my actual breast augmentation page. It happens.

    So I help to guide Google by indicating, with all of those links, that my main page on this topic can be found HERE, but that all of these articles support that page.

    It’s a way to help Google help its users.

    Should I really be re-thinking my deep-linking strategy?

    Any takers?

  5. Whoisb Whoisbid March 23, 2012

    The problem is that spammy Google Plus pages are starting to appear higher in search results because the site is a trusted domain. Ha! What about that?

  6. Greg Percifield March 27, 2012

    I hope they get it right this time. When the Panda update was released last year, my ecommerce site was all but destroyed. Why? Because Google recognized publisher descriptions as duplicate content. Ever since, I have found unrelated and poor results ranking ahead of us in searches for our primary keyword. What Google failed to understand was that many publishers require their products to be advertised with the descriptions they provide.

    I will be very impressed if this turns that around for us. For a site that has operated consistently for nearly 9 years, it is sad to see the business hurt so badly because Google is testing algorithms. It is sad to see a whole bunch of spam sites appear ahead of our own for our primary ranking keyword.

    Please, Google! Get it right this time!

  7. xtremeux March 28, 2012

    Google’s all-time favorites are relevancy and votes: how relevant your page is to the searched keyword, and how many votes you get from external sources (quality sources, of course).

    I agree with Sudeep above. I just practice natural SEO, like keyword placement in the title, meta description, H1 tag, and content. This builds relevancy between the searched keyword and a page.

    Do some natural linking from relevant forum discussions, articles, and web 2.0 sites; it really works.

    I also ensure that content is unique and written naturally around keywords.

    After the Panda updates, I give more importance to content ownership by adding the author tag in the page head.

    I think if we keep all of the above points in mind while optimizing, we shouldn’t have to worry about Panda updates and their effects.

    Any suggestions or feedback welcome.

    Thanks.

  8. Strilets Irina April 3, 2012

    I’m afraid that soon Google will be overly smart with its algorithm changes :) … although it’s good that they fight spammers.

  9. Stephen Kline May 1, 2012

    Unfortunately, the only websites this will help are the ones with huge SEO budgets, like Amazon. Small businesses that are trying to put up great content are left behind unless they overuse keywords to get ranked. I have spent over $1000 on SEO in the past 6 months and still haven’t moved past the third page, so I might as well be on page 10,000.

  10. Nitesh Ahir July 30, 2012

    I think there should be more than 500 signals to catch spammers. Matt shows us only the major or main ones, the ones already noticed by people; otherwise it would be useless to put them into implementation.

    Google is becoming smarter each and every day. You can even see the changes in organic results; it’s becoming harder and harder for SEO companies. SEO companies should be thankful to Matt and his spam team, because now these companies are so busy with the work needed to achieve results.

    Thanks,
    Nitesh Ahir
