Google’s “Trust Us” Penalty

During a panel at this year’s South by Southwest, Google’s Matt Cutts mentioned that the search engine will be trying to “level the playing field” by rolling out an “over-optimization” penalty targeting sites that “put too many keywords on a page, exchange way too many links, or whatever else they are doing to go beyond what you normally expect.”

The move is Google’s latest in a string of changes ostensibly intended to improve the quality of sites searchers find when using the search engine.

[dmm_social tweet=”The people most often harmed by Google updates are those who blindly trust Google’s recommendations.” quote=”Unfortunately, the people most often harmed by Google updates are those foolish enough to blindly trust Google’s recommendations in the first place.”]

The Over-Optimization Penalty: OOPs!

While the over-optimization penalty (which we’re affectionately calling the OOPs update) has yet to roll out, Cutts’ description of what Google will be going after is fairly clear and uncontroversial. But the fact of the matter is that most of the people maliciously spamming Google have long since moved beyond these kinds of tactics.

Keyword stuffing hasn’t been a widely used practice for half a decade.

Widespread link exchanges fell out of favor several years ago as well.

The majority of people still using these outdated tactics aren’t evil SEO spammers; they’re business or website owners who are new to the entire concept of SEO!

The people most likely to be hurt by Google’s OOPs update have probably been introduced to these outdated and ineffective SEO tactics via Google’s own search results!

Canonical Tags

Yesterday Google released a video in which Maile Ohye examines what she calls five common SEO mistakes and the alternative methods Google would prefer site owners to use. The third mistake discussed is using rel=canonical tags on subsequent pages that point to their “page one.” As you can see below, Ohye discusses this tactic as if it’s some sort of sneaky or devious trick used by SEOs and highlights the fact that using this method could harm your site by causing Google to drop content out of its index.

So where did webmasters come up with this “time consuming workaround?” Google told them to use it!

As you can see, Matt Cutts specifically mentions navigation breadcrumbs as a situation in which rel=canonical should be used. Unfortunately, as Ohye now points out, that “could cause a loss of content in Google’s index”! Google can now point to the rel=next and rel=prev tags as the current “best practice,” but those were only announced late last year. For over two years Google recommended that site owners use rel=canonical in a way that now appears third on its list of common SEO mistakes.
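To make the contrast concrete, here’s a sketch of the two patterns for a hypothetical paginated archive (the example.com URLs are illustrative, not from either video). The first is the approach Ohye now flags as a mistake; the second is the rel=next/rel=prev markup Google announced in late 2011:

```html
<!-- The discouraged pattern: page 2 of an archive declares page 1 as
     its canonical, telling Google the deeper pages are duplicates of
     page one (they aren't), which can drop their content from the index. -->
<!-- On http://example.com/articles?page=2 -->
<link rel="canonical" href="http://example.com/articles?page=1">

<!-- The currently recommended pattern: declare the pages as a sequence
     so Google treats them as one paginated series. -->
<!-- On http://example.com/articles?page=2 -->
<link rel="prev" href="http://example.com/articles?page=1">
<link rel="next" href="http://example.com/articles?page=3">
```

The markup change itself is trivial; the point is that site owners who implemented the first version were following Google’s own earlier advice.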

The Parton Update

Another recent update announced by Google (dubbed the Parton update) penalized sites that had too many ads above the fold. The rationale was fairly simple: sites that make ads the focus of attention aren’t going to provide a high-quality user experience and thus should be deprioritized in Google’s rankings.

As a searcher, landing on a page and having to sort through a heap of ads just to find the content is frustrating and it’s easy to see why many people have no sympathy for site owners negatively impacted by the Parton update.

Unfortunately for users, Google holds other sites to a higher standard than it applies to its own. Only about 33% of Google searchers see a full organic listing above the fold, while 90% see two large sections of AdWords ads.


And while Google’s search quality team might have a problem with sites placing lots of ads above the fold, the Google AdSense team certainly doesn’t. Google has been asking site owners for years to place AdSense ads above the fold and where they’ll be the focus of attention:

[blackbirdpie url=”!/sugarrae/status/160154296875352064″]

In fact, Google’s now ironically named AdSense “help” section provides several different recommended layouts which could very well earn your site a penalty if implemented.

The Panda Update

Google’s year-old Panda update famously spanked content farms, but only after sending them incredible amounts of traffic and paying them millions in AdSense revenue.

And while many such sites lost significant traffic, virtually every major content farm is still in business today thanks in no small part to their AdSense earnings.

Meanwhile, thousands of sites that weren’t flooding Google with low-quality content continue to suffer from Panda’s effects. Reports of scrapers outranking the sites they’d stolen content from became so common that Google had to roll out an update to try to address the issue.

Many sites that had filtered their activities through the Google-recommended test of “Would I do this if search engines didn’t exist?” are now forced to undertake a plethora of tasks designed specifically to appease Google and regain the traffic they lost to the Panda update.

NoFollow Sculpting

Perhaps no other example better encapsulates Google’s exploitation of webmasters who dutifully follow every guideline and recommendation the search engine makes than the nofollow tag.

Google initially introduced the nofollow tag as a way for bloggers to combat comment spam. The following year, Matt Cutts published an article recommending that site owners expand their use of the tag to any link they didn’t want to vouch for.

In 2007, Matt Cutts expanded the possible uses of the nofollow tag even further by saying “The nofollow attribute is just a mechanism that gives webmasters the ability to modify PageRank flow at link-level granularity.” In clarifying his comments, Cutts reaffirmed his sanctioning saying “why waste that PageRank on a page that wouldn’t benefit users or convert any new visitors?”

Nearly two years later, Cutts discussed the issue again in the Q&A video you see below. In it, Cutts specifically mentions using the nofollow tag for PageRank sculpting purposes and says “I wouldn’t necessarily do it with the nofollow tag, although you can put a nofollow on a login page, or something that’s customized where a robot will never login for example.”

While that’s hardly a ringing endorsement of the practice, it wasn’t a condemnation of it either. In fact, Cutts mentions specific examples of where one might use it.
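For reference, every use of nofollow Cutts described over the years comes down to the same link-level attribute. A rough sketch of the three patterns (the URLs and anchor text here are illustrative, not taken from Cutts’ examples):

```html
<!-- Original 2005 use: decline to vouch for user-submitted links
     in blog comments -->
<a href="http://spammy-example.com" rel="nofollow">commenter’s link</a>

<!-- The use Cutts mentions above: a page a robot can never
     meaningfully use, such as a login form -->
<a href="/login" rel="nofollow">Log in</a>

<!-- PageRank sculpting (quietly rendered ineffective by Google’s
     change): nofollow internal links to "unimportant" pages in hopes
     of funneling PageRank toward pages you want to rank -->
<a href="/terms-of-service" rel="nofollow">Terms of Service</a>
```

The markup for all three is identical; only the intent differs, which is exactly why webmasters had no way of knowing the third pattern had stopped working until Cutts said so.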

But just a few days later, Cutts appeared on a panel at SMX and stated that, more than a year prior, Google had changed the way the nofollow tag worked and that it was no longer useful for sculpting PageRank.

For over a year sites needlessly flushed PageRank, time, and money down the drain because Google didn’t see fit to update their recommendations.

Remember The Past

George Santayana famously proclaimed that “those who cannot remember the past are condemned to repeat it.”

[dmm_social tweet=”Google’s recommendations and best practices have proven to be incredibly fickle and self-serving.” quote=”Over the years Google’s recommendations and best practices have proven to be incredibly fickle and self-serving.”]

A strategy or practice they recommend one day shows up in a list of common SEO mistakes the next.

One day a layout is recommended to help you make them more money through AdSense clicks; the next, it has become grounds for a penalty.

The web’s history is littered with the shells of websites that fell victim to Google’s “Trust Us” penalty. Keep that past in mind as Google announces upcoming changes, lest you be condemned to repeat it.

image source: kennymatic


  1. says

    Rather than going through the minutia of each Google update, many miss their overall message, which has remained consistent. Create great content aimed at your visitor and you’ll rank higher. It’s been said over and over again. Google is going out there and letting us know the big picture, as well as letting us know that the micro picture will change. They are also cluing us in to the fact that they are rewarding websites for using their products, be that +1, YouTube or Webmaster Central. Common sense also plays into the picture. Don’t blindly follow instructions; those will change. Design for the future, be consistent and you’ll be rewarded. Nothing surprising here. Just confirming.

  2. says


    Really enjoyed the post; great to see someone highlight the contradictions that either Google or Google employees make. The ‘parton update’ is a classic, site owners are damned if they do, and damned if they don’t. Personally I’d like to see Bing gain more share; then people will not have to worry about catching a cold if Google sneezes.

  3. Ben Cook says

    Brad, the only problem is that it’s NOT just about creating great content anymore. Google used to say don’t do anything that you wouldn’t do if search engines didn’t exist. Now they’re telling you to do things like place +1 buttons on your site, mark up your site in a way that makes it easier for them to consume your data and then cut you out of the picture while keeping searchers on their site instead of yours.

    As I mentioned in the post, many sites that “just made great content” got smacked by Panda and now have to spend time and money jumping through all the hoops that entails.

  4. Grace Morris says

    Maybe I’m only reading into part of the issue. I’m not quite connecting the contradiction between Maile’s video with Matt’s 2009 video. Does it just boil down to the overuse and over exaggeration of the canonical thus the “over optimization penalty”?

  5. Ben Cook says

    Grace, Matt’s video tells people to use rel=canonical for the same situation Maile’s video claims is an SEO mistake that can cause a loss of content in Google’s index.

  6. says

    …and not only that, she pits “implementing time-intensive workarounds” against “researching better, up-to-date techniques” as a false choice. How stupid and inefficient of us to spend all this time hacking our site together when we could be using the latest and greatest advice! But the work-around was implemented last year upon Google’s recommendation and no longer requires any time on our part (and wasn’t the theoretical beauty of SEO coding “best practices” that we could almost set it and forget it?), whereas staying up to date on (and implementing) all the newest markup tags they just threw at us does actually require a lot of time and effort.

    The days of the “make sites for visitors, not search engines” rhetoric are clearly over. Visitors don’t and never will care about rel tags, or how you submit XML sitemaps, or G’s +1 buttons.

  7. says

    Hey Ben,

    What an article – totally agree with you on this one – Things are not anymore only about great content, I know too many people now whose website with “great content” are lost in the middle of Google flow of information, not showing anywhere and asking why some website with not such great content and other spammy MFAs are trusting the SERPs in their niche.

    More and more Google updates have been proven to serve Google itself, and this pattern has been even more obvious since Larry Page took back command of the Google ship (https update, removal of many Google lab tools, promotion of Google services at the expense of third party websites – shopping engines for example) and the list goes on and on and on…

    Google behaves like 1990s Microsoft, but they should be more careful because someday it’s going to bite back

  8. says

    I think the point regarding the ads above the fold is hypocritical of Google. However, Google’s advice does have to evolve and try things that may not work.

  9. Richard says

    Making great content is for fools. It doesn’t attract links. Linkbait attracts links – political, controversial, above all populist content.

    Ranking is a popularity contest. Links.

    Great content e.g. medical sources is routinely pages behind the same scraped content on authority domains, or rehashed but ‘fresh’ content. Find top-quality content and there will be pages and pages of SERPS using that content before you find the original source, even if it’s an .edu page.

    You might also want to figure out the cost implications of so-called ‘great content’ for an e-commerce site with thousands of products, bearing in mind that no matter how ‘great’ the content for an individual product, it won’t attract links from customers and it WILL be scraped. For an e-commerce site on thin margins great content isn’t an option, even if it paid. For an e-commerce site of a certain size e.g. not a niche market there isn’t the time or money to throw at ‘great content’ and it becomes a matter of regular ‘good enough’ content to show that the site is alive, off-line marketing and a few low-hanging links (directories, industry bodies).

    Great content simply doesn’t attract links. Everyone wised up to the power of links five-seven years ago and now it’s nepotistic linking, micro sites, purchase of old domains for their authority and inbound links profile, blah blah blah.

    Read around, don’t give crap advice.


  1. […] During the 2012 SXSW, Google announced an upcoming update to level the playing field of “overly optimized” website versus website with great content.  We do not know if there is an official name for this update of “Over Optimization Penalties”, so for lack of better acronym I will refer to it as the “OOPs” update in this article.  Ben Cook also dubbed this the OOPs update ( […]