During a panel at this year’s South by Southwest, Google’s Matt Cutts mentioned that the search engine will be trying to “level the playing field” by rolling out an “over-optimization” penalty targeting sites that “put too many keywords on a page, exchange way too many links, or whatever else they are doing to go beyond what you normally expect.”
The move is Google’s latest in a string of changes ostensibly intended to improve the quality of sites searchers find when using the search engine.
[dmm_social tweet=”The people most often harmed by Google updates are those who blindly trust Google’s recommendations.” quote=”Unfortunately, the people most often harmed by Google updates are those foolish enough to blindly trust Google’s recommendations in the first place.”]
The Over-Optimization Penalty: OOPs!
While the over-optimization penalty (which we’re affectionately calling the OOPs update) has yet to roll out, Cutts’ description of what Google will be going after is fairly clear and uncontroversial. The problem is that most of the people maliciously spamming Google have long since moved beyond these kinds of tactics.
Keyword stuffing hasn’t been a widely used practice for half a decade.
Widespread link exchanges fell out of favor several years ago as well.
The majority of people still using these outdated tactics aren’t evil SEO spammers; they’re business and website owners who are new to the entire concept of SEO!
The people most likely to be hurt by Google’s OOPs update have probably been introduced to these outdated and ineffective SEO tactics via Google’s own search results!
Yesterday Google released a video in which Maile Ohye examines what she calls five common SEO mistakes and the alternative methods Google would prefer site owners to use. The third mistake discussed is using rel=canonical tags on subsequent pages that point to their “page one.” As you can see below, Ohye discusses this tactic as if it were some sneaky or devious trick used by SEOs and highlights the fact that using it could harm your site by causing Google to drop content out of its index.
So where did webmasters come up with this “time-consuming workaround”? Google told them to use it!
As you can see, Matt Cutts specifically mentions navigation breadcrumbs as a situation in which rel=canonical should be used. Unfortunately, as Ohye now points out, that “could cause a loss of content in Google’s index”! Google can now point to the rel=next and rel=prev tags as the current “best practice,” but those were only announced late last year. For over two years Google recommended that site owners use rel=canonical in a way that is now listed third in its list of common SEO mistakes.
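For readers unfamiliar with the markup in question, here’s a rough sketch of the two approaches (the example.com URLs and page structure are hypothetical, just to illustrate the difference):

```html
<!-- The pattern Google originally recommended and Ohye now lists as a mistake:
     each paginated page claims "page one" as the canonical version.
     Placed in the <head> of a hypothetical example.com/articles?page=2 -->
<link rel="canonical" href="http://example.com/articles" />

<!-- The current recommendation, announced late last year: declare the
     pagination sequence instead. Also in the <head> of page 2. -->
<link rel="prev" href="http://example.com/articles" />
<link rel="next" href="http://example.com/articles?page=3" />
```

With the first pattern, Google may treat pages 2 and beyond as duplicates of page one and drop their content from the index; with the second, the pages are understood as a series.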
The Parton Update
Another recent update (dubbed the Parton Update) announced by Google penalized sites that had too many ads above the fold. The rationale was fairly simple: sites that make ads the focus of attention aren’t going to provide a high-quality user experience and thus should be deprioritized in Google’s rankings.
As a searcher, landing on a page and having to sort through a heap of ads just to find the content is frustrating, and it’s easy to see why many people have no sympathy for site owners negatively impacted by the Parton update.
Unfortunately for users, Google holds other sites to a higher standard than it applies to its own pages. Only about 33% of Google searchers see a full organic listing above the fold, while 90% see two large blocks of AdWords ads.
And while Google’s search quality team might have a problem with sites placing lots of ads above the fold, the Google AdSense team certainly doesn’t. Google has spent years asking site owners to place AdSense ads above the fold, where they’ll be the focus of attention:
The Panda Update
Google’s year-old Panda update famously spanked content farms, but only after sending them incredible amounts of traffic and paying them millions in AdSense revenue.
And while many such sites lost significant traffic, virtually every major content farm is still in business today thanks in no small part to their AdSense earnings.
Meanwhile, thousands of sites that weren’t flooding Google with voluminous amounts of low-quality content continue to suffer from Panda’s effects. Reports of scrapers outranking the sites they’d stolen content from became so common that Google had to roll out an update to try to address the issue.
Many sites that had filtered their activities through the Google-recommended filter of “Would I do this if search engines didn’t exist?” are now forced to undertake a plethora of tasks designed specifically to appease Google and regain the traffic they lost to the Panda update.
Perhaps no other example better encapsulates Google’s exploitation of webmasters who dutifully follow every guideline and recommendation the search engine makes than the nofollow tag.
Google initially introduced the nofollow tag as a way for bloggers to combat comment spam. The following year, Matt Cutts published an article recommending site owners expand the use of the tag to include any link they didn’t want to vouch for.
In 2007, Matt Cutts expanded the possible uses of the nofollow tag even further by saying “The nofollow attribute is just a mechanism that gives webmasters the ability to modify PageRank flow at link-level granularity.” In clarifying his comments, Cutts reaffirmed his sanctioning saying “why waste that PageRank on a page that wouldn’t benefit users or convert any new visitors?”
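To make the progression concrete, here’s a sketch of both uses of the tag (the URLs are hypothetical placeholders):

```html
<!-- The original use: a blogger declining to vouch for a
     user-submitted link left in a comment. -->
<a href="http://example.com/commenters-site" rel="nofollow">commenter’s link</a>

<!-- The later "PageRank sculpting" use Cutts sanctioned: keeping PageRank
     from flowing to a page that wouldn’t benefit users or convert
     visitors, such as a login page. -->
<a href="/login" rel="nofollow">Log in</a>
```

The markup is identical in both cases; only the intent changed, which is precisely why webmasters had no way of knowing when Google quietly altered how the second use behaved.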
Nearly two years later, Cutts discussed the issue again in the Q&A video you see below. In it, Cutts specifically mentions using the nofollow tag for PageRank sculpting purposes and says “I wouldn’t necessarily do it with the nofollow tag, although you can put a nofollow on a login page, or something that’s customized where a robot will never login for example.”
While that’s hardly a ringing endorsement of the practice, it wasn’t a condemnation of it either. In fact, Cutts mentions specific examples of where one might use it.
But just a few days later, Cutts appeared on a panel at SMX and stated that, more than a year prior, Google had changed the way the nofollow tag worked and that it was no longer useful for sculpting PageRank.
For over a year sites needlessly flushed PageRank, time, and money down the drain because Google didn’t see fit to update their recommendations.
Remember The Past
George Santayana famously proclaimed that “those who cannot remember the past are condemned to repeat it.”
[dmm_social tweet=”Google’s recommendations and best practices have proven to be incredibly fickle and self-serving.” quote=”Over the years Google’s recommendations and best practices have proven to be incredibly fickle and self-serving.”]
A strategy or practice they recommend one day shows up in a list of common SEO mistakes the next.
One day a layout is recommended to help you make Google more money through AdSense clicks, and the next it has become grounds for a penalty.
The web’s history is littered with the shells of websites that fell victim to Google’s “Trust Us” penalty. Keep that past in mind as Google announces upcoming changes, lest you be condemned to repeat it.
image source: kennymatic