Performics’ POV: Google Content Farm Update


OVERVIEW

A recent Google ranking algorithm change—now widely referred to as the “Farmer Update”—has led to a shuffling of the deck in Google’s search engine results pages, as “content farms” (i.e. content aggregation and scraper sites) have been punished with negative ranking changes and even removal from the search index. Google has decided to target a slew of sites in an effort to clean up its search results by weeding out sites it feels “ride the coattails” of others. By repurposing content, scraping it with automated programs or publishing shallow, low-quality content, these sites attempt to improve their visibility and ranking—and violate Google’s Webmaster quality guidelines. This document provides a timeline of the events that led to the algorithm change, discusses what has happened since, explains why Performics’ practices remain unaffected and introduces strategies that have helped Performics’ client roster steer clear of content-related algorithmic shifts.

PRELUDE TO THE UPDATE

On Jan. 21, 2011, Google Principal Engineer Matt Cutts wrote a post on the Official Google Blog titled Google Search and Search Engine Spam. In it, Cutts discusses webspam and how Google has improved its ability to detect spam on Web pages. He then shifts his focus to content farms and states:

As “pure webspam” has decreased over time, attention has shifted instead to “content farms,” which are sites with shallow or low-quality content. In 2010, we launched two major algorithmic changes focused on low-quality sites. Nonetheless, we hear the feedback from the web loud and clear: people are asking for even stronger action on content farms and sites that consist primarily of spammy or low-quality content.

One week later, Cutts announced on his personal blog that an algorithm change had been approved, adding that “slightly over 2% of queries change in some way, but less than half a percent of search results change enough that someone might really notice.”

As it turns out, the algorithm change largely went unnoticed in the mainstream, but that didn’t stop search engine startup Blekko from making headlines three days later on Jan. 31, when it published a list of 20 sites—each deemed a source of webspam by Blekko—that it was banning from its search results. Google responded two weeks later on Feb. 14, announcing on its Official Blog that a new Chrome browser extension would allow searchers to block sites from their personalized search results. The extension would benefit users by providing more unique, relevant results and benefit Google by sending it blocked-site data for further analysis, with the search engine adding, “[W]e will study the resulting feedback and explore using it as a potential ranking signal for our search results.”

Google didn’t need much time to study the results: just 10 days later, on Feb. 24, Cutts and Google Fellow Amit Singhal published a post on the Official Blog titled Finding More High-Quality Sites in Search, introducing what’s now being called the “Farmer Update.” Within the post, Google describes the change as:

a pretty big algorithmic improvement to our ranking—a change that noticeably impacts 11.8% of our queries—and we wanted to let people know what’s going on. This update is designed to reduce rankings for low-quality sites—sites which are low-value add for users, copy content from other websites or sites that are just not very useful. At the same time, it will provide better rankings for high-quality sites—sites with original content and information such as research, in-depth reports, thoughtful analysis and so on.

AFTEREFFECTS: THE HITS

Many experts across the SEO community—including news outlets and well-known bloggers, along with site owners who have been directly or even indirectly affected—have responded to the algorithm change with vigor in blog posts, articles and across social media sites and message boards. The hardest-hit sites appear to be the content aggregators and scrapers—many previously identified on the aforementioned Blekko top-20 “source of webspam” list—that may have been on Google’s radar since hints of the algorithm change first became public knowledge. Negative ranking data have been shared in recent days by a variety of online sources, and most indications are that popular sites like ezinearticles.com, associatedcontent.com, articlesbase.com, mahalo.com and business.com have experienced significant visibility and ranking declines. Interestingly enough, some of the shared ranking data has also indicated “winners”—i.e. sites that have benefited traffic- and ranking-wise from the algorithmic misfortunes of others—and topping that list are sites like amazon.com, wikipedia.com, youtube.com, ebay.com and facebook.com.

What the algorithm change has clearly indicated about the most affected sites is that their content either isn’t unique or isn’t relevant and valuable enough to rank as highly as it had been for a variety of targeted (and sometimes unrelated) queries. It’s also likely that many sites have been penalized for link portfolios that include backlinks from sites Google deems content farms; whether those penalized were aware of the legitimacy of some of their backlinks is unclear.

AFTEREFFECTS: THE MISSES

Changes to the algorithm are never perfect, and this appears to be the case with the latest tweak. Just as sites labeled as content farms have been vilified over the past several days, on the other end of the spectrum is a bevy of sites that claim to produce original, relevant content worthy of being displayed in Google’s results. One example that has appeared in a multitude of articles is cultofmac.com, which according to its “About” page is a “daily news website that tracks Apple and the people who use its products.” On the surface, the site sounds like countless thousands of other online offerings that those interested in a particular niche visit regularly for fresh, relevant content they’re passionate about. Cultofmac.com editor Leander Kahney expressed his frustrations to wired.com in a March 1 interview that also contains quotes from Google Fellow Singhal. In the article, Singhal admits that “no algorithm is 100 percent accurate,” adding that “any time a good site gets a lower ranking or falsely gets caught by our algorithm—and that does happen once in a while even though all of our testing shows this change was very accurate—we make a note of it and go back the next day to work harder to bring it closer to 100 percent.”

It will be interesting to see in the coming days and weeks whether Google re-tweaks the algorithm to correct any of the wrongs that have been pointed out since the initial change. Furthermore, the change has yet to target sites outside the United States, which means its ripple effect hasn’t fully spread across the Web—yet. It could take weeks or even months for everything to shake out.

HOW TO SURVIVE CONTENT-RELATED ALGORITHMIC SHIFTS

The content farm update has not had a direct impact on Performics’ client roster. This is in part due to our years of SEO knowledge, the best practices we follow and preach, and the techniques we use to continually push our clients toward their KPIs and engagement goals. Our Four Pillars of SEO—indexation, optimization, link building and distribution—remain as important and safe as ever when executed to our high standards of compliance, and we intend to keep it that way. However, any site owner—regardless of how much or how little SEO knowledge they possess—can be penalized by Google just as quickly as they’re rewarded. To avoid the fate many of the sites mentioned here are facing, we recommend the following strategies:

Look in the mirror

Before you switch gears to focus on new ways to generate content, it’s important to audit your site to learn whether the information you currently present to search engines and visitors meets your own brand standards and guidelines. Writing content “just to write content” is never a good strategy: if a searcher lands on a page that has a blurb of “original” content but doesn’t fill a need or match their expectations, they’ll probably go elsewhere. Search engines aren’t likely to assign much authority to the page either, and if a competitor site exists (and it will) that offers richer, more valuable content on the same topic, that site is likely to perform better in terms of visibility and ranking.
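
For sites with large page counts, even a rough script can speed up this kind of self-audit. The sketch below, in Python, simply fetches a list of your own pages and flags any whose visible text falls under a word-count threshold. The URLs and the 250-word cutoff are hypothetical placeholders, and word count is only a crude proxy for depth, so treat flagged pages as candidates for human review rather than automatic failures.

# Minimal self-audit sketch: flag pages whose visible text is thin.
# The URLs and the 250-word threshold below are hypothetical placeholders.
from html.parser import HTMLParser
from urllib.request import urlopen


class TextExtractor(HTMLParser):
    """Collects visible text, skipping <script> and <style> blocks."""

    def __init__(self):
        super().__init__()
        self.skip = False
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip = True

    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self.skip = False

    def handle_data(self, data):
        if not self.skip:
            self.chunks.append(data)


def word_count(url):
    html = urlopen(url, timeout=10).read().decode("utf-8", errors="ignore")
    parser = TextExtractor()
    parser.feed(html)
    return len(" ".join(parser.chunks).split())


PAGES = ["https://www.example.com/", "https://www.example.com/category/widgets"]
THIN_THRESHOLD = 250  # words; tune this to your own brand standards

for page in PAGES:
    count = word_count(page)
    flag = "REVIEW" if count < THIN_THRESHOLD else "ok"
    print(f"{flag:6} {count:5d} words  {page}")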

Create unique content

When you add content to or create new pages on your site, ask yourself the following: If I perform a search using keywords that target this content, does this page provide me with exactly what I’m looking for?

A lack of unique content, or presenting what Cutts refers to as “shallow or low-quality content,” is what landed some sites in hot water in the first place. The lesson here is this: create unique, relevant content—as much as you can—and don’t stray from that path. As tempting as it may sound to grab content from here and content from there, what works best algorithmically is content that you and only you can call your own. But don’t stop there—an important follow-up step is distributing that content around the Web and having it link back to your site (more on this in a minute) from reputable social networks, partner sites, sites that offer related content, industry message boards, directories and so on.

Additionally, it’s important to keep tabs on your original content to make sure other sites aren’t using it for their own benefit. This can of course be a tall order if you produce a lot of content, and given the size of the Web, it doesn’t take long for content to be regurgitated time and time again. There are plenty of online plagiarism tools—some free—that can match instances of duplicate sentences and phrases. Google Alerts is another option: type an original phrase into the “Search terms” box and have the tool e-mail you if and when that exact phrase appears in Google search results.
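
If you would rather spot-check a suspected copy yourself, a simple text-similarity script can help. The sketch below is a minimal, illustrative approach rather than a polished tool: it breaks your original copy and the suspect page’s text into five-word “shingles” and reports the overlap. The file names, shingle size and 50% threshold are assumptions to adjust for your own content.

# Minimal duplicate-content check: compare your original copy against text
# taken from a suspect page using word "shingles" and Jaccard similarity.
# File names, the five-word shingle size and the 0.5 threshold are assumptions.
import re


def shingles(text, size=5):
    words = re.findall(r"[a-z0-9']+", text.lower())
    return {" ".join(words[i:i + size]) for i in range(len(words) - size + 1)}


def jaccard(a, b):
    return len(a & b) / len(a | b) if a | b else 0.0


original = open("my_article.txt", encoding="utf-8").read()
suspect = open("scraped_copy.txt", encoding="utf-8").read()

score = jaccard(shingles(original), shingles(suspect))
print(f"Shingle overlap: {score:.0%}")
if score > 0.5:
    print("Likely a copy -- consider a takedown request or a spam report.")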

Keep tabs on your link portfolio

Well-known sites like overstock.com and jcpenney.com have been all over the news of late for taking part (whether knowingly or not) in suspicious link building practices that generated links to their sites from less-than-qualified sources. Just as unique, relevant content is important to successful SEO, a strong link portfolio can often be what separates one site from the next where search engine visibility and ranking are concerned. Since any site on the Web can link to another without permission, it’s important to keep one eye on your link portfolio at all times to make sure the links pointing to your site are legitimate. Link quantity is certainly part of building a solid portfolio, but don’t forget about link quality, anchor text and the context of the linking site.
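
Most backlink tools can export your link portfolio to a spreadsheet, and even a short script can surface the patterns worth a closer look. The sketch below assumes a hypothetical CSV export with source_url and anchor_text columns—adjust the file name and column names to whatever your tool actually produces. It flags domains that supply an outsized share of your links and anchor text that repeats suspiciously often, both of which deserve a manual review.

# Minimal backlink-portfolio review: summarize a CSV export of backlinks by
# linking domain and flag heavy concentrations and repeated anchor text.
# The file name and its "source_url"/"anchor_text" columns are assumptions;
# match them to the export format of your own backlink tool.
import csv
from collections import Counter
from urllib.parse import urlparse

domains = Counter()
anchors = Counter()

with open("backlinks_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        domains[urlparse(row["source_url"]).netloc] += 1
        anchors[row["anchor_text"].strip().lower()] += 1

total = sum(domains.values()) or 1

print("Domains supplying more than 5% of all backlinks:")
for domain, count in domains.most_common():
    if count / total > 0.05:
        print(f"  {domain}: {count} links ({count / total:.0%})")

print("Most repeated anchor text (possible over-optimization):")
for anchor, count in anchors.most_common(5):
    print(f"  {count:4d}  {anchor!r}")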

Know Google Webmaster Tools’ quality guidelines – and its features

Google Webmaster Tools is a free service that any site owner can use to track a wealth of information, including top search queries, crawl errors, links, keywords and much more. Its help section includes the Webmaster Guidelines, which help Google find, index and rank your site. Google itself says it “strongly encourages you to pay very close attention to the ‘Quality Guidelines,’ which outline some of the illicit practices that may lead to a site being removed entirely from the Google index or otherwise penalized.” Of particular significance to the content farm update are the quality guidelines that target practices like loading pages with irrelevant keywords (i.e. spamming), duplicating content and creating pages with little or no original content.

Webmaster Tools also has a “Spam report” feature that gives site owners the opportunity to report any suspicious (e.g. spammy) search results they uncover. Google says it investigates every report and that severe cases can lead to the removal of sites from the index. If your site (or a selection of its pages) has been removed from Google’s search results, Webmaster Tools offers a “Request reconsideration” feature that allows you to contact the search engine directly. Before you submit, Google asks you to provide detailed information about what happened, what corrective actions have been taken since, and so on. It can take several weeks for Google to re-evaluate a site, so be sure to provide as much information as you can, especially if you maintain that your site uses (and always has used) white hat techniques.
