Penguin 2.0 Update: Changes You Should Know

June 24, 2013

Posted by Sam Battin, Senior Natural Search Specialist

The Google update known as “Penguin 2.0” finished rolling out on May 22, 2013. The initial rollout targeted the US and other English-speaking countries and had an effect on 2.3% of searches. It has since been applied worldwide.

What Penguin 2.0 Targets

The update primarily targets “black hat” SEO tactics that use backlink networks to artificially inflate Google’s perception of a site’s value and authority. Matt Cutts explains the broad strokes of the update in a video published on May 13, 2013, just before the launch. The table below is based on the contents of that video and expands on each black hat tactic in question, as well as Penguin’s intended response:

“Advertorials”

What the tactic is: Extended text advertisements that promote a specific product or service but are misleadingly presented as editorial or native articles.

How Penguin 2.0 responds: Reduced visibility for pages that violate Google’s published quality guidelines. Sites whose advertorials are not clearly marked as paid content lose visibility.

Link spamming

What the tactic is: Coordinated link networks that point hundreds or thousands of links with specific anchor text at a single page or domain.

How Penguin 2.0 responds: Increased ability to detect “bad neighborhood” links, and additional penalties for sites that benefit from them.

Hacked sites

What the tactic is: Illegal revisions to the code of legitimate websites so that they pass link value to spammy, low-value sites.

How Penguin 2.0 responds: Quicker detection of hacks. Google communicates these hacks to webmasters more clearly and makes the next steps easier.

For the most part, the Penguin 2.0 update only affects sites that use the tactics described above. Sites that do not use these tactics have not been widely affected.
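The link-spamming pattern above often shows up as an unnaturally concentrated anchor-text profile. As a toy illustration only (this is our own heuristic, not Google's actual detection logic, and the anchor lists are invented), a sketch of measuring that concentration:

```python
from collections import Counter

def anchor_text_concentration(anchors):
    """Fraction of backlinks that share the single most common anchor text."""
    counts = Counter(a.strip().lower() for a in anchors)
    return max(counts.values()) / len(anchors)

# A natural profile mixes branded, bare-URL, and generic anchors;
# a spammed profile repeats one "money" keyword over and over.
natural = ["acme widgets", "acmewidgets.com", "click here", "homepage", "acme"]
spammed = ["buy cheap widgets"] * 40 + ["acme"] * 2

print(anchor_text_concentration(natural))  # 0.2
print(anchor_text_concentration(spammed))  # ~0.95
```

A legitimate site rarely controls the exact wording of its inbound links, so a profile where nearly every anchor repeats one commercial phrase is a classic signature of a coordinated link network.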

The Effect of Penguin 2.0

MozCast displays large-scale ranking fluctuations as a weather report: the higher the “temperature,” the more sites experienced a change in rank. By this measure, one report found Penguin 2.0 similar in effect to last year’s Panda #20 update (9/27/12) but weaker than Penguin 1.0 (4/24/12). The average temperature, or relative fluctuation in site rank, was in the high 60s; if Penguin 1.0 was a hot 93-degree day, then Penguin 2.0 was a more comfortable 80 degrees.

The degree to which the update affected search ranks differed by country. For example, one Australian site registered very high rank turbulence following the roll-out of Penguin 2.0.

The relative strength of a site’s inbound links may vary widely from country to country, and this could account for the higher degree of recorded turbulence in this instance. Google applies more value to links that come from within a site’s declared country. As a result, “bad neighborhood” link networks may have a relatively larger impact on site rankings in countries that have fewer total websites.

For Webmasters Who Were Affected by Penguin 2.0

According to most reports, the biggest losers in the wake of the Penguin 2.0 update were sites that used the black hat tactics it targets. Very few legitimate websites reported steep drops in rank. Small and medium sites with few links from relevant pages but large numbers of links from irrelevant pages were at the highest risk of dropping in rank.

In the months that preceded the update, Google continued its efforts to improve communication with webmasters. Among these, the most important tool for webmasters affected by Penguin 2.0 is Google’s “Disavow Links” tool.

The “disavow links” tool represents an evolution in Google’s relationship with site owners. With it, sites can identify for Google the spammy links that come from bad neighborhoods. By disavowing links, webmasters explicitly state to Google that they do not want whatever value those bad neighborhood links provide. This gives Google’s algorithms additional signals about the reliability and authority of websites in its index. When a certain website is “disavowed” by a large number of site owners, Google can more confidently reduce the value that site’s outbound links pass.
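The disavow tool accepts a plain text file with one entry per line: a full URL to disavow a single linking page, or a `domain:` prefix to disavow every link from a domain, with `#` lines treated as comments. A minimal example (all domains below are placeholders, not real offenders):

```text
# Links we believe are part of a paid "bad neighborhood" network.
# Lines beginning with "#" are comments and are ignored.

# Disavow a single page that links to us:
http://spam-forum.example.com/profile.php?id=42

# Disavow every link from an entire domain:
domain:link-network.example.net
```

Because disavowing is effectively telling Google to discount those links entirely, the file should list only links the webmaster is confident are spammy, after attempts to have them removed have failed.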

For sites affected by Penguin 2.0, disavowing the bad links is the first step toward recovering their original ranks. The next step is an aggressive campaign to obtain good links from relevant sites that bring visitors to your site’s most interesting pages. From Google’s perspective, the high ranks of sites that profited from bad neighborhood links were “unearned,” and those sites must now work to obtain legitimate inbound links from authoritative sites strongly related to their industry.

Penguin 2.0’s Effect on Host Crowding

In addition to fighting spam, Penguin 2.0 also improves the user experience on search engines, and this has implications for site visibility on second-page and later results. In his video, Matt Cutts refers to “host crowding” or “clustering,” which is what happens when different pages from the same website appear multiple times in the same set of results. For example, in a search for “Microsoft PCs,” pages from a single site appeared dozens of times because of its strong relation to the keyword. The Penguin algorithm adjusts the Google results page so that the same site appears only a limited number of times.

Google has decided that multiple results from the same site can be removed from SERPs to improve the user experience. For this reason, the Penguin 2.0 update included “cluster limits,” which reduce the number of times a site appears in one search, even for branded searches. One report found that, following the adjustment, a site will typically appear a maximum of seven times in the top 100 results for a single search.

For example, the branded search “Bed Bath and Beyond Wedding Registry” used to include pages from the retailer’s own site 82 times in the top 100 results. Following the Penguin 2.0 update, the site appears in only seven of the top 100 results.
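The cluster-limit behavior described above can be sketched as a simple filter over an ordered result list. This is our own illustrative Python, not Google's actual algorithm; the seven-per-site cap comes from the report above, and the URLs are invented to mirror the 82-of-100 example:

```python
from collections import defaultdict
from urllib.parse import urlparse

def apply_cluster_limit(results, max_per_host=7):
    """Keep results in rank order, but cap how many times any one host appears."""
    seen = defaultdict(int)
    limited = []
    for url in results:
        host = urlparse(url).netloc
        if seen[host] < max_per_host:
            seen[host] += 1
            limited.append(url)
    return limited

# Hypothetical top 100: one retailer dominates with 82 results,
# the remaining 18 come from distinct sites.
results = ["https://example-store.com/page%d" % i for i in range(82)]
results += ["https://other-site%d.com/" % i for i in range(18)]

limited = apply_cluster_limit(results)
print(sum(1 for u in limited if "example-store.com" in u))  # 7
print(len(limited))  # 25
```

Note that the filter preserves rank order: the dominant site keeps its seven best-ranked pages, and everything below its cap simply drops out, leaving room for other domains.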

The revision to search results pages means sites have an increased ability to compete for branded terms. For a brand to maintain “real estate” in the search results, there will be an increased emphasis on managing its off-site presence, such as official Facebook pages, Twitter feeds, and so on. A closer relationship between a brand and its affiliates and partner sites can ensure that these “approved” sites appear in branded search results. It is important to remember, however, that the majority of click-throughs happen on the first page of results, and most of those go to the pages ranked in the top three.

Help for “Industry Authorities”

Google has improved its ability to identify when content is produced by a recognized authority in a subject area. The Penguin 2.0 update helps webmasters by giving higher ranks to content its algorithms recognize as authoritative and appropriate for users. At this time it is unclear how Google classifies content authorities in different industries, but we have seen some evidence that established brands that have worked for years in the same vertical have improved their ranks following the Penguin 2.0 rollout.

Penguin 2.0’s Effect on Local Search

Google has not announced any specific effect of Penguin 2.0 on local search, but some have noticed an increase in local results for generic search terms around the time this update was rolled out. For example, generic queries like “hockey,” “chocolate,” and “computers” included substantially more pages for local businesses and organizations in the top results.

It is not clear how long these local results have been appearing, and they may have preceded the rollout. Google could be measuring how including local results affects click-through rates, in which case “normal” results may return shortly. Beyond that, our research does not conclusively link the change in local search results to the Penguin 2.0 update.
