What You Need to Know About Google's Penguin 4.0 Update

By: Mahnoor Awan

On September 23, 2016, Google announced Penguin’s 7th and final update, calling it Penguin 4.0. Penguin is a filter that catches websites that use black hat webspam practices to boost their rankings on Google’s Search Engine Results Pages (SERPs). Black hat webspam practices are aggressive SEO tactics that include techniques like keyword stuffing, invisible text, and paid links, among others. Once the Penguin filter catches a site, Google penalizes it by lowering its rankings.

With this new update, Penguin is now part of Google’s core algorithm, joining the 200-plus other unique signals, or clues, that Google relies on to rank websites on SERPs.

Penguin 4.0’s two major changes:

  1. Real-time Changes

    Previously, sites blacklisted by Penguin remained penalized until Google periodically resynchronized its Penguin data with the rest of the algorithm, even if the site owners had corrected the issues and the sites had been recrawled and reindexed.

    With this update, as soon as the robots crawl a page, they will see whether it is using link schemes or keyword stuffing, or whether it offers good quality content and a good user experience, and ranking results will be updated in real time.

    This is also why Google no longer feels the need to announce Penguin updates.

    How often do Google's robot spiders crawl web content to update SERPs?

    According to the Google Webmaster Central blog, crawling is controlled by an algorithm, so it is computers making the decisions, not humans. The algorithm that sends robots to websites weighs many variables, such as the number of visitors, PageRank, links to a page, and crawling constraints like the number of parameters in a URL.

  2. More Granular

    With this update, the robots will now evaluate the content of individual webpages, not just whole websites. If they find an issue with a certain page, the SERPs will adjust ranks for the offending pages only, not the entire site.

    For example, take webpages A and B within the same site. If page A has good quality content and page B has keyword stuffing, then when the robots crawl the website, page A's rankings won't be affected, but page B will be blacklisted. If corrective measures are taken for page B, the next time the robots crawl the website, page B will be removed from the blacklist (a toy sketch of this page-level idea follows below).
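
To make the page-level idea concrete, here is a minimal Python sketch. It is purely illustrative: Google's actual scoring is not public, and the page signals, threshold, and looks_like_webspam helper below are hypothetical, invented only to show penalties being applied per page rather than site-wide.

```python
# Illustrative sketch only: Google's real scoring is not public.
# The pages, signals, and threshold below are hypothetical.

PAGES = {
    "example.com/page-a": {"keyword_density": 0.02, "paid_links": 0},
    "example.com/page-b": {"keyword_density": 0.25, "paid_links": 12},
}

def looks_like_webspam(signals, density_limit=0.10):
    """Toy heuristic: flag a page with extreme keyword density or paid links."""
    return signals["keyword_density"] > density_limit or signals["paid_links"] > 0

def rank_adjustments(pages):
    """Penalize only the offending pages, Penguin 4.0 style, not the whole site."""
    return {url: ("demoted" if looks_like_webspam(sig) else "unchanged")
            for url, sig in pages.items()}

print(rank_adjustments(PAGES))
# {'example.com/page-a': 'unchanged', 'example.com/page-b': 'demoted'}
```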

How Does This Update Affect Your SEO Practices?

This update shouldn’t really change your SEO practices as such. However, if through some error one of your web pages is deemed to be using black hat practices, it won’t necessarily drag down your whole website. Further, once you correct those mistakes, you no longer have to wait for the next Penguin synchronization to have the penalty lifted. Instead, as soon as you make the changes and the robot spiders crawl your page, the correction will be reflected immediately.

The best practices to follow here would be:
  • Regularly audit your links by checking for broken links (a simple check is sketched after this list)
  • Perform clean-up where necessary
  • Create quality website content, focusing on brand building and great user experience
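
For the first bullet, a broken-link audit can be as simple as fetching a page, collecting its anchors, and checking each target's HTTP status. Below is a minimal sketch using only Python's standard library; the start URL is a hypothetical placeholder, and a dedicated crawler or SEO tool would handle redirects, robots.txt, and JavaScript-generated links far more thoroughly.

```python
# Minimal broken-link audit sketch (hypothetical start URL, standard library only).
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

START_URL = "https://www.example.com/"  # hypothetical page to audit

class LinkCollector(HTMLParser):
    """Collects href values from <a> tags on a single page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def fetch(url):
    req = Request(url, headers={"User-Agent": "link-audit-sketch"})
    return urlopen(req, timeout=10)

# Download the page and extract its links.
parser = LinkCollector()
parser.feed(fetch(START_URL).read().decode("utf-8", errors="replace"))

# Check each HTTP(S) link and report ones that look broken.
for href in parser.links:
    url = urljoin(START_URL, href)
    if not url.startswith(("http://", "https://")):
        continue
    try:
        status = fetch(url).status
    except HTTPError as err:
        status = err.code
    except URLError:
        status = "unreachable"
    if status == "unreachable" or (isinstance(status, int) and status >= 400):
        print(f"Broken: {url} ({status})")
```
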
To learn more about building a brand, click here to read our blog.

Need help in creating quality web content? Contact UpOnline to learn more about our Custom Website Design services.