SEO sanity check part 1: Google’s Penguin and Panda updates

SEO has always been a tricky business: not only do experts have to spend time researching keywords and following best practices, they also have to be prepared for the changes that search engines inevitably put into place.

Last year, search giant Google made two major algorithm updates, Panda and Penguin, that sent many a site plummeting down the rankings as it was penalized under the new rules.

This was because the changes were implemented to push poor quality sites, such as content mills and link farms, down the rankings and to give more weight to sites that produce quality content.

This is achieved by changing how Google’s spiders assess a site, giving better rankings to sites with quality, well written content and social media engagement. For web professionals, this created something of a panic, as static sites that were not particularly well written and were stuffed with keywords began to fail.

Penguin and Panda updates relied on a new set of rules and more complex algorithms designed to rank a site on a number of different factors.

These include:

  • Content: Google’s spiders can now tell if a site is badly written, with spelling and grammatical errors, lots of ads and poor quality links. This change is seen as a welcome one by many SEO and digital professionals, as it immediately knocked poor quality article syndicates and content mills down the rankings, so that high quality content could take their place and be more useful to searchers.
  • Freshness: the “freshness” of copy has become more important to Google than inbound links. This means that in order to compete on Google, it’s necessary to add new content often. The freshness ranking looks at three key areas: trending topics such as the Olympics or the US election; recurring events such as the Super Bowl; and how recently the content was added.
  • Unique content: ever copied and pasted some content into a website to cut corners? Now it will also cut the site’s ranking. Original content is one of the most important factors in determining position. Content containing unnatural links will also be penalized, so it’s important to ensure that links appear organically and are highly relevant to the content. This will only become more important as Google’s Author Rank takes off.
  • Social: as many of you will know, social is the new marketing and is a very powerful tool for SEO. Google now uses social signals in search results to determine just how useful a site is across the board. It’s important now for online marketers and SEO experts to include social, ensuring that brand colors and logos are uniform across social channels and websites. It’s also important that the social presence is well managed; a badly run or bot-managed social presence will harm a site’s rankings.
  • Free from technical errors: this in particular is important for web professionals, and will no doubt knock a lot of blogging sites off the top perch. A site with sound architecture will perform better than one built off templates or Flash, or that is more than two years old. This means that code should be standards-based, with valid CSS and tidy meta data (a minimal sketch of this kind of audit follows this list).
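
To make the technical point a little more concrete, here is a minimal sketch, in Python and using only the standard library, of the sort of basic on-page audit a web professional might script: fetch a page and flag a missing title or meta description. The URL is a placeholder and the checks are illustrative assumptions, not Google’s actual criteria.

```python
# Hypothetical on-page audit sketch; the URL and checks are illustrative only.
from html.parser import HTMLParser
from urllib.request import urlopen


class MetaAudit(HTMLParser):
    """Collects the <title> text and any <meta name="description"> content."""

    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and (attrs.get("name") or "").lower() == "description":
            self.description = attrs.get("content") or ""

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data


if __name__ == "__main__":
    url = "https://example.com/"  # placeholder; swap in the site being audited
    html = urlopen(url).read().decode("utf-8", errors="replace")
    audit = MetaAudit()
    audit.feed(html)
    if not audit.title.strip():
        print("Missing <title> tag")
    if not audit.description.strip():
        print("Missing meta description")
```

A check like this only scratches the surface, of course; full validation of markup and CSS is better left to dedicated validators.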

How to address problems with a site’s ranking

Even some of the biggest sites were affected by the changes to Google’s algorithms; I read of one that had to be stripped right back in order to change all of its keywords and duplicate pages.

A site that is poorly written should have all of its content refreshed, preferably by someone who can write. This includes blog posts and articles, so if a site has a lot of content like this, it may be better to strip it all out and add it back as new or rewritten content becomes available.

Meta data also has to be clean and tidy, and here Google tends to ignore keywords and concentrate on descriptions. Keywords, of course, still have their place, and it’s important to ensure that they are well researched and analyzed, but articles and blogs with a high keyword density are likely to be penalized. This is because keywords, when overused, tend to compromise the quality of the writing.
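
As a rough illustration of what ‘high keyword density’ looks like in practice, here is a minimal Python sketch that measures how often a single-word keyword appears relative to the total word count. The sample copy and the 2.5% threshold are illustrative assumptions, not figures Google publishes.

```python
import re


def keyword_density(text: str, keyword: str) -> float:
    """Return the keyword's share of the total word count (0.0 to 1.0)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return hits / len(words)


article = "SEO tips: our SEO guide explains SEO basics for SEO beginners."  # sample copy
density = keyword_density(article, "seo")
print(f"Keyword density: {density:.1%}")
if density > 0.025:  # arbitrary illustrative threshold, not a Google figure
    print("Copy may read as keyword-stuffed; consider rewriting it naturally.")
```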

Penguin concentrated on getting rid of those sites that attempted to “trick” its algorithms with the overuse of keywords and link spamming. If you’ve determined that a site has spam links pointing at it, you can use Google’s Disavow Tool, which asks Google to ignore those links when assessing the site. However, it’s important to note that a careful site audit should be carried out first to identify the bad links, and the tool should be used with caution.
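
Google’s disavow file is plain text: one URL or domain: entry per line, with lines beginning with # treated as comments. As a minimal sketch, assuming the links have already been audited by hand (the domains below are placeholders), the file could be generated like this:

```python
# Minimal sketch: build a disavow file from links already audited and
# judged to be spam. The domains below are placeholders.
from urllib.parse import urlparse

bad_links = [
    "http://spammy-directory.example/page1",
    "http://link-farm.example/outbound",
]

lines = ["# Links disavowed after manual audit"]
for link in bad_links:
    lines.append(f"domain:{urlparse(link).netloc}")  # disavow the whole domain

with open("disavow.txt", "w") as f:
    f.write("\n".join(lines) + "\n")
```

The resulting disavow.txt then has to be uploaded through the Disavow Tool itself; it isn’t picked up automatically.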

For Panda, it’s worth checking that a site’s content is unique; it needs to be at least 60% unique site-wide, as well as accessible, in order to pass Panda’s rules.
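
That 60% figure isn’t something a tool will simply report, but a rough way to sanity-check how much two pages overlap is to compare their word ‘shingles’. Here is a minimal Python sketch using Jaccard similarity; the sample copy is made up, and any threshold you apply to the score is a judgment call rather than a Google number.

```python
import re


def shingles(text: str, size: int = 3) -> set:
    """Break text into overlapping runs of `size` consecutive words."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    return {" ".join(words[i:i + size]) for i in range(len(words) - size + 1)}


def similarity(a: str, b: str) -> float:
    """Jaccard similarity of two pages' shingle sets (0.0 = distinct, 1.0 = identical)."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)


page_a = "Our widgets are hand made in small batches and shipped worldwide."
page_b = "Our widgets are hand made in small batches and delivered worldwide."
print(f"Similarity: {similarity(page_a, page_b):.0%}")  # a high score flags near-duplicate copy
```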

Both algorithms are still updated regularly in order to refine them: for the most part, Penguin concentrates on keyword stuffing within articles and spam links, while Panda concentrates on the quality and uniqueness of the content itself.

Essentially, they are both concerned with accessibility, content, spamming techniques and new rules that are designed to prevent black hat SEO.

What is black hat SEO?

Basically, black hat SEO is a way of attempting to manipulate the search engines, essentially ‘tricking’ them into thinking a site is more valuable than it is. Black hat uses aggressive tactics and is geared towards the search engine, rather than a human audience.

Over the coming articles, I will take a look at black, white and grey hat techniques in order to give a clear overview of which can be used safely and which are a no-no. The problem that many have found is that some less than reputable SEO ‘experts’ have employed black hat techniques in order to win more customers and make a quick buck. This is why some business sites have dropped like a stone down the rankings, with their owners often unaware that they have done anything wrong.

Black hat techniques include:

  • packing code with ‘hidden’ text;
  • link farms where a group of sites all link to each other to spam the index of a search engine;
  • blog spam, using the comments field on blogs and forums to place links to other sites;
  • scraping, a practice where one site takes content from another in order to appear more valuable to search engines;
  • doorway pages used with the intention of enticing searchers with phrases not related to site content;
  • parasitic hosting, where a site is hosted on someone else’s server without permission;
  • cloaking, a technique in which the search engine spider sees different content to the end user who views through a browser.
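
If you suspect a predecessor left cloaking in place, one rough check is to request the same page with a normal browser user agent and with a crawler user agent and compare what comes back. Below is a minimal Python sketch using only the standard library; the URL is a placeholder and the length comparison is a deliberately crude first pass, not a definitive test.

```python
# Hypothetical cloaking check: compare responses served to different user agents.
from urllib.request import Request, urlopen


def fetch(url: str, user_agent: str) -> str:
    """Fetch a page while presenting the given User-Agent header."""
    req = Request(url, headers={"User-Agent": user_agent})
    return urlopen(req).read().decode("utf-8", errors="replace")


url = "https://example.com/"  # placeholder; swap in the site being checked
as_browser = fetch(url, "Mozilla/5.0 (Windows NT 10.0; Win64; x64)")
as_crawler = fetch(url, "Googlebot/2.1 (+http://www.google.com/bot.html)")

# Crude heuristic: wildly different response sizes can hint at cloaking, though
# dynamic content means any difference should prompt a manual comparison, not a verdict.
if abs(len(as_browser) - len(as_crawler)) > 0.2 * max(len(as_browser), len(as_crawler)):
    print("Responses differ significantly; inspect both versions by hand.")
else:
    print("No obvious difference between the two responses.")
```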

Black hat methods are seen by many web professionals as unethical, as they use tactics that promise swift returns but run the risk of damaging a company’s reputation, website and, in turn, profits.

Utilizing black hat methods often means that a site doesn’t have to wait months for backlinks, as it would with traditional white hat methods. However, it also fills the internet with useless information and spam, and over the years has seriously degraded the quality of search.

It’s also cheaper for the SEO strategist to carry out, as often a blog network will already be set up to link to, and it doesn’t depend as heavily on analytics and content as white hat practices do.

Not only can employing black hat methods lead to the threat of legal action; if they are used alongside a PPC campaign, heavy penalties can also be incurred from the advertising host.

It’s not recommended that a site use black hat techniques due to the penalties involved, in terms of legal action, reputation and the threat of not ranking at all. However, that no doubt won’t deter everyone, despite the Google updates.

That said, we’re already seeing content mills drop rapidly down the rankings, so the updates are clearly working in one of the key areas Google wanted to address.

Google and all of the major search engines share a vision: to clean up the web and do away with bad practices, so that more useful content appears at the top of search for us all. Whether you use black hat techniques or not is between you and your conscience, but I for one am glad to be able to search without wading through a page full of junk before I get to what I want.

What problems have you run into as a result of Panda and Penguin? How have you dealt with black hat techniques employed by your predecessors? Let us know in the comments.

Featured image/thumbnail, search image via Shutterstock.

Kerry Butters

A prolific technology writer, Kerry is an authority in her field and produces content for a variety of high profile sites in her niche. Also a published author, Kerry is co-founder of digital content agency markITwrite, adores the written word and all things tech and internet related.
