I will focus on the most crucial points (in my opinion):
- The nature of Google Algorithm Updates.
- How the Google Webspam team reviews work.
- Most likely causes for Google Penalties.
- Google Penalty removal approaches.
Understand the Source of Google Penalties, Google Algorithm Updates and Manual Penalties.
A Google Penalty is the negative impact on your website’s performance in Google Search Engine Results Pages (SERPs). The cause might be a Google Algorithm Update, a Manual Penalty, or both. According to Matt Cutts (leader of the Webspam team at Google), Google applies its best judgement to return the most accurate results for its users. To that end, regular monthly updates of the algorithm are rolled out, and manual reviews of suspicious websites and pages are conducted. At the end of the day, you have two main concerns to be aware of:
- Does your website correspond adequately to recent Google Algorithm Updates?
- Is your content considered deceptive or manipulative?
To reasonably delimit my article, I am going to discuss the Google Manual Penalty issues in two parts. Firstly, I will talk about content, and secondly, deceptive link building techniques. The theme of Google Algorithm Updates will only be slightly touched upon.
Google Algorithm Updates might harm your rankings if your website violates the Google Quality Guidelines. Two updates, out of many, are especially important to know: Panda (launched February 2011), which targets improper, low-quality and deceptive on-site content, and Penguin (launched April 2012), which looks for websites that use Black Hat SEO techniques such as spammy link building, duplicate content, keyword stuffing and cloaking. If you want to learn more about the real impact of these two updates, I recommend the article “Difference between Google Panda and Google Penguin” in SEO Updates.
A Manual Penalty is applied when a real person from the Google Webspam team reviews your content and discovers deceitful or spammy techniques used to achieve higher rankings.
Both Google Algorithm Updates and Manual Penalties can lead to the demotion or removal of webpages or entire websites. That’s why it is important to stay as far away as you can from bad practices and to know how to fix the possible consequences. If you suspect an unnatural drop in traffic, the first thing to do is check whether your website or individual webpages have been removed from Google SERPs, or merely deranked. You can do this by searching Google for your brand name, or for your domain with the site: operator (site:domain.com). If there are no results, then you’re probably hit by something.
It is also important to look at the number of indexed pages Google shows after each search – the small grey number appearing below the search box. If there is a huge difference between the actual number of pages on your website and the number Google reports, you can be fairly sure the missing pages were hit. It is also useful to look at the keywords that show a decline in traffic in your Google Analytics reports. If they are no longer in their usual leading positions, re-examine the landing pages and the links pointing to them.
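To know the actual number of pages on your website, you can count the entries in your own sitemap.xml and compare that figure to the indexed count Google reports. A minimal sketch (the sample sitemap below is hypothetical; real sitemaps follow the same sitemaps.org format):

```python
import xml.etree.ElementTree as ET

def count_sitemap_urls(sitemap_xml: str) -> int:
    """Count the <url> entries in a standard sitemap.xml document."""
    # Sitemaps declare the sitemaps.org namespace on every element,
    # so findall() needs the namespace mapping to match anything.
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    root = ET.fromstring(sitemap_xml)
    return len(root.findall("sm:url", ns))

# Hypothetical three-page sitemap for illustration.
sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/about</loc></url>
  <url><loc>https://example.com/blog</loc></url>
</urlset>"""

print(count_sitemap_urls(sample))  # 3
```

If the sitemap says 300 pages but the site: search shows only a few dozen, the missing pages are likely the ones that were hit.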
How to Check if Your Website is Penalized by Google
If you experience a rapid decrease in traffic, you should be suspicious. It’s as simple as this: if you examine Google Analytics data regularly and one day you notice a drastic downturn in terms of Organic Traffic, you should investigate further.
There might be two reasons for the decrease: a recent Google Algorithm Update or a Manual Penalty. But in order to detect the real cause, you have to understand all the possible reasons before taking any action.
How to proceed?
Open Google Webmaster Tools and try to identify the problems by looking for notifications from the Webspam team:
- Go to Site Messages (Google should have sent you a message if it has penalized your website for some reason)
- Go to Search Traffic > Manual Actions (Google should have sent you a message if you have violated some of its Spam Guidelines)
- Go to Index Status (you will be notified if Googlebot has problems indexing some of your webpages)
- Go to Crawl Errors (you will be notified if there are problems on your website stopping Googlebot from crawling the content)
Information on whether your website has crawling problems, update issues or manual webspam actions should be available in at least one of the sections.
And if you want a deindexed, penalised or sandboxed* check-up, I recommend the Pixelgroove SERP Tool.
*Sandboxing is not proven to exist but if it does, it applies only to new domains.
If it's Not a Google Algorithm Update Does that Mean it is a Manual Penalty?
Google takes action to reduce spam every day by removing content from its search results, most commonly to deal with legal, security and safety issues. Below is a comprehensive checklist that will help you determine why you might have been hit by a Manual Penalty and what to do to recover from it.
Thin Content Penalty
To be completely frank, this is something I observe all over the web. Thin content provides essentially no added value for the user. If you have it on your website, you should get rid of it.
So, let’s look at the most common types of thin content according to Google:
Doorway pages, also known as bridges, are poor-quality pages that try to rank for a certain keyword or phrase and then funnel the user to other pages (known as hubs), or even other websites, via a simple call-to-action or redirect. Webmasters normally deploy such pages on different domains that are cross-linked in some way. To avoid duplicate content, the target keywords are padded with meaningless filler words, geo-locators or synonyms to mislead search engines. Be careful not to host such spamdexing pages, which strive to rank high, grab the searcher and redirect him shortly afterwards.
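One common doorway mechanism is an instant meta-refresh redirect. As a rough hygiene check (a sketch only; it assumes the redirect is done via a meta tag rather than JavaScript or HTTP headers), you can scan your pages for near-zero refresh delays:

```python
import re

def has_instant_meta_refresh(html: str, max_delay: int = 2) -> bool:
    """Flag pages that bounce the visitor almost immediately via
    <meta http-equiv="refresh">, a pattern common on doorway pages."""
    # Matches the refresh delay (the leading number in the content attribute).
    pattern = re.compile(
        r'<meta[^>]+http-equiv=["\']?refresh["\']?[^>]+content=["\']?(\d+)',
        re.IGNORECASE,
    )
    match = pattern.search(html)
    return bool(match) and int(match.group(1)) <= max_delay

doorway = '<meta http-equiv="refresh" content="0; url=https://other-site.example/">'
print(has_instant_meta_refresh(doorway))                 # True
print(has_instant_meta_refresh("<p>Real content</p>"))   # False
```

A page that redirects within a second or two of loading is exactly the "grab the searcher and redirect" behaviour described above.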
Thin Affiliates are websites you refer to from your own webpages. Searchers are supposed to visit those pages and buy products, and you benefit as an intermediary. If you want to link to a website where people can buy something, enrich the content around the link, and be careful not to copy the merchant’s entire proposition, or you will end up with duplicate, dull offers.
Thin syndication means content taken directly from article banks or their RSS feeds and pasted onto your website. This is considered duplicate content with no added value for users. Do not copy-paste long passages from Wikipedia either; this can also be considered thin syndication. Be unique and create your own content.
Scraped content is content stolen from more reputable sites to inflate the number of pages. It is essential that your content is original and provides value to users. Copyscape and Siteliner will help you check your site for duplicate content around the web. Content with less than roughly 70% uniqueness risks being marked as duplicate by search engines.
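Tools like Copyscape estimate uniqueness by comparing overlapping word sequences ("shingles") between two texts. A minimal sketch of that idea, using Jaccard similarity over 3-word shingles (the sample sentences are hypothetical):

```python
def shingles(text: str, n: int = 3) -> set:
    """Break text into overlapping n-word 'shingles' for comparison."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(a: str, b: str) -> float:
    """Jaccard similarity of the two texts' shingle sets (0.0 to 1.0)."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

original = "soft cotton baby clothes in sizes newborn to twelve months"
scraped  = "soft cotton baby clothes in sizes newborn to twelve months on sale"
print(round(similarity(original, scraped), 2))  # 0.8 — near-duplicate
```

A score near 1.0 means the second text is effectively a copy; adding a few words to a scraped passage, as the example shows, barely lowers the overlap.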
Cloaking is defined as a deceptive technique by all search engines. It consists of serving one page to the user while showing a different page to the crawling bots. Let’s say a user searches for “baby clothes”. The search engine provides him with a list of results and he clicks on one of the first ones. Unfortunately, he ends up on a porn website. How is this possible? The answer is cloaking.
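You can spot-check a page for cloaking yourself by fetching it twice, once with a browser User-Agent and once with a Googlebot one, and comparing what comes back. A rough sketch (the word-overlap threshold is an arbitrary assumption, not a Google metric):

```python
from urllib.request import Request, urlopen

BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"
BOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

def fetch_as(url: str, user_agent: str) -> str:
    """Fetch a URL while presenting the given User-Agent string."""
    req = Request(url, headers={"User-Agent": user_agent})
    with urlopen(req, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")

def looks_cloaked(browser_html: str, bot_html: str, threshold: float = 0.5) -> bool:
    """Crude cloaking signal: the two versions share few words."""
    a = set(browser_html.lower().split())
    b = set(bot_html.lower().split())
    if not a or not b:
        return True
    overlap = len(a & b) / min(len(a), len(b))
    return overlap < threshold

# Usage (requires network): compare what users and crawlers receive.
# cloaked = looks_cloaked(fetch_as(url, BROWSER_UA), fetch_as(url, BOT_UA))
```

Sites may legitimately vary pages per device or region, so treat a mismatch as a prompt for manual inspection rather than proof of cloaking.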