Losing traffic to your website is never something you want to happen, but it can and will happen from time to time. Small losses shouldn’t be a major concern. When the loss is drastic, however, that’s when you should take a closer look.
In most cases, a giant drop in site traffic comes down to one of two things: either your website received a manual action from Google, or your site took a hit from one of Google’s search algorithms.
These site punishments may sound similar, but they’re very different. While manual actions and algorithmic losses in traffic are often lumped into the same “I got a penalty” category, it’s important to understand what makes them different and why they happened in order to get your site back on track.
Manual Actions
Google isn’t shy about letting users know that their algorithms do most of the heavy lifting when it comes to monitoring website quality. But more often than you may think, Google’s webspam team will take the lead and place a manual action on a website that doesn’t adhere to their webmaster guidelines.
While the list of reasons for receiving a manual action isn’t necessarily extensive, some causes come up far more often than others. Below are three of the more common ones.
Unnatural Links to Your Site
For webmasters who are cognizant of where links to their website are coming from, unnatural links should be easy to avoid. Unfortunately, there are no absolutes in SEO, and the reality of the situation is that it’s difficult to know where every link to your website is located on the Internet.
If you do receive a manual action notification from Google and “Unnatural Links to Your Site” is noted as a reason for the action, don’t panic. You simply need to complete the steps defined by the search engine.
To have a manual action from unnatural links removed, Google requests websites do the following:
- Download a full list of links to your website. The only way to know which links are the root of the problem is to know where every link back to your website can be found; Google Search Console lets you export this list, and third-party backlink tools can fill in the gaps.
- Begin looking for sites that link back to you often. If the same website keeps coming up in your list and it abides by Google’s webmaster guidelines, the chances of it being the culprit are low, but always double-check. Once the “good” websites are out of the way, it should be easier to spot the “bad” ones.
- Request the removal of the links. Once you’ve made your way through the list and located the links causing trouble, you’ll need to contact the webmaster of each site and request that your link be removed. While some webmasters will quickly remove the link or add a “rel=nofollow” attribute without issue, others will resist. At that point, you’ll need to disavow those links with Google.
- Disavow links. As a last resort, Google’s disavow links tool lets you tell Google exactly which links you couldn’t get removed and that the search engine should ignore them. (A rough scripted sketch of this workflow follows this list.)
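If you’re working from a large export, a small script can take some of the grunt work out of steps one, two, and four. The sketch below is a minimal, hypothetical Python example: the CSV layout (a single linking_url column) is an assumption you’d adjust to match your actual export from Search Console or a backlink tool, and deciding which domains belong in the disavow file is still a manual judgment call. The domain: lines and # comments do follow Google’s documented disavow file format.

```python
# Rough first pass at the steps above: tally linking domains from an exported
# backlink list, then write a disavow file for the domains you've already
# decided are bad. The CSV layout (a single "linking_url" column) is an
# assumption -- adjust it to match whatever your export actually contains.
import csv
from collections import Counter
from urllib.parse import urlparse

def count_linking_domains(csv_path, url_column="linking_url"):
    """Count how many links come from each domain in the export."""
    counts = Counter()
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            domain = urlparse(row[url_column].strip()).netloc.lower()
            if domain:
                counts[domain] += 1
    return counts

def write_disavow_file(bad_domains, out_path="disavow.txt"):
    """Write domains that refused removal requests in Google's disavow format."""
    with open(out_path, "w", encoding="utf-8") as f:
        f.write("# Domains that would not remove links after repeated requests\n")
        for domain in sorted(bad_domains):
            f.write(f"domain:{domain}\n")

if __name__ == "__main__":
    counts = count_linking_domains("links_export.csv")
    for domain, n in counts.most_common(20):
        print(f"{n:>5}  {domain}")
    # After reviewing the list by hand, disavow the domains that won't cooperate.
    write_disavow_file({"spammy-directory.example", "paid-links.example"})
```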
After you’ve completed the necessary requests for link removal (or had “rel=nofollow” attributes added to links), added remaining links to the disavow tool, and sent a request for reconsideration to Google, you’ll have to wait. In time, Google could either come back with additional information saying more unnatural or artificial links are still out there, or the manual action could be lifted. Keep in mind that coming back from a manual action brought on by unnatural links is possible, and many websites have done it.
Pure Spam
A pure spam manual action can be triggered by a number of problems. Like other manual actions, which Google can hand down against your entire website or as a partial match covering only certain sections or specific URLs, a pure spam action can affect a single page, a handful of pages, or the site as a whole.
Below are just a few of the reasons why Google could deem your website to contain pure spam.
- Cloaking: By definition, cloaking is “the practice of presenting different content or URLs to human users and search engines.” While a regular user could encounter a relatively normal-looking page, the version served to search engines could be stuffed with keywords and key phrases, giving it a better chance to rank for relevant search queries. (A quick way to self-check for this is sketched just after this list.)
- Doorway pages: If you create a page whose sole purpose is to rank for a specific search query, yet the page holds little relevance to that query, you’ve created a doorway page. Google treats these pages as grounds for a manual action and has also developed an algorithm to try to quell their use.
- Hidden text and/or keyword stuffing: Hidden text and keyword stuffing are old techniques that Google is quick to take action against. Text or links hidden by blending them into the background are easy to spot with a simple CTRL+A. Keyword stuffing, like other dated SEO tactics, can be caught by either a manual action or an algorithm, leaving websites that adopt the practice little chance of getting away with it.
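Because cloaking can also creep in by accident (a misconfigured plugin or CDN rule serving crawlers something different), a quick self-check is to fetch the same URL with a Googlebot-style User-Agent and a normal browser User-Agent and compare the responses. The Python sketch below is a rough, hypothetical check along those lines; the example.com URL is a placeholder, differences can have legitimate causes such as personalization or A/B tests, and Google’s own fetch tools remain the authoritative way to see what its crawler sees.

```python
# Quick self-check for accidental cloaking: request the same URL with a
# Googlebot-style User-Agent and a regular browser User-Agent, then compare
# what comes back. Differences are not proof of cloaking, but a large gap is
# worth inspecting by hand. Requires the third-party "requests" package.
import requests

GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"

def fetch(url, user_agent):
    resp = requests.get(url, headers={"User-Agent": user_agent}, timeout=10)
    return resp.status_code, resp.text

def compare_responses(url):
    bot_status, bot_body = fetch(url, GOOGLEBOT_UA)
    user_status, user_body = fetch(url, BROWSER_UA)
    print(f"Googlebot UA: HTTP {bot_status}, {len(bot_body)} characters")
    print(f"Browser UA:   HTTP {user_status}, {len(user_body)} characters")
    if bot_status != user_status or abs(len(bot_body) - len(user_body)) > 2000:
        print("Responses differ noticeably; inspect both versions by hand.")

if __name__ == "__main__":
    compare_responses("https://www.example.com/")
```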
If you do receive a manual action due to pure spam, you must fix the pages or sections deemed spam and submit a request for reconsideration. As with any other manual action, removal isn’t guaranteed: no matter how thoroughly you believe you’ve cleaned things up, problems may persist in links or on less-visited pages.
Thin Content with Little or No Added Value
Creating a website that will hopefully rank well in the future takes time and effort. The site needs to work first and foremost, but more importantly, it needs to provide its visitors with valuable information that they may be seeking. In the eyes of Google, an easy way to receive a manual action is by having thin content that has little to no value. What is “thin content” exactly? It could be a number of content types.
Google’s listed “common examples” include:
- Automatically generated content. To avoid writing content specific to a page (or to the site as a whole), someone may write a script that fills a page with gibberish and sprinkles in specific keywords. This method has numerous downsides, but the biggest is how obvious it is: when you land on a page with content like this, you’ll know instantly.
- Thin affiliate pages. Pages on affiliate websites, which tie into a larger network offering similar products, often contain the same product descriptions or reviews as every other site in the network. When the content is identical, no value is added to the product being described or to the site offering it. Affiliate pages should go deeper on what’s being described and stay relevant, giving Googlebot a reason to crawl and rank them.
- Scraped content. A website that uses scraped content—that is, taking content others have written and adding it to a website verbatim—is susceptible to manual action. Some websites do this in the hopes that pages with content that has done well will provide them with the same SEO benefit other websites have seen.
- Doorway pages. In some cases, doorway pages are created only to grab the attention of searchers, with little to no relevant content available once you click through from the search results. These are pages Google is working hard to rid its results of in order to serve searchers the best content available.
Removing the manual action brought on by thin content requires running an audit of your website’s content on ALL pages. Once the pages on your website have been fixed or unnecessary pages have been removed, a reconsideration request should be submitted.
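If you’re unsure where to begin that audit, a short script can produce a first list of pages worth a human look. The sketch below is a hypothetical Python example: it assumes your site exposes a standard sitemap.xml, strips each page down to its visible text, and flags pages that fall under an arbitrary 250-word threshold or exactly duplicate another page. It can’t judge whether a page adds value; it only narrows down where to look first.

```python
# Minimal first-pass content audit: fetch each URL from a simple sitemap,
# strip the HTML, count words, and flag pages that look thin or duplicate
# each other. Thresholds are assumptions, not Google's criteria.
# Requires the third-party "requests" and "beautifulsoup4" packages.
import hashlib
import requests
from bs4 import BeautifulSoup
from xml.etree import ElementTree

def sitemap_urls(sitemap_url):
    """Pull <loc> entries out of a simple (non-index) sitemap."""
    xml = requests.get(sitemap_url, timeout=10).text
    root = ElementTree.fromstring(xml)
    return [loc.text.strip() for loc in root.iter() if loc.tag.endswith("loc")]

def page_text(url):
    """Return the visible text of a page with scripts and chrome removed."""
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    for tag in soup(["script", "style", "nav", "footer"]):
        tag.decompose()
    return " ".join(soup.get_text(separator=" ").split())

def audit(sitemap_url, min_words=250):
    seen_hashes = {}
    for url in sitemap_urls(sitemap_url):
        text = page_text(url)
        words = len(text.split())
        digest = hashlib.sha1(text.encode("utf-8")).hexdigest()
        if words < min_words:
            print(f"THIN ({words} words): {url}")
        if digest in seen_hashes:
            print(f"DUPLICATE of {seen_hashes[digest]}: {url}")
        seen_hashes.setdefault(digest, url)

if __name__ == "__main__":
    audit("https://www.example.com/sitemap.xml")
```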
Algorithm Changes, Not “Penalties”
Before going any further, we need to make one point: traffic loss caused by an algorithm update or refresh isn’t technically a “penalty.” You may see that word tossed around online whenever an algorithm rolls out, with people saying they got hit with an “algorithm penalty,” but that’s not what it is.
When your website gets caught by one of Google’s many algorithms, the traffic loss you see could easily happen to another reputable website as well. This loss of traffic is due to your website not adhering to the criteria set by each algorithm.
Unlike a manual action, you cannot send a reconsideration request when you’re caught by an algorithm. The only way to recover from an algorithm release, update, or refresh is to go through your website and find what could be causing problems. From there, you’ll need to wait for further algorithm updates to potentially see a return to normal organic traffic.
To avoid getting caught in the crosshairs of one of the more well-known algorithms, you’ll need to know the reason why these algorithms exist.
Panda (4.0, 4.2)
Panda is Google’s algorithm dedicated to providing searchers with the best and most relevant content in search queries. Panda’s job is to weed out thin content, content found on content farms, top-heavy websites, and “a number of other quality issues”, according to Moz’s algorithm change history. Panda was originally released back in February 2011 and has since seen 28 updates and refreshes.
One of the higher-profile sites to get hit came when Panda 4.0 was released back in May 2014: eBay took a major hit. eBay relies on promoting products across multiple categories and creating multiple pages for the same product, which often turn out to be doorway pages. The problem was that many of the keywords eBay ranked highly for led searchers to pages with thin content, and this happened across a number of different category pages.
Google does their best to provide searchers with the best content available, and to do that, the content needs to be worth seeing. Content that’s difficult to read (thin, poor grammar, keyword stuffed) or irrelevant to an industry could take the same kind of hit that eBay did, only it might not be recognized on such a large scale.
Panda can be easy to avoid, but as usual, nothing is for certain. If the content you create serves a purpose, answers potential questions thoroughly and thoughtfully, and is easy to understand for both users and search engines, the chances of getting hit by future Panda updates should be low.
Penguin
Google’s algorithm dedicated to ridding search results of websites that primarily gained authority through unnatural backlinking is known as Penguin. Some people try to make the Penguin algorithm out to be a link detector, but the original blog post about Penguin from Google says otherwise.
Unnatural linking, while a major component of Penguin, is only one piece of the puzzle. Penguin was created to promote websites that have consistently used “white-hat SEO techniques” and abide by quality guidelines, as Matt Cutts, Google’s former head of webspam, said in the post.
Keep in mind, though, that Penguin focuses heavily on getting web spam out of search results, and unnatural linking is a big part of what gets a website deemed spam. If your website has built authority through link schemes, or you’ve placed your URL on untrusted domains, and you haven’t yet been hit with a manual action notice or an algorithm update, your chances of continued success will keep diminishing with each new update and refresh.
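One rough way to sanity-check your own link profile before an update catches it is to look at anchor-text distribution, since profiles dominated by exact-match commercial anchors rather than branded or generic ones tend to draw scrutiny. The Python sketch below is hypothetical: it assumes a backlink export with an anchor_text column (as some third-party tools provide), and the output is only a distribution to eyeball, since Google publishes no threshold.

```python
# Self-audit sketch: show how often each anchor text repeats across your
# backlinks. The "anchor_text" column name is an assumption about the export.
import csv
from collections import Counter

def anchor_distribution(csv_path, anchor_column="anchor_text"):
    counts = Counter()
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            anchor = row.get(anchor_column, "").strip().lower()
            if anchor:
                counts[anchor] += 1
    total = sum(counts.values())
    if not total:
        print("No anchor text found in export.")
        return
    # Print the top anchors as a share of all links; eyeball for over-optimization.
    for anchor, n in counts.most_common(15):
        print(f"{n / total:6.1%}  {anchor}")

if __name__ == "__main__":
    anchor_distribution("backlinks_export.csv")
```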
Mobile Update
Google’s mobile-friendly update, released on April 22, 2015, was the first of its kind, not because it dealt solely with mobile-friendliness (even though that was a first), but because Google announced the release date in advance. With the early announcement, websites that weren’t mobile-friendly had the opportunity to update and avoid getting caught in the rollout.
With the announcement came a digital marketing frenzy, which many in the industry dubbed “mobilegeddon.” People began to think that if you didn’t update your website before the launch date, you’d be dropped from mobile search results. But upon its release, the mobile-friendly update didn’t hit as hard as everyone expected.
It’s never been explicitly stated anywhere, but the soft landing of the mobile update after all the “end of days” talk leaves you to wonder: was Google making the early announcement to prepare people for a major fallout, or simply to get people to realize it’s time to have a mobile-friendly website? Given the slow rollout, which continues today, it’s easy to assume the latter. Google has the clout to make that kind of push, and previous algorithm changes have hit hard enough to make webmasters worry.
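If you want a quick sense of whether a page would even pass the most basic mobile-friendly signal, the responsive viewport meta tag is a reasonable place to start. The Python sketch below checks for that single tag and nothing else; Google’s mobile-friendly criteria also cover tap targets, font sizes, and content width, so treat this as a smoke test, and note that the URLs are placeholders.

```python
# Very rough mobile-friendliness smoke test: check whether a page declares a
# responsive viewport meta tag. This is only one of the signals Google looks
# at, so a pass here is a starting point, not a verdict.
# Requires the third-party "requests" and "beautifulsoup4" packages.
import requests
from bs4 import BeautifulSoup

def has_responsive_viewport(url):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    tag = soup.find("meta", attrs={"name": "viewport"})
    if tag is None:
        return False
    return "width=device-width" in (tag.get("content") or "")

if __name__ == "__main__":
    for page in ["https://www.example.com/", "https://www.example.com/blog/"]:
        status = "looks responsive" if has_responsive_viewport(page) else "no viewport tag"
        print(f"{page}: {status}")
```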
In the end, traffic losses from manual actions and algorithm updates have similar root causes. Thin content, untrustworthy backlinks, spammy website practices, and the like can all get you caught by Google or its algorithms, but remember that the two are not one and the same. Just because you didn’t receive a manual action for something like poor content, don’t assume Panda won’t catch it (and vice versa).
If you follow the rules, stay up-to-date on webmaster guidelines, and update your website when necessary, then you should be able to succeed in search engines. But unfortunately, as has been said before and will continue to be said, nothing in SEO is for certain.
Whether your website has been negatively impacted by a penalty or algorithm update, contact Hurrdat today to learn about our SEO services and what we can do to help grow your organic traffic again!