Recover from Google Fred Update Penalty (Twice) – A Case Study
This website got hit by Fred twice. Read on for a detailed account of what I did to recover it from the Fred penalty.
Recently this very website experienced the dreaded Google Fred penalty (not once but twice!), and traffic was down by 90% at the worst point. This was devastating to a website that started out as a place to share my knowledge and thoughts about web development and my experience in earning professional certifications (e.g. PMP®, PMI-ACP®, ITIL®).
Below are the web and search analytics for the period from early March to early May 2017.
Like many other webmasters hit by a Google penalty, I couldn’t help but keep asking: “What have I done wrong?” and “What the hell is this Google algorithm update about?”
The good news is that the website has almost fully recovered. Below is my detailed account of what I did to recover from the Fred penalty. I hope it will be useful to other webmasters still struggling with Google Fred.
Was the website really hit by the Google Fred penalty?
— Gary Illyes ᕕ( ᐛ )ᕗ (@methode) March 9, 2017
From early March, there were massive shakeouts in the search rankings of a lot of websites, and many suspected Google had rolled out a new update affecting search engine results page rankings. This website was initially left unharmed.
But beginning March 16, there was a sharp drop in Google referral traffic: referrals from Google fell by up to 90%, and the rankings of my pages dropped beyond position 100 in most cases (which essentially means NO traffic at all).
“What have I done wrong?” — I actually had not made any significant changes to the website in over two months (just posting regularly), so nothing new should have triggered a penalty under the existing algorithms. And as Google did not specifically disclose anything about the update, I could only make educated guesses by correlating the timing, severity, and suspected causes of the hit with those of other affected websites.
And it was quite logical to conclude that my website had been hit by the new Google algo update.
What went wrong?
— Dustin Woodard (@webconnoisseur) March 23, 2017
I followed almost every tweet and post related to Google Fred during this period, trying to understand what the algo change was about and how to sort things out. The most popular hypotheses at the time were:
- Fred update targets websites with many pages containing a lot of advertisements; outdated, thin, or scraped content; and incomprehensible articles written for SEO purposes (source)
- Fred update hits low-value content sites that focus on revenue, not users (source)
Later on, in the Google Webmaster hangout on 24 March, John Mueller revealed a bit more about the new Google algo update:
Essentially, if you are following the Google guidelines and you are doing things technically right, then that sounds to me like there might just be just quality issues with regards to your site. Things that you can improve overall when it comes to the quality of the site. (source)
“What the hell is this Google algorithm update about?” — these all point to the fact that Google Fred is about website quality.
Nothing NEW indeed.
But that’s the most difficult situation to handle. Had I known exactly what I did wrong, I could have amended it right away (though in many cases not easily). But with no new guidelines to follow, I just could not make sense of the situation.
What have I done WRONG the first time?
Quality, quality, quality…
And from what I have read from others, quality covers virtually everything.
I need to dig deeper.
As the “Fred update” is primarily concerned with quality, and drawing on past Google algo updates plus the Google Webmaster Guidelines and Search Quality Rater Guidelines, I decided to tackle the following aspects in my first attempt to recover from Fred:
- Advertising — reduced the number of AdSense ads from 3 to 0 and removed the Amazon widget (I read on many blogs that removing all ads is not necessary, but since I badly wanted my website back in the rankings, I decided to be drastic at first; the same goes for the steps below)
- Content value — went through all the recent blog posts:
- combining posts that are similar in content focus (remembering to 301 redirect the removed posts)
- enhancing post content by adding additional material / insights
- updating posts published more than two years ago with the latest figures and facts
- editing post content to read as naturally as possible (e.g. by including alternate forms of the keywords)
- removing technically duplicate content and the “tags” pages
- removing many of the Q&A pages, which were all quite short
- Page layout
- reduced the number of links on each page by almost half by doing away with the “recent comments” and “recent posts” widgets, etc. (only the recommended-articles widget remains now)
- Technical SEO
- fixed as many errors reported in my Google Search Console account as possible — I feel this is extremely important
- Internal links — went through each post and interlinked to other posts on the website that give additional insight into the topic discussed. I especially focused on reducing the bounce rate with this step.
- Backlinks — asked webmasters to remove, or disavowed, backlinks from domains with low authority or spamming records, in particular those gained between February and April this year
- Affiliate links — reduced the total number of affiliate links on my website
- (page) Authority — as the website was switched from http to https in early February, I suspected this might be one of the causes. I tried tweaking the AddThis widget to fetch the Facebook share counts from the http URL — but I would not suggest others do this…
- (my) Authority — participated in many discussion forums and groups (e.g. LinkedIn, Quora, Facebook), contributing my knowledge and expertise to help other web developers and certification aspirants
- Website speed — my WordPress website uses a caching plugin that needs a special setting for https (which I did not know), so my site speed dropped by almost half when I made the switch to https. My solution? Switch to another caching plugin (WP Super Cache) that works faultlessly with https.
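For the post-merging step above, the removed URLs need permanent (301) redirects to the surviving posts so their link equity and bookmarks carry over. On an Apache-hosted WordPress site this can be done in `.htaccess`; a minimal sketch with hypothetical paths (not my actual URLs):

```apache
# 301 (permanent) redirects for posts that were merged into another post.
# Both the old paths and the target URL below are illustrative examples.
Redirect 301 /old-merged-post/ https://example.com/surviving-post/
Redirect 301 /another-old-post/ https://example.com/surviving-post/
```

A WordPress redirection plugin can achieve the same result without touching server config, if you prefer not to edit `.htaccess` by hand.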
It felt like forever, but the recovery took three weeks. Thank God. But…
What have I done WRONG the second time?
After seeing Google referral traffic recover, I thought I should continually enhance the quality of the website. This time, I decided on a bold move to AMP, as AMP is highly advocated by Google. I hoped this would move my website even further into the “safe” zone.
And as soon as the AMP pages went online, my website was punished again!
The figure below shows the number of AMP pages and errors in those pages detected:
The two lines are not plotted on the same scale. Several hundred AMP pages were detected, with only around 12 pages in error. But it seems this kind of error was enough to send a “low quality” signal to Fred. My website was being punished again.
Another tough war to fight:
- Removed all the AMP tags and code — I also discovered that the bounce rate of the AMP pages was much, much higher than that of the responsive website pages (sometimes even 100%), which is why I do not implement AMP now
- Continued participating in forums and groups
- Continued updating website contents to make them more valuable and fresh
and waited around three weeks for rankings and referral traffic to come back.
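One practical detail when pulling AMP out: the `/amp/` URLs already indexed by Google keep receiving clicks for a while, so it is better to 301-redirect them back to the canonical pages than to let them 404. A minimal Apache `mod_rewrite` sketch, assuming the default WordPress AMP-plugin URL pattern (`/post-slug/amp/`):

```apache
# Send old AMP URLs (e.g. /my-post/amp/) back to the canonical page
# with a permanent redirect, so no "page not found" errors pile up.
RewriteEngine On
RewriteRule ^(.*)/amp/?$ /$1/ [R=301,L]
```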
Why 3 weeks in both cases? I have no idea at all.
Update: some websites have reported that Google refreshes the Fred algo once or twice a month, revisiting penalized websites to reassess whether the penalties should be lifted. That is probably the underlying reason.
My two cents on what Fred is
Having been hit by Fred twice in a row, I strongly suspect that Fred is indeed nothing new — it does not target metrics that were not in the Google algo before. The following are still very important:
- Panda (thin content)
- Penguin (backlinks related)
- Top Heavy (ad heavy)
- Payday (spammy money-making / affiliate content)
But Fred is likely a combination of all these signals, applied as a “threshold-type” penalty. That means if a spammy website luckily survives many or all of the above Google filters by a thin margin, Fred now comes to the party, does the maths on all these signals, and comes up with a final quality score. It has the final say on whether the website is spam or not by handing out the life-threatening penalty.
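The threshold idea can be sketched in code. This is purely an illustration of my hypothesis — the signal names, weights, and threshold below are all invented, not anything Google has disclosed:

```python
# Hypothetical sketch of a "threshold-type" penalty: each legacy signal
# contributes a weighted sub-score, and the penalty fires on the TOTAL,
# even if every individual signal is below its own filter's limit.

def fred_quality_score(signals):
    """Combine per-signal scores (0.0 = clean, 1.0 = worst) into one total."""
    weights = {
        "thin_content": 0.3,      # Panda-style signal
        "bad_backlinks": 0.3,     # Penguin-style signal
        "ad_heavy": 0.2,          # Top Heavy-style signal
        "affiliate_heavy": 0.2,   # Payday-style signal
    }
    return sum(w * signals.get(name, 0.0) for name, w in weights.items())

# A site that narrowly survives each individual filter...
site = {"thin_content": 0.6, "bad_backlinks": 0.5,
        "ad_heavy": 0.7, "affiliate_heavy": 0.6}
score = fred_quality_score(site)  # 0.18 + 0.15 + 0.14 + 0.12 = 0.59

# ...can still cross a combined threshold and get penalized.
PENALTY_THRESHOLD = 0.5
penalized = score > PENALTY_THRESHOLD
```

Under this model, fixing any one area (ads, backlinks, thin content) nudges the total back under the threshold — which would explain why a broad cleanup works even when no single factor looks fatal on its own.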
Just my guess. Of course, it is likely that I have guessed Fred wrong.
Anyway, I am thankful that my website has now recovered (from Fred?).