Nobody ever wants to experience that dreaded feeling you get when you realize your site is no longer ranking or you’re suddenly seeing a dip in traffic. You instantly go into what we call “SEO panic mode” and start frantically searching for a solution. 😨 Unfortunately, there is a multitude of different factors that could cause this to happen, such as low-quality content, a Google penalty, technical on-site issues, an algorithm update, or perhaps your competition is just beating you.
This recently happened to us here at Kinsta and it was the weirdest SEO issue we’ve ever seen! But don’t worry, we’re going to share the entire story with you. By being fully transparent and sharing our own struggles, we hope that in turn, it will help your business be ready. Check out the entire process we went through when we realized nothing new was ranking on Kinsta.com, along with how we recovered.
Troubleshooting Our Own SEO Issue
Google isn’t perfect. Yes, we did just say that. Just like any other company, they make mistakes every now and then, even when it comes to search engine results pages (SERPs) and their algorithms. They have even, on occasion, rolled out algorithm updates to fix previous ones.
Some of us on the Kinsta team have been doing SEO for many years. We put a lot of effort into this traffic channel, so we’ve seen quite a few unique problems, but what happened to us recently by far takes the cake! 🎂 In the end, Google actually made a mistake and left our own site (Kinsta.com) with an unwarranted manual action against it. Check out below how we came to this conclusion and exactly how the panic-attack-inducing situation played out.
Drop in Average Rank
As we’ve shared before in previous posts, we utilize Accuranker here at Kinsta to monitor how our content marketing is doing in SERPs. We highly recommend getting this tool! Due to the fact that we publish a lot of content, we check this data on a regular basis. On January 24th, 2018, we noticed a complete drop in our average rank sitewide.
Tip: This is one of the major reasons you should monitor your keyword rankings. Google Search Console data is delayed, and troubleshooting these types of issues is harder in Google Analytics. Accuranker let us know there was a problem on our site right away, with just a quick glance at the dashboard. It’s also an easy way to spot individual pieces of content dropping in rank that perhaps need updates or improvements.
We then proceeded to check our keyword rankings across the board, but we didn’t see any issues at all. So we had a hunch it was probably something to do with our newer content. Our hunch was correct: our newer posts were indeed showing a “0” ranking. Now, of course, not every post is going to rank on the first page of Google right away, but typically it should rank somewhere, even if it’s at position 200+. A ranking of “0” is not a good sign.
This can also vary for every site based on crawl rate and a bunch of other factors. If you publish regularly, you’ll probably know, or at least have an estimate of, how long it takes a search engine to start indexing and ranking your content.
Checking Keyword Rankings on Individual Posts
Thanks to Accuranker, we could pinpoint the date the problem started occurring. So we then proceeded to check our blog posts around that date in Ahrefs. It provides a quick way to scan a page or URL and see the keywords it’s currently ranking for. You could also use Google Search Console for this, but it isn’t as quick.
We started by going one by one and entering the keyword rankings for each post from the past month or so into a spreadsheet. This can help you spot a pattern, which in our case it definitely did. Our posts typically rank for 50–1,000 keywords each, but starting as of 01/26/18, nothing was ranking at all. 😨 So we weren’t just declining in rankings; new content was at a complete standstill.
Many of these were blog posts, but even our newly published landing page, with over 2,600 words, wasn’t ranking. Landing pages should typically rank faster and more easily due to a better internal linking structure, footer links, higher page authority, etc.
We also had a column for indexed, meaning Google was indexing these URLs just fine; it simply wasn’t ranking them for anything. You can quickly check if a URL is indexed by dropping it into Google search with the site: operator before it, such as:
site:kinsta.com/blog/best-seo-plugins-for-wordpress/
If your URL comes back in SERPs then it means it’s indexed.
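If you have a whole batch of new posts to spot-check, a few lines of Python can at least confirm each URL is live and generate the site: queries for you to paste into Google. Here’s a minimal sketch (the URL list is just a placeholder for your own content):

```python
import requests

# Placeholder list: swap in your own recently published URLs.
new_posts = [
    "https://kinsta.com/blog/best-seo-plugins-for-wordpress/",
]

for url in new_posts:
    try:
        # HEAD request just confirms the page itself is live.
        status = requests.head(url, allow_redirects=True, timeout=10).status_code
    except requests.RequestException as exc:
        status = f"error ({exc})"
    # The site: operator doesn't need the scheme.
    print(f"{status}\tsite:{url.split('://', 1)[-1]}")
```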
Some of the posts had over 4,000 words, so we knew something was majorly wrong. Google knew they were there, but it was as if something was blocking them. A few other things we noticed:
- It was only impacting our new content. Everything else was continuing to rank and even gain more organic keywords as we improved and updated the content. This was strange, as we had never seen an issue affect only part of a site before.
- It was also impacting our new content on the Spanish version (kinsta.com/es/) of our website.
- We had recently disavowed a lot of spammy domains as part of a regular SEO checkup on our site. In fact, we documented our entire process of how we fight content scraping. So as far as backlinks go, we were probably cleaner than most sites, or at least in a better place than we were three months prior.
- The only algorithm update around the time the issue started occurring was Google Maccabees, which was pushed out in mid-December. However, this really only impacted affiliate sites and those with keyword permutations. Our site didn’t fall into either of these categories.
- We did check our robots.txt file, which was normal (a quick programmatic way to double-check this is sketched below).
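For anyone who wants to go beyond eyeballing the file, here’s a minimal sketch using Python’s built-in robotparser module to confirm a few representative URLs aren’t blocked for Googlebot:

```python
from urllib import robotparser

# Point this at your own robots.txt and a few representative URLs.
rp = robotparser.RobotFileParser("https://kinsta.com/robots.txt")
rp.read()  # fetches and parses the live file

urls_to_check = [
    "https://kinsta.com/blog/best-seo-plugins-for-wordpress/",
    "https://kinsta.com/es/",
]

for url in urls_to_check:
    verdict = "OK" if rp.can_fetch("Googlebot", url) else "BLOCKED"
    print(f"{verdict}\t{url}")
```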
Manual Action
So we went to look in Google Search Console, and sure enough, under “Search Traffic → Manual Actions” we did indeed have a manual action against our Kinsta.com site. First off, we were a little surprised, as we never received an email about this, nor did it show up under the “Messages” section in Search Console. Typically that is the first place you want to look.
But wait, it got even stranger! The manual action was for “Hacked sites” and “Pure spam.” The subdomains in question had been removed over a year ago. So now we were really confused.
Adding to the confusion, Google doesn’t put any dates on their manual action reports. 😡 This is very frustrating and definitely needs to be improved. Without a date, we couldn’t say with 100% certainty that this had anything to do with our current ranking issue. For all we knew, it was a bug in Google’s algorithm that was simply impacting only new content we published.
Filing a Reconsideration Request
We of course immediately filed a reconsideration request, as the manual action in place was against subdomains that didn’t exist. The problem with reconsideration requests is that they take days and sometimes even weeks, and there is nothing you can do but twiddle your thumbs and wait. For anyone in a situation like this, it can be downright nerve-racking.
Reaching Out to Google and the SEO Community
Instead of doing nothing while we waited for Google to look at our site, we decided to reach out to Google and the SEO community. The first thing we did was post on the Webmaster Central Help Forum. This didn’t help us at all, and to be honest, we didn’t expect anything to come from it.
Next, we reached out to some of the top SEOs in the industry (many of whom do consulting work) to see if they had ever heard of such a weird issue and could be of any assistance. We put together a list. Feel free to use it, just don’t abuse it.
- Bill Slawski – seobythesea.com – @bill_slawski
- Marie Haynes – mariehaynes.com – @marie_haynes
- Cyrus Shepard – cyrusshepard.com – @cyrusshepard
- Glenn Gabe – gsqi.com – @glenngabe
- Dan Petrovic – dejanseo.com.au – @dejanseo
- Barry Schwartz – rustybrick.com – @rustybrick
- AJ Ghergich – ghergich.com – @seo
- Alan Bleiweiss – alanbleiweiss.com – @alanbleiweiss
- John Mueller – @johnmu
A few of the above responded by saying they were too busy, which is completely understandable. Many of these individuals probably get hundreds of crazy requests per day. Others came back with answers which we thought were quite odd and didn’t really apply to our situation.
To our surprise though, three did respond with great advice and were more than willing to jump in and help: Glenn Gabe, AJ Ghergich, and Cyrus Shepard. We first want to thank each of these individuals for the time they spent with us troubleshooting to make sure the manual action was indeed the cause and that there wasn’t something else going on. Some even tried helping while traveling halfway around the world!
They helped us confirm that nothing else looked off and all we needed to do was wait for the reconsideration request to go through.
Reconsideration Request Rejected
Nothing ever comes easy when it pertains to SEO! We submitted our first reconsideration request on February 13th, 2018, and it was rejected on February 23rd, 2018 for the following reason:
We’re not able to review your site because the hacked pages on your site are returning server errors (for example, 5xx). You may need to contact your hosting provider to see if there is any issue with the server. After you fix your site so we can access it, and you’re sure that the hacked content has been removed (sends a 4xx response), go ahead and file another reconsideration request so we can take a look.
The errors, of course, happened because the subdomains hadn’t existed for over a year. The advice to contact our hosting provider wasn’t of much use either, for obvious reasons. 😕 Even though we are a Google Cloud Platform Partner, it’s important to understand that when it comes to SEO, we are just like everyone else. While it can be frustrating, we don’t get any special treatment. This is, of course, to keep SERPs fair for everyone.
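If you ever receive a rejection like this, it’s worth verifying for yourself what the flagged hosts actually return. Here’s a rough sketch of that check; the subdomain shown is a hypothetical stand-in, since our real ones no longer resolved at all:

```python
import requests

# Hypothetical stand-ins for the flagged subdomains.
flagged = [
    "https://removed-subdomain.example.com/",
]

for url in flagged:
    try:
        resp = requests.get(url, timeout=10, allow_redirects=False)
        # Google wants a 4xx here, not a 5xx.
        print(f"{url} -> HTTP {resp.status_code}")
    except requests.exceptions.ConnectionError:
        # NXDOMAIN or refused connection: the host is simply gone.
        print(f"{url} -> no response (DNS failure or connection refused)")
```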
We immediately filed another reconsideration request on February 23rd, 2018 with a much longer explanation. We were then finally approved on February 28th, 2018. Tip: Be very detailed in your reconsideration requests.
As you can see, reconsideration requests aren’t a speedy process, especially if for some reason they deny you. In our case, we should never have received a manual action in the first place, as the subdomains didn’t exist. We have a feeling something on Google’s end from a couple of years back got pushed into the system by accident.
This is also a good time to bring up the fact that you should always use separate domains for crucial projects and clients, because subdomains can impact your main domain when it comes to SEO. Yes, this is kind of scary. Our new friend Glenn actually has a great article about how malware can cause huge SEO problems for your sites: the terrifying connection between malware, GSC, and rogue subdomains.
Rankings Return
On February 24th, 2018, all the new content which we had been publishing since January 26th, 2018 started to rank. 🤘 This was four days before the manual action was removed. So while it appears the manual action was in fact the cause, it is still hard to confirm this 100%. Such is SEO.
We also checked many of our posts individually to confirm. Again, this is very easy to monitor with Accuranker.
We would definitely classify this as one of the craziest SEO issues we’ve ever seen, mainly for the following reasons:
- It only impacted our new content from a specific date and it wasn’t sitewide.
- We never received an email or a message in the “Messages” section of Google Search Console regarding the manual action. And yet, we received notifications and emails just fine for everything else.
- Manual action reports don’t contain dates, which makes it hard to pinpoint the issue. We really hope Google fixes this. They have the data.
- The manual action was for subdomains that hadn’t existed for over a year.
- Our first reconsideration request was rejected because the domains didn’t exist. But if the domains didn’t exist, why the manual action in the first place? This made no sense to us.
The entire process above took almost a full month to resolve, and it’s one we don’t want to repeat anytime soon. All we can say is that SEO definitely keeps you on your toes! One bonus that did come out of the situation was that our organic traffic and keywords skyrocketed after the fact.
Other Common Things to Check When Rankings Decline
Below are some additional common reasons why sites see declines in rankings and how to go about fixing them.
Technical SEO Issues
It’s very important to fix and resolve onsite and offsite technical SEO issues. This includes all sorts of things such as:
- Low word count on your content
- Titles are too long or too short
- Meta descriptions are too short. Take advantage of this space in SERPs to increase your CTR
- Meta descriptions are missing (How to add meta descriptions in WordPress)
- Missing H1 or H2 tags
- Multiple title tags
- Multiple H1 tags
- No internal links, no outgoing links, broken links or redirects
- Missing important social tags such as Open Graph and Twitter cards
- Misconfigured multilingual tags such as reciprocal hreflang (no return tags)
- Broken images or images that are too big
The great thing is you can easily check your entire site at once. We recommend using a tool like Ahrefs for this. Their new site audit feature is awesome! SEMrush also has a similar audit feature which works great. Google also added basic SEO audits to their free Google Chrome Lighthouse extension.
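If you just want a quick spot check of a single page before reaching for a full crawler, something like the following sketch works, assuming requests and beautifulsoup4 are installed (the title-length thresholds are just common heuristics, not hard rules):

```python
import requests
from bs4 import BeautifulSoup

def audit_page(url: str) -> None:
    """Flag a handful of common on-page issues for a single URL."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    titles = soup.find_all("title")
    h1s = soup.find_all("h1")
    meta_desc = soup.find("meta", attrs={"name": "description"})

    if len(titles) != 1:
        print(f"Expected exactly one <title>, found {len(titles)}")
    elif not 30 <= len(titles[0].get_text(strip=True)) <= 60:
        print("Title may be too short or too long")  # rough heuristic
    if len(h1s) != 1:
        print(f"Expected exactly one <h1>, found {len(h1s)}")
    if meta_desc is None or not meta_desc.get("content", "").strip():
        print("Meta description is missing or empty")
    if soup.find("meta", property="og:title") is None:
        print("Open Graph title tag is missing")

audit_page("https://kinsta.com/blog/best-seo-plugins-for-wordpress/")
```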
A big part of this is also the performance of your site. This is where Kinsta can help!
Take advantage of our lightning-fast application, database, and managed WordPress hosting platform to instantly increase speeds across your entire site.
Spammy Backlinks
Backlinks are very important when it comes to SEO. But as with most things, you need quality over quantity. Having thousands of backlinks from low-quality or spammy sites can end up hurting your site. Check out our in-depth tutorials on how to clean up negative SEO (yes, this does exist!) and how to fight back against content scraping. These walk you through how to use Google’s disavow tool.
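For reference, the disavow file itself is just plain text, so it’s easy to generate programmatically once you’ve identified the offenders. A small sketch with made-up domains:

```python
# Made-up examples; replace with domains/URLs from your own audit.
spammy_domains = ["spam-example-one.com", "spam-example-two.net"]
spammy_urls = ["http://spam-example-three.org/scraped-post/"]

# Google's disavow file is plain UTF-8 text: "domain:" entries or full
# URLs, one per line, with "#" comments allowed.
with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("# Spammy domains found during backlink audit\n")
    for domain in spammy_domains:
        f.write(f"domain:{domain}\n")
    f.write("# Individual spammy URLs\n")
    for url in spammy_urls:
        f.write(f"{url}\n")
```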
Penalties
First off, there could be a penalty from an algorithm update. Marie has an awesome, easy-to-read list of all the algorithm updates. If your drop corresponds with an algorithm update, it doesn’t always mean that it’s a penalty against your site specifically; it could very well be that after SERPs were adjusted, other sites simply rank higher than you.
Second, you will want to avoid the following techniques, as documented in the Google Search Essentials guidelines:
- Automatically generated content
- Link schemes
- No original content
- Cloaking
- Sneaky redirects
- Hidden text or links
- Scraped content
- Doorway pages
- Abusing affiliate programs
- Keyword stuffing
- Abusing rich snippets or schema markup
Is your site hacked? This can also get you put in the penalty box, as seen above with our issue (even though ours wasn’t warranted). It isn’t always 100% accurate, but you can check your site using Google’s Safe Browsing tool. Otherwise, you should get a manual action in Google Search Console under “Search Traffic → Manual Actions,” as well as a notification.
This is another area where Kinsta can be of assistance. We have hardware firewalls, active and passive security, and other advanced features to prevent access to your data. But beyond that, we offer free hack fixes for our clients. That’s right. Moving to Kinsta can help put your mind at ease long-term.
Low-Quality or Not Enough Content
If you’re seeing a constant decline in your rankings, it could very well be that you simply have low-quality content or not enough of it. One of the quickest ways to check this is to pick one of your post’s subjects and enter it into Google. See how your competition stacks up. Let’s say you’re trying to rank for “top wedding photographer.” Your page has 500 words and an image or two.
If we take a look at the first result that comes back, we see that it has 8,700 words, over 100 pictures, and is on a high-traffic site with high domain authority. You can safely assume your 500-word article is not going to cut it. Also, regarding quality, compare the readability of their content to your own. Which one sounds more natural? We talk a lot about keyword rankings, but never forget you need to write for the user first.
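If you’d rather have a rough number than a gut feeling, a short script can compare visible word counts for you. This is just a sketch, and both URLs are hypothetical placeholders:

```python
import requests
from bs4 import BeautifulSoup

def word_count(url: str) -> int:
    """Rough visible-text word count for a page."""
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    for tag in soup(["script", "style", "noscript"]):
        tag.decompose()  # drop non-visible text
    return len(soup.get_text(separator=" ").split())

# Both URLs are hypothetical placeholders.
yours = word_count("https://example.com/your-wedding-photographer-post/")
theirs = word_count("https://example.com/top-ranking-competitor-post/")
print(f"Your post: {yours} words, top competitor: {theirs} words")
```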
Misc. Tips
Here are some other things you might want to also check on your site.
- Check your robots.txt file and ensure nothing is being blocked from crawling. You can utilize the blocked resources tool and the robots.txt tester tool.
- Make sure you submit sitemap files within Google Search Console. While these aren’t technically required, they are recommended to help Google see the structure of your site. A sitemap also provides you with more data to diagnose problems (a quick way to check what’s actually in yours is sketched after this list).
- Take advantage of the additional data in the new Google Search Console reports to spot patterns as far as 16 months back. You can easily check your average position and see how it changed over time.
- Always monitor your backlink profile. Perhaps you are losing a lot of high-quality links which is resulting in a decline in your rankings. Or maybe you disavowed a lot of popular high-authority backlinks by accident? This happens.
- Follow Google’s SEO guidelines and industry best practices to minimize the risk of losing your traffic or being penalized.
- Use reports like Accuranker’s Google Grump rating and Mozcast to get an idea if there are massive movements in SERPs happening. Sometimes this can be a sign of a new Google algorithm coming.
- Take advantage of the Panguin Tool from Barracuda Digital to see if your site’s drop coincides with published algorithm updates.
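As mentioned in the sitemap tip above, here’s a quick sketch for checking what’s actually in your sitemap file (it assumes a single standard sitemap rather than a sitemap index):

```python
import requests
import xml.etree.ElementTree as ET

# Assumes a standard single sitemap file; a sitemap index would need
# one extra level of fetching.
SITEMAP_NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

resp = requests.get("https://kinsta.com/sitemap.xml", timeout=10)
root = ET.fromstring(resp.content)

urls = [loc.text for loc in root.findall(".//sm:loc", SITEMAP_NS)]
print(f"{len(urls)} URLs in sitemap; first few:")
for url in urls[:5]:
    print(url)
```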
And if you’re still scratching your head as to what might have gone wrong 🤔, make sure to also check out the awesome “why my web traffic dropped checklist” from Aleyda Solis.
Summary
Having tools in place like Accuranker, Ahrefs, and Google Search Console is incredibly important when it comes to diagnosing issues like these as fast as possible. And we can’t forget the awesome SEO community that jumped in to help.
For a lot of businesses, organic traffic is incredibly important. But this can also be a good reminder of why you shouldn’t put all your eggs in one basket. Advertise on Twitter, Pinterest, and LinkedIn, collaborate with other bloggers, launch an affiliate program, and utilize Google AdWords for quick results.
Do you have an SEO horror story? We’d love to hear it. Also, if you like us documenting how we solve these unique and complex issues, let us know below so we can keep sharing them!
Excellent post, Brian!
Back in my agency days, I filed a lot of reconsideration requests. As you found, not the most streamlined experience in the world. But I’m glad to see your ranking bounced back!
Thanks Brian!
Yes, the reconsideration request workflow has a lot of improvements that should be made. What threw us for a loop was getting a manual action for domains that didn’t exist. But as I know you already know… SEO is like playing a game of Clue sometimes. Nothing ever goes like you expect. At least we can say we’re never bored :)
Nice detailed post!
My site was penalized for hacked content because of one subdomain.
I was using third-party email marketing software, and for click tracking, I had added the subdomain tracking.domain.com.
Maybe the third-party email service was hacked and the link on tracking.domain.com was pointing to some other site.
It was resolved after I removed the subdomain and the email marketing tracking.
Thanks for sharing Wasim! It is kind of scary how a subdomain being used by a different service could impact your main domain.
Definitely a reason to separate this stuff onto another domain… like getbrandname.com or something similar. Makes the setup process more difficult, but keeps you safe!
Hi,
this was written like a good thriller story.
Thanks for all the details and tips.
Haha, thanks! SEO is indeed a thrilling ride.
Nicely written and well explained.
Glad to hear that you were able to recover the rankings. I had a similar incident a few years back when, after a site migration, traffic dropped due to a manual penalty. Nothing had changed apart from the domain name itself, and everything was done by the book – 301s, 50x, etc.
In the end we went through a similar process and the reconsideration request worked.
Great to hear those well known members of the SEO community were more than willing to help out!
My concern in this situation would be if Google hadn’t detailed the manual action within GSC – that would’ve made it almost impossible to detect (also good that you checked there; they should definitely notify you via email when this stuff happens!).
Hi Brian, very cool post – lots of detail, and giving practical advice based on real experience and real graphs is super powerful. You have mentioned lots of tools! At first, you used Accuranker and Ahrefs for tracking rankings and the keywords a specific page ranks for. I thought you must not have an SEMrush account, but at the end of the article I did see the mention.
If you have an account with us, you can easily do the same things, especially with SEMrush Sensor, which tracks the volatility of SERPs. There you’re even able to track how big the SERP changes are for only the keywords you care about. I didn’t want to advertise the product here, but since you’re using it anyway, I thought I’d just mention a couple of use cases for what is exactly the topic of this post :)
Hey Olga, thanks for chiming in!
Yes, I love SEMrush as well. It’s a great tool. In fact, I have in-depth reviews of SEMrush on my personal marketing blog.
But we do prefer Accuranker for keyword tracking, as they continually add features bigger tools don’t have. The main reason is that they focus only on keyword tracking, 24×7 :)
Hi Brian, thanks for sharing your experiences. This is an unusual (and frustrating) case!
Glad to read you were able to resolve it though. We wrote a post on diagnosing and recovering Google rankings this week (https://www.contentkingapp.com/academy/ranking-drop/), and one of the steps there covers Google penalties as well.
We found that you shouldn’t trust Google to always report whether you were hacked. We’ve seen reports of Google not reporting that a website was hacked while applying a manual penalty for cloaking or being a spam site. One hacked website even got hit by an algorithmic Penguin filter that obliterated its organic traffic.
Hey Steven! Great post. Yes, we’ve definitely learned that Google does make mistakes, as seen by our unwarranted penalty for domains that hadn’t existed for over a year.
Thank you for the nice article.
In my experience, it took some 3 months to recover from malware on my sites, and my site still gets indexed lower because of it.
Yes, it takes time to recover, and it seems like SEO problems like this need to be known about before they happen to your site.
Sorry to hear that Ahmad. Malware can definitely cause a lot of harm to a website. Did you get a message via Google Search Console?
Awesome post. We use very similar strategies to find troubled websites, then look to see whether the owner would like to sell their property.
It’s very much like buying a rundown house where you see the potential in renovating it. For the affiliate/lead-gen work my team carries out, this is perfect.
Sooooo many websites out there are suffering from simple technical problems; fixing them could mean jumps across the board in rankings.
This is one of the best articles I have read in a while, and you are spot on here. Keep up the great work, pal.
Thanks James! Agreed… technical SEO is very important. More than a lot of people realize.
Thank you very much for this detailed article, Brian. I wonder what did you write to Google in your second reconsideration request?
Hey Andrej,
Let’s just say I was a little more harsh in my second reconsideration request.
I think the disavow tool does not really do anything other than feed the AI a list of manually found and identified low-quality links from SEOs so it can train itself. You did mention you used it, so I was just chiming in on that.
It helps teach the AI to find and identify low-quality and spam sites, so it’s a win-win situation. It is a very useful tool and highly recommended if you find spammy links pointing to your site.