It is common for a website that has been performing well to see a sudden drop in traffic. Such a drop tends to be a cause for worry because the revenue generated by the site will usually drop with it.
What is Responsible for Such a Drastic Drop in Traffic?
The most common reason is a search algorithm update. Whenever search engines roll out new updates, they often detect errors or malpractices and penalize the sites responsible.
A penalty on your site is one common cause of such a drastic drop in traffic. There are other causes, such as broken redirects, incorrect robots.txt rules, and a drop in rankings.
Whenever you experience a drop in traffic to your site, it is possible to check and identify what may be the cause of the problem. We will be looking at some issues that may cause a drop in website traffic.
Factors that Can Cause a Drop in Traffic
Every year, Google releases new algorithm updates. When these updates roll out, they affect the rankings of sites: some websites gain rankings, while others lose theirs.
It is quite difficult to pin down the exact changes an update makes, but if your site is negatively affected, you can use tools such as MozCast or SEMrush to track the ranking changes that have hit your site.
Algoroo is another tool you can use. In the case of a penalty, it is very important to keep checking until you identify the problem, then work towards fixing it.
1. Crawl errors
When the search engine crawls your site, pages will not be indexed if errors are found. Typical errors include:
- Server errors (5xx)
- Redirect errors
- URLs blocked by robots.txt
- URLs that return an unauthorized request (401)
- 404 errors
- Soft 404 errors (URLs that look like error pages but do not return a 404 status)
- URLs that have a noindex tag
- Other crawling anomalies
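A minimal sketch of a check for some of the errors above, assuming you have a plain list of URLs to probe. The helper names and example URLs are illustrative, not part of any official tool.

```python
import re
import urllib.request
import urllib.error

def has_noindex(html: str) -> bool:
    """Detect a <meta name="robots" content="... noindex ..."> tag."""
    pattern = r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex'
    return re.search(pattern, html, re.IGNORECASE) is not None

def categorize(status: int) -> str:
    """Map an HTTP status code to one of the error buckets above."""
    if 500 <= status < 600:
        return "server error"
    if status in (301, 302, 307, 308):
        return "redirect"
    if status == 401:
        return "unauthorized"
    if status == 404:
        return "not found (404)"
    return "ok" if status == 200 else "other"

def check(url: str) -> str:
    # Note: urlopen follows redirects automatically, so a redirecting URL
    # reports the status of its final destination here.
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            body = resp.read().decode("utf-8", errors="replace")
            if has_noindex(body):
                return "noindex tag"
            return categorize(resp.status)
    except urllib.error.HTTPError as err:
        return categorize(err.code)

# for url in ["https://example.com/", "https://example.com/old-page"]:
#     print(url, "->", check(url))
```

Soft 404s cannot be caught by status code alone, since by definition they return 200 OK; those still need a tool such as Search Console.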
2. Loss of tracking code
As a webmaster, if you pull the tracking code from your site, you should expect to lose tons of recorded traffic. If you find that Google Analytics has stopped recording sessions, it is possible that the tracking code is broken or has been removed completely.
Since it is possible to fix the problem, you should fix it promptly so you do not lose too much traffic data.
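One quick way to confirm the tracking code is still on a page is to fetch the HTML and look for a Google Analytics measurement ID. The "G-"/"UA-" patterns below follow the common gtag.js and analytics.js snippets; treat this as a sketch, not an official check.

```python
import re

def find_tracking_ids(html: str) -> list:
    """Return any GA4 (G-XXXXXXX) or Universal Analytics (UA-XXXX-X) IDs."""
    return re.findall(r'\b(?:G-[A-Z0-9]{6,}|UA-\d{4,}-\d+)\b', html)

# Hypothetical page source containing a gtag.js snippet:
page = '<script async src="https://www.googletagmanager.com/gtag/js?id=G-AB12CD34EF"></script>'
print(find_tracking_ids(page))   # ['G-AB12CD34EF']
```

An empty result on a page that should be tracked is a strong hint the snippet has been removed.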
3. Cannibalization of keywords
Keyword cannibalization occurs when a website ranks several URLs for the same keyword. If you publish new content around a specific topic without working out keyword targeting, it can result in keyword cannibalization. This spreads your traffic across several pages, which loses you organic traffic.
BigMetrics.io's cannibalization report can be used to highlight cannibalization errors. Create an account, connect it to your Google Search Console, and export the report.
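You can also spot cannibalization yourself from a Search Console performance export. A sketch, assuming a simple CSV with query and page columns (adapt the field names to whatever your export actually contains):

```python
import csv
import io
from collections import defaultdict

def cannibalized(rows):
    """Return {query: [pages]} for queries that rank on more than one URL."""
    pages_by_query = defaultdict(set)
    for row in rows:
        pages_by_query[row["query"]].add(row["page"])
    return {q: sorted(p) for q, p in pages_by_query.items() if len(p) > 1}

# Made-up export data for illustration:
export = io.StringIO(
    "query,page\n"
    "blue widgets,https://example.com/widgets\n"
    "blue widgets,https://example.com/blog/widgets-guide\n"
    "red widgets,https://example.com/red\n"
)
print(cannibalized(csv.DictReader(export)))
```

Any query that maps to two or more URLs is a candidate for consolidation or retargeting.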
4. Redirect errors
Many sites have redirects in place. If you add a 301 redirect on your site, test it to ensure that it is functioning properly. Even if there are many of them, take your time to test them one after the other, making sure that each works as it should.
In Screaming Frog's list mode, paste the list of redirected URLs and crawl them. Check the results for any redirects that need correcting, and fix them accordingly.
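Tracing a redirect chain hop by hop can be sketched as below. The fetch function is injectable so the logic can be shown without a network; a real fetch would issue the request without auto-following redirects and return the status code and Location header.

```python
def trace(url, fetch, max_hops=10):
    """Follow Location headers until a non-redirect status or the hop limit."""
    chain = [url]
    for _ in range(max_hops):
        status, location = fetch(chain[-1])
        if status in (301, 302, 307, 308) and location:
            chain.append(location)
        else:
            return status, chain
    return None, chain  # None signals a redirect loop or too many hops

# Simulated responses for illustration (a two-hop chain ending in 200 OK):
responses = {
    "http://example.com/old": (301, "https://example.com/old"),
    "https://example.com/old": (301, "https://example.com/new"),
    "https://example.com/new": (200, None),
}
status, chain = trace("http://example.com/old", lambda u: responses[u])
print(status, len(chain) - 1, "hops")   # 200 2 hops
```

Long chains waste crawl budget, so a redirect that needs more than one hop is usually worth pointing straight at its final destination.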
5. Incorrect robots.txt rules
Sometimes, it is possible to leave the robots.txt file unchanged by mistake after migrating from a staging website. If that happens, the leftover rules can block the search engine from crawling your site.
Go to your site's robots.txt file and make sure no stray disallow rule is blocking pages you want crawled.
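You can verify this offline with Python's standard robots.txt parser. The rules below mimic a staging-site leftover that disallows everything, next to a corrected version; the paths are made up.

```python
from urllib.robotparser import RobotFileParser

leftover = ["User-agent: *", "Disallow: /"]     # blocks the whole site
fixed = ["User-agent: *", "Disallow: /admin/"]  # blocks only /admin/

def allowed(rules, path):
    rp = RobotFileParser()
    rp.parse(rules)
    return rp.can_fetch("*", "https://example.com" + path)

print(allowed(leftover, "/blog/post"))  # False: everything is blocked
print(allowed(fixed, "/blog/post"))     # True
print(allowed(fixed, "/admin/users"))   # False
```

If pages you want indexed come back as disallowed, the robots.txt is the problem.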
6. Loss of ranking
Traffic to your website can decline if you lose rankings. To track the changes in your rankings, go through Google Analytics and Search Console, or use other ranking tools, to investigate when the traffic began to drop.
AccuRanker is a good tool for this. Export the ranking keywords from before and after the drop to an Excel worksheet or Google Sheet, and paste the data side by side in a table. Compare the changes, then use keyword research and mapping to retarget the right keywords.
Using SISTRIX can help you identify keywords that have dropped even from the top-ranking pages.
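The side-by-side comparison can be sketched with plain dicts of keyword to position, exported before and after the drop. The sample data is made up.

```python
def ranking_changes(before, after):
    """Return keywords whose position worsened (or vanished), worst first."""
    changes = []
    for kw, old_pos in before.items():
        new_pos = after.get(kw)
        if new_pos is None:
            changes.append((kw, old_pos, None))  # dropped out entirely
        elif new_pos > old_pos:                  # higher number = lower rank
            changes.append((kw, old_pos, new_pos))
    # Keywords that vanished sort ahead of those that merely slipped.
    return sorted(changes, key=lambda c: (c[2] is None, c[2] or 0), reverse=True)

before = {"blue widgets": 3, "widget repair": 7, "cheap widgets": 12}
after = {"blue widgets": 9, "widget repair": 7}
print(ranking_changes(before, after))
# [('cheap widgets', 12, None), ('blue widgets', 3, 9)]
```

The output is a prioritized list of keywords to retarget, starting with those that fell out of the rankings altogether.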
7. Change in XML Sitemap
Changes in your XML sitemap can also cause a decline in traffic. Your sitemap should contain only URLs that return a 200 OK response.
Crawl the sitemap URLs to confirm they all return a 200 OK response, and add any new landing pages. If your site has as many as 200 URLs but only about 50 appear in your sitemap, you should regenerate the sitemap and resubmit it through Search Console.
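Pulling the URLs out of a sitemap so they can be checked is straightforward with the standard library. The sitemap string below is a minimal made-up example; the namespace is the standard sitemaps.org one.

```python
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text):
    """Extract every <loc> URL from a sitemaps.org-format sitemap."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.findall("sm:url/sm:loc", NS)]

sitemap = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/about</loc></url>
</urlset>"""

print(sitemap_urls(sitemap))
```

Each extracted URL can then be fetched and kept in the sitemap only if it returns 200 OK.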
8. De-indexed URLs
Sometimes, important pages get removed from the search index, which can result in a sharp drop in traffic to your site. To manage such problems, check the index coverage report in Search Console to see if any errors can be identified.
You can use the URL inspection tool to make sure that important pages can still be found in the index. If they are not found, you can use the Request Indexing option in the Search Console to get the pages back in the index.
10 Ways You Can Get Into Trouble with Google
Every webmaster understands the importance of being in Google’s good books. Google has played a very important role in the success of many businesses. So, when Google ranking is mentioned, all webmasters know that without it, their chances of bringing their online businesses into the limelight are limited.
Likewise, they understand how a penalty from Google can hurt their websites for a very long time. Penalties can decrease their search rankings and negatively affect traffic to their sites. In the long run, their businesses and brands will lose a lot of customers.
Let us take a look at some things that webmasters do that get them into trouble with Google and how to avoid them.
1. Buying Links
Google does not support the buying of links. If you involve yourself in the link-buying business, it is only a matter of time before you are caught, and then you will land in trouble with Google.
There is no way you will escape a penalty from Google if you get caught. The search engine prefers that you build natural links for your site. Many backlink companies will assure you that Google will not detect their links as bought, but that is simply untrue.
Someday, Google will find you out and hand you a penalty. So, it is better to build your links gradually and get them from reliable sites.
2. Keyword Stuffing
Although using keywords in your content is important, keywords should make up only about 3% of your content. Google frowns on stuffing your post with keywords, especially when the post is padded with irrelevant content that is of little use to the audience.
Google’s BERT algorithm update can understand content and can tell well-written content from content that does not flow. Google penalizes posts that cram in unnecessary keywords to mislead the audience. So, as important as keywords are, you do not need to stuff your content with them.
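A rough density check for the ~3% guideline mentioned above can be written in a few lines. Tokenization here is naive (a simple word split), so treat it as a sanity check rather than real analysis, which would handle stemming and multi-word phrases.

```python
import re

def keyword_density(text, keyword):
    """Percentage of words in text that exactly match the keyword."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / len(words)

text = "Widgets are useful. Our widgets ship fast, and widgets last."
print(round(keyword_density(text, "widgets"), 1))   # 30.0, far past any guideline
```

A result well above a few percent is a sign the copy should be rewritten to read naturally.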
3. Posting Shallow Content
There are so many freelancers online that you can easily hire one to write for you, which is better than being caught with shallow content on your site. If Google’s BERT algorithm discovers shallow content on your site, you will be penalized.
The effect of a penalty from Google is something that you will not like for your site, no matter how new or long-established your site is.
4. Duplication of Content
When Google finds that content somewhere else is exactly the same as what is in your post, there will be problems. First, Google may not index your content because it cannot tell which one is the original, and then it will bring down your site’s ranking.
5. Difficult-to-Navigate Pages
Some sites pack their pages with so many ads that users have a hard time navigating them. Sometimes users get so frustrated trying to avoid the ads that they miss the very content they came to your web pages for.
They often leave the site without getting what they came for, which gives users a horrific experience. When your site is cluttered with so much content that it causes users discomfort, the search engine will likely penalize you.
6. A site with Low Authority and Trust
You are not supposed to have misleading content on your site or content that is not factual. Remember that the audience relies on your site for sincere and valuable information.
And when Google lists your content in search results, it is with the expectation that the audience will be able to find what they seek there. So, the search engine favors sites that can be relied upon for quality, relevant, and valuable content. The kind of experience that the audience gets from your site can help increase or decrease your ranking.
7. Link Spamming
Many webmasters believe that having many links on their sites will help them rank high. This can be true, but not when your site becomes so spammed with irrelevant links that they begin to irritate your visitors.
Links are important, but they should be relevant to your content. Links should improve the visitors’ experience, for example by directing them to a site with similarly relevant content or toward some further action. Avoid spamming your site with links so you do not get penalized by Google.
8. Having Your Website Hacked
There are so many hackers online and Google makes an effort to protect online users from the activities of hackers and from hacked sites. If Google notices that your site has been hacked, it will put up notifications against the hacked site in search results, so users may be careful. This can bring down your site’s ranking very much.
9. Sneaky Redirects
This is a black hat SEO trick used by many websites: they show one page in Google search results, and when users click on it, they are redirected to a different site. Such tactics give users a negative experience.
Google tries to prevent this by penalizing websites that engage in such tricks. It is also meant to discourage other webmasters from such behavior.
10. Containing Harmful Inbound Links
Sometimes, inbound links to your site may come from spammy or unreliable sites or point at dubious content. Once Google discovers suspicious links to your site, it will regard your site with suspicion too, which may bring it down. That is why it is important to know where your inbound links come from. You can use tools such as SEMrush or Moz Link Explorer to check the links pointing to your site.
Webmasters make frantic efforts to avoid a drop in website traffic. However, several problems or mistakes can occur that lead to one. It is important that once you notice such a drop, you quickly move to identify what caused it and how it can be corrected, so your site can begin to regain traffic quickly.
Every month, Google hands out thousands of penalties to erring sites. Having your site penalized is going to harm you in a lot of ways. It can bring down your ranking and also reduce traffic to your site.
This will have a lot of impact on your earnings. So, having Google penalize your site is something that should be avoided. Maintaining your site without becoming an offender also tells a lot about your integrity and the authority of your site. So, it will do you a lot of good to avoid getting into trouble with Google.