Google on Partial and Full Website De-indexing

Google’s John Mueller answered a question from someone whose website was de-indexed and whose rankings were lost. Mueller offered a list of technical issues that could cause Google to remove a website from search results.

The good thing about this question and answer is that Mueller discusses two types of de-indexing: slow de-indexing and fast de-indexing.

Office-hours SEO hangouts are not the place to get a diagnosis for a particular website. So it’s reasonable that Mueller didn’t give the person who asked the question a direct answer about their website.

Upgraded Yoast SEO from Free to Premium and Lost Rankings

This is the question that was asked:


“I own a website that ranked well prior to March 23rd. I upgraded from Yoast SEO … for free to Premium. After that, the website was de-indexed by Google and we lost all of our keywords.”

The person who asked the question found that the keywords had returned to search results for a few hours in the past few days and then disappeared.

They said they checked the robots.txt file, checked the sitemaps, and made sure there were no manual penalties.
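The robots.txt check the question-asker describes can be scripted with Python’s standard library. The sketch below uses hypothetical rules and URLs (they are not from the site in question) to show how to test whether Googlebot is allowed to crawl specific pages:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules -- for a live site you would instead call
# rp.set_url("https://example.com/robots.txt") followed by rp.read()
rules = """
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Check whether Googlebot may crawl specific (hypothetical) URLs
print(rp.can_fetch("Googlebot", "https://example.com/page.html"))  # True
print(rp.can_fetch("Googlebot", "https://example.com/private/x"))  # False
```

A site-wide `Disallow: /` under `User-agent: *` would make every `can_fetch` call return False, which is the kind of accidental blanket block worth ruling out first.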

One thing they didn’t mention was checking whether the web pages had a robots noindex meta tag.

Mueller says the Yoast plugin likely wasn’t the reason the site was de-indexed

Google’s Mueller begins his answer by speculating that the de-indexing is not associated with updating the Yoast plugin from the free version to the premium version.


I think it’s reasonable to start with the Yoast plugin and look at the settings. It happened to me that I installed the Yoast SEO plugin and then noticed that pages somehow got a robots meta tag of “noindex, follow”.

I have no idea what caused this, I just noticed it happened.

In my experience, it is good practice to rule nothing out as a cause without first checking it.

So I wouldn’t dismiss the Yoast SEO plugin upgrade as a cause without first checking it out and ruling it out.

Mueller replied:

“I don’t know … it sounds kind of tricky … I would say without further ado that it probably has nothing to do with updating your plugin.”

Google’s John Mueller answering a question about deindexing

John Mueller discusses various ways that Google removes websites from their index

Why Google Deindexes Websites

Next, Mueller offers insights into the de-indexing process, including a slow de-indexing scenario where parts of a website are gradually de-indexed because Google doesn’t think they are relevant.


Mueller discusses slow partial deindexing

Mueller next discusses slow deindexing of parts of the site, but not the entire site. What he describes next is a partial de-indexing.


“But it could well be a technical problem somewhere.

Because usually … when we reduce a website’s indexing, when we say we don’t need to index that many URLs from a website, we usually keep the … URLs that are more relevant to that website. And that is something that happens over … I don’t know … an extended period of time, in which the indexing changes slowly.”


What the person asking the question described wasn’t slow or partial de-indexing. Their problem was complete site de-indexing.

John Mueller explains full site de-indexing

Next, Mueller described the possible reasons a site might be completely de-indexed.


“So when you see something that makes the entire website go away from indexing, it almost sounds like something related to a technical problem … something along those lines.”

Next, Mueller recommends consulting the webmaster help forums for help diagnosing the specific problem. That kind of site-specific diagnosis is unsuitable for the Google SEO office-hours hangout and is better raised in Google’s help forums.


Mueller suggests that it could be a technical problem, a website quality problem, a spam problem, or possibly a hacking event.

Many types and reasons for deindexing events

When a site is de-indexed, it is good to not only check the robots.txt file but also to check the source code of each page itself to make sure there is no rogue robots noindex meta tag preventing Google from indexing the web page.

There are many reasons a site could be de-indexed beyond an accidental robots.txt rule or robots meta tag, as Mueller noted. Reasons like a hacking event or another technical issue should be investigated, and nothing should be ruled out until verified.


Other than that, the discussion of slow partial de-indexing and full-site de-indexing offered good insight into how Google de-indexes websites.


Watch John Mueller answer why websites are being de-indexed.

He answers the question at about the seven-minute mark.
