Pages getting deindexed and GSC incorrectly showing 'noindex' detected in 'robots' meta tag

Hello all. I have been scratching my head over this for several days now, and thought I would see if one of the gurus here has a solution. Over the last few weeks, 3 of our more popular pages have been deindexed, and when the URL is inspected, GSC shows an incorrect 'noindex' detected in 'robots' meta tag error. There is NO noindex directive on the page or in the robots.txt file. Needless to say, this is costing us money and causing headaches.

Our marketing team implemented some Google Optimize A/B tests right around the first time that it happened, so it may or may not have something to do with it. Has anyone else run into this issue? Any ideas how to resolve it?
 
It's possible that those pages are sending a noindex via the X-Robots-Tag HTTP header somehow. Perhaps a developer messed up (or Google Optimize did) and is adding it.
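
A quick way to check is to pull the response headers for one of the affected pages. Something like this (with example.com/your-page standing in for one of your deindexed URLs) will show whether that header is being sent:

    curl -sI https://example.com/your-page | grep -i x-robots-tag

If that prints an X-Robots-Tag line containing noindex, the header is your problem, regardless of what the HTML says.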

Those are sent through the .htaccess or httpd.conf files on Apache servers, or the nginx.conf / site config files on Nginx.
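
For what it's worth, this is roughly what you'd be grepping for in those files (just illustrative lines, not something pulled from your setup). On Apache (via mod_headers, in .htaccess or httpd.conf):

    Header set X-Robots-Tag "noindex, nofollow"

And on Nginx, inside a server or location block:

    add_header X-Robots-Tag "noindex, nofollow";

It may also be wrapped in a FilesMatch or location rule that only targets certain URLs, which would explain why only some pages are affected.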

It can also be added through PHP, for example, and anyone sensible would add it at the top of the file, so it should be easy to find if it's in there. It won't render on the page, though, so don't assume it's not there just because you can't see it in the source code.
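
In PHP it would look something like this (a hypothetical snippet, just so you know what to search the codebase for):

    <?php
    // Sends the noindex directive as an HTTP response header;
    // it never shows up in the rendered HTML source.
    header('X-Robots-Tag: noindex, nofollow');

Searching the codebase for "X-Robots-Tag" should turn it up quickly if that's the culprit.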

That's a good avenue for investigation. I'm not sure that Search Console would list the X-Robots-Tag as a meta tag, but who knows. They've done stranger things to accommodate edge cases.
 
@stymie, did you (or will you) let us know how this panned out once you get it fixed? I'm really interested in knowing what, where, why, and how this happened. Technical SEO problems like this need to be tackled ASAP, and if we have an extra lead or solution on record here, it could save some of us a ton of time and money.
 