Hello all. I have been scratching my head over this for several days now and thought I would see if one of the gurus here has a solution. Over the last few weeks, 3 of our more popular pages have been deindexed, and when the URL is inspected GSC shows an incorrect "'noindex' detected in 'robots' meta tag" error. There is NO noindex directive on the page or in the robots.txt file. Needless to say, this is costing us money and causing headaches.
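In case anyone wants to reproduce the check, this is roughly how I've been verifying there is no noindex being served outside of GSC (a minimal sketch with a placeholder URL; it only inspects the raw HTML and the X-Robots-Tag response header, so anything injected by JavaScript would not show up here):

```python
# Sketch: check a URL for noindex signals in the raw response.
# The URL below is a placeholder - swap in the affected page.
import urllib.request

url = "https://www.example.com/affected-page"  # placeholder

req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
with urllib.request.urlopen(req) as resp:
    # 1) An X-Robots-Tag header can noindex a page even when the HTML is clean.
    x_robots = resp.headers.get("X-Robots-Tag", "")
    print("X-Robots-Tag header:", x_robots or "(none)")

    # 2) Look for 'noindex' in the unrendered source. A tag injected by
    #    JavaScript (e.g. an A/B testing snippet) would only appear in the
    #    rendered HTML that GSC's live test shows, not here.
    html = resp.read().decode("utf-8", errors="replace").lower()
    print("noindex in raw HTML:", "noindex" in html)
```

Both checks come back clean on our end, which is why the GSC error has me stumped.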
Our marketing team implemented some Google Optimize A/B tests right around the time this first happened, so that may or may not be related. Has anyone else run into this issue? Any ideas on how to resolve it?