Source: https://webmasters.googleblog.com/2019/07/a-note-on-unsupported-rules-in-robotstxt.html
Heads up. There's a bit of bad information floating around about using 'noindex' in your robots.txt file. Lots of people do it, and while it was never an officially supported directive for that file, Google would sometimes respect it, as would other crawlers.
In an effort to unify how robots.txt gets handled and get everyone on the same page, Google is open sourcing its parser for it.
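Google's open-sourced parser itself is a C++ library, but if you want a feel for what a robots.txt parser actually does, Python ships a basic one in the standard library. It follows the older REP conventions rather than Google's exact implementation, so treat this as a rough sketch only:

```python
# Rough sketch: Python's built-in robots.txt parser, not Google's
# open-sourced C++ one, so edge cases won't match exactly.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Ask whether a given user agent may fetch a given URL under these rules.
print(parser.can_fetch("Googlebot", "https://example.com/private/page"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/public/page"))   # True
```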
The open sourcing also coincides with Google completely dropping support for noindex in robots.txt. So if this is you, you're going to need to switch to one of the other supported methods to keep a page out of the index (a quick sketch of a couple of them follows the list):
- Noindex in robots meta tags
- 404 and 410 HTTP status codes
- Password protection
- Disallow in robots.txt
- Search Console Remove URL tool
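For the first two options in that list, the noindex can live either in the page's HTML as <meta name="robots" content="noindex"> or in an X-Robots-Tag response header, which Google treats the same way. Here's a minimal standard-library sketch with made-up paths, just to show where the status code and the header go; on a real site you'd set these in your web server or framework config:

```python
# Minimal sketch, standard library only, made-up paths.
from http.server import BaseHTTPRequestHandler, HTTPServer

class NoindexDemo(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/retired-page":
            # Gone for good: a 410 (or 404) tells crawlers to drop the URL.
            self.send_response(410)
            self.end_headers()
        else:
            # Still reachable, but kept out of the index via the header
            # equivalent of <meta name="robots" content="noindex">.
            body = b"<html><body>Thin page: crawlable, not indexed</body></html>"
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.send_header("X-Robots-Tag", "noindex")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("", 8000), NoindexDemo).serve_forever()
```

One gotcha: for the meta tag or the header to do anything, the page can't also be blocked by Disallow in robots.txt, because Google has to be able to crawl the page to see the noindex.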
Why does this matter? Panda.