animalstyle
BuSo Pro
I have a handful of pages that I've disallowed in my robots.txt. For example, I've blocked the page where a user leaves a review for a specific location in my database.
Obviously I still have to link to these pages so people can write reviews. Each one is populated dynamically based on the link you click from an individual location page, and I don't want people landing on them from the search results.
I've just had a bunch of crawl errors for these pages pop up in search console.
My question is:
Should I be rel=nofollowing the internal links to these (and similar) pages?
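For context, the setup looks roughly like this (paths and the location parameter are hypothetical, just to illustrate):

```html
<!-- robots.txt:
User-agent: *
Disallow: /write-review/
-->

<!-- current internal link vs. the nofollow version I'm considering -->
<a href="/write-review/?location=123">Leave a review</a>
<a href="/write-review/?location=123" rel="nofollow">Leave a review</a>
```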