@Rob Lowem touched upon this in another thread but I've been meaning to discuss this for ages - so here we go.
I'd been mining GSC for keywords before, but I only saw the true benefit of it after publishing really long articles and playing with news syndication.
Long articles pick up tons of long-tail keywords, and you can see them in GSC under Search Traffic -> Search Analytics (old view), sorted by Impressions.
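If you'd rather pull this programmatically than click through the UI, here's a rough Python sketch using the official Search Console API. It assumes a service account with access to the property; the property URL, article URL, dates and credentials file are just placeholders you'd swap for your own.

```python
# Minimal sketch: pull the queries one long-form article gets impressions for,
# sorted by impressions. Assumes a service-account key with access to the
# Search Console property; all URLs, dates and paths below are placeholders.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SITE_URL = "https://example.com/"                 # placeholder property
ARTICLE_URL = "https://example.com/long-article/" # placeholder long-form post
KEY_FILE = "service-account.json"                 # placeholder credentials file

creds = service_account.Credentials.from_service_account_file(
    KEY_FILE, scopes=["https://www.googleapis.com/auth/webmasters.readonly"]
)
gsc = build("searchconsole", "v1", credentials=creds)

resp = gsc.searchanalytics().query(
    siteUrl=SITE_URL,
    body={
        "startDate": "2024-01-01",
        "endDate": "2024-03-31",
        "dimensions": ["query"],
        "dimensionFilterGroups": [{
            "filters": [{
                "dimension": "page",
                "operator": "equals",
                "expression": ARTICLE_URL,
            }]
        }],
        "rowLimit": 5000,
    },
).execute()

# Sort by impressions, highest first, like the old Search Analytics view.
rows = sorted(resp.get("rows", []), key=lambda r: r["impressions"], reverse=True)
for r in rows[:50]:
    print(r["keys"][0], r["impressions"], round(r["position"], 1))
```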
All these keywords have great potential - because if your site is getting 80 impressions for the KW and you're only on Page 5, the traffic you'd get on Page 1 is probably decent.
Next, getting to Page 1 is usually not hard, because all it takes is a separate article on this keyword. When long-form content ranks for a bunch of longtails, some of them are just mentioned on the page once, and not even as an exact match. So once you create a separate post for a keyword like that and get some internal links to it - it shoots up fast, because competition is low.
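To turn that into a shortlist, you can filter the rows from the sketch above for queries where you're stuck deep in the SERPs but still pulling impressions, and guess what page 1 would be worth. The position/impression thresholds and the 10% CTR figure below are rough assumptions, not measured numbers.

```python
# Sketch of the filtering logic: keep queries with decent impressions but a
# deep ranking, and estimate page-1 clicks with an assumed CTR. Uses the
# `rows` list fetched in the previous sketch.
ASSUMED_PAGE1_CTR = 0.10  # rough guess for an average page-1 result

def longtail_candidates(rows, min_impressions=50, min_position=20, max_position=60):
    """Queries the long-form page ranks for deep in the SERPs but that still
    pull impressions -- candidates for a separate post."""
    picks = []
    for r in rows:
        query, impressions, position = r["keys"][0], r["impressions"], r["position"]
        if impressions >= min_impressions and min_position <= position <= max_position:
            est_page1_clicks = round(impressions * ASSUMED_PAGE1_CTR)
            picks.append((query, impressions, round(position, 1), est_page1_clicks))
    # Highest-impression opportunities first
    return sorted(picks, key=lambda p: p[1], reverse=True)

for query, imp, pos, est in longtail_candidates(rows):
    print(f"{query:<50} impressions={imp:<6} position={pos:<6} ~page-1 clicks={est}")
```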
And obviously, most keyword tools will just show 0 search volume for all these queries. Nor will you be able to scrape these keywords any other way.
Another way of harvesting these is by re-publishing news from a large news outlet covering your niche. Syndicate an RSS feed to publish 3-5 articles per day, and in a couple of weeks you'll get a completely new picture of the traffic in your niche and tons of ideas.
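If you want to automate the syndication side, something like this works, assuming a WordPress site with the REST API enabled and an application password. The feed URL, site URL and credentials are placeholders, and you'd obviously handle attribution/rewriting however suits you.

```python
# Rough sketch: pull the latest entries from a niche news feed and publish
# them to a WordPress site via the REST API. All URLs and credentials are
# placeholders; cap the run at 3-5 posts and schedule it daily (e.g. cron).
import feedparser
import requests

FEED_URL = "https://news-outlet.example.com/rss"           # placeholder feed
WP_API = "https://your-site.example.com/wp-json/wp/v2/posts"
AUTH = ("editor", "application-password-here")             # placeholder creds
MAX_POSTS_PER_RUN = 5

feed = feedparser.parse(FEED_URL)
for entry in feed.entries[:MAX_POSTS_PER_RUN]:
    payload = {
        "title": entry.title,
        # Summary plus a link back to the source.
        "content": (
            f"{entry.get('summary', '')}"
            f"<p>Source: <a href='{entry.link}'>{entry.link}</a></p>"
        ),
        "status": "publish",
    }
    resp = requests.post(WP_API, json=payload, auth=AUTH, timeout=30)
    print(entry.title, resp.status_code)
```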
You can go one step further and set up a separate domain syndicating news and harvesting keyword data for your main project.
This whole thing is just another reminder for those who tend to overthink: you need to start fast. The sooner you start your website, the more real, practical data you'll have in a year.
Another benefit is that in your webmaster console you'll end up with a lot of keywords you rank in the top 100 for, along with their impressions. This is gold for finding high-traffic keywords you would never have dreamt of.
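Pulling that list is the same API call as in the first sketch, just without the page filter so it covers the whole property. The date range and thresholds here are arbitrary.

```python
# Same client as the first sketch, no page filter: every query the property
# ranks in the top 100 for, highest impressions first.
resp = gsc.searchanalytics().query(
    siteUrl=SITE_URL,
    body={
        "startDate": "2024-01-01",
        "endDate": "2024-03-31",
        "dimensions": ["query"],
        "rowLimit": 25000,
    },
).execute()

gold = [r for r in resp.get("rows", []) if r["position"] <= 100 and r["impressions"] >= 50]
for r in sorted(gold, key=lambda r: r["impressions"], reverse=True)[:100]:
    print(r["keys"][0], r["impressions"], round(r["position"], 1))
```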