Keyword Opportunity model - Get average PA/DA from SERPs

That's @NickEubanks's course. He's a member here. PA and DA are Moz.com metrics. You can use their Open Site Explorer and enter a URL to get them, or use their API for faster access from within your program.
 
@CCarter Thx for your reply!

I'm currently using Scrapebox to get the SERPs and then pulling the Moz metrics for each URL through Scrapebox, but I can't get them by keyword. Any suggestions for tooling?
 
That's because PA/DA don't apply to keywords; I believe they use some kind of keyword difficulty score for that.
 
From my understanding, @NickEubanks scraped around 10 pages of the SERPs for each keyword, then took the average DA and PA across those results.

Any tooling suggestions?
 
I'd just scrape it with Scrapebox, then use Excel to crunch out an average, if that's what you're looking to do!
 
I use TermExplorer to get all the ranking URLs for each keyword-based SERP, and then use the Mozscape API to get DA/PA at the individual URL level.

In terms of building an accurate evaluation model, there are definitely some additional metrics you need to take into account at this point, like link diversity ratio, link acquisition rate, and even a measure of the volatility of the SERP (which SERPwoo is excellent for gauging).

This is several years old at this point, but here is the initial concept math used to define the discount rates for the DA/PA-based model: https://docs.google.com/document/d/1yWQ3OUa_jgP6ltz9QoxWQjp5D1ieb-02y0bKqJx6y6I/edit?usp=sharing
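
If anyone wants to script the averaging step, here's a minimal Python sketch. It assumes you've already pulled DA/PA for each ranking URL (from Mozscape or wherever); the numbers below are made up for illustration.

```python
from statistics import mean

def average_da_pa(url_metrics):
    """url_metrics: list of (domain_authority, page_authority) tuples,
    one per ranking URL scraped from the keyword's SERP."""
    avg_da = mean(da for da, _ in url_metrics)
    avg_pa = mean(pa for _, pa in url_metrics)
    return avg_da, avg_pa

# Hypothetical DA/PA values for one keyword's top results -- in practice
# you'd fetch these per URL from the Mozscape API (or another source).
metrics = [(45, 38), (62, 51), (30, 22), (71, 64), (28, 19)]
print(average_da_pa(metrics))  # -> (47.2, 38.8)
```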
 
Download the Moz Chrome or Firefox extension. When you search on Google or Bing, it'll display the DA and PA for all results. Change your preference to show 100 results per page. Then choose "Export as CSV" and you'll have a CSV of all the top 100 listings for that keyword with their DA and PA.

Then use Excel to find the average DA and PA for the keyword.
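
If you'd rather skip Excel, the same averaging is a few lines of Python against the exported CSV. The column names below are guesses; swap in whatever headers your export actually has.

```python
import csv
from statistics import mean

def average_from_export(csv_path, da_col="DA", pa_col="PA"):
    """Average the DA and PA columns of an exported SERP CSV.
    The default column names are assumptions -- pass the real headers."""
    with open(csv_path, newline="") as f:
        rows = list(csv.DictReader(f))
    avg_da = mean(float(row[da_col]) for row in rows)
    avg_pa = mean(float(row[pa_col]) for row in rows)
    return avg_da, avg_pa

print(average_from_export("keyword_top100_export.csv"))
```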
 
The problem with this is that you can throw up a WordPress site with nothing on it, let the ping aggregators pick it up, then the website-value and Alexa scraper sites all pick it up and link to it. Wait two months and you're magically already DA 15 or so on Moz. Their metrics are complete garbage on the low end. For some stupid reason they're capable of crawling those shit sites non-stop (probably their seed set), but they can't find Time.com links, etc.

If you're looking for low competition and you're using Moz metrics you're screwing yourself badly. A page might be DA15 with the best links on the internet, while another page is DA15 with zero links, and you'll never know the difference.

You need to combine Ahrefs, SEMRush, SerpWoo, Majestic, and anyone else with keyword difficulty metrics and then apply your own algorithm to weight the accuracy of each one to spit out a new final value. Otherwise you're wasting your time.
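
Purely as a sketch, that kind of blend can be as simple as a weighted average of each tool's difficulty score, assuming you've normalized them to the same 0-100 scale. The tool names, scores, and weights below are invented for illustration; the weights are the part you'd calibrate against keywords you already know.

```python
def blended_difficulty(scores, weights):
    """scores: tool name -> difficulty on a common 0-100 scale.
    weights: tool name -> how much you trust that tool's number.
    Returns one weighted difficulty score."""
    total = sum(weights[tool] for tool in scores)
    return sum(scores[tool] * weights[tool] for tool in scores) / total

# Hypothetical inputs -- calibrate the weights against keywords whose real
# difficulty you already know from trying to rank for them.
scores = {"ahrefs": 32, "semrush": 58, "moz": 15, "majestic": 40}
weights = {"ahrefs": 0.35, "semrush": 0.25, "moz": 0.10, "majestic": 0.30}
print(blended_difficulty(scores, weights))  # -> ~39.2
```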

Honestly, bulk collection and analysis is a waste anyway. The high-buying-intent keywords should be obvious and few, and you'll have time to actually analyze the top 10. Market Samurai is great for this. If you're looking for high-traffic, low-intent keywords just for ad impressions, the competition is going to be too high to justify the time. You'll never turn a profit in any reasonable amount of time or for any reasonable amount of money.
 
My first question would be, what are you working towards achieving?

I hate to make generalizations, and this may not hold for every niche, industry, or circumstance, but when it comes to this subject I feel like there are a few points that are often missed.

The first one is Analysis vs. Forward Progress:

I would say in the majority of cases, the value curve between the two slopes sharply against piling ever more effort into analysis: the returns on additional analysis diminish quickly. In a phrase, paralysis by analysis. More often than not, quick and efficient "ballpark" analysis combined with putting one foot in front of the other is going to net a better return.

The second is what I'll just call "Big Data For Small Fries":

I don't say that disparagingly; it's just the most concise way I can think to say it. In essence, between Excel and manually aggregating individual metrics and individual types of data, significant time can be spent just compiling the data and trying to derive insights from it. It can end up becoming the average individual's version of "big data", with equally unmanageable pitfalls.

At the end of the day, it's a matter of cost vs. benefit. If you're systematically attacking keywords, intent on minimizing margin, maximizing profit, and truly dominating a niche, then by all means do so if the investment of time and resources makes sense. For the average person, and I'd say even the average aggressive marketer, working with things like Pareto's principle and ballpark numbers is probably going to net you a better return in terms of time saved that can be better devoted to actionable items.

If you do need the data and do need more in-depth analysis, what I'd highly recommend is moving towards a few third-party data-source APIs (which ones depends on your goals: SEM vs. SEO vs. whatever else) to build a simple system that lets you visualize the data. One to three should be plenty for most needs. In other words, get your API connections, dump to your desired format, create a simple PHP/JS/whatever page that displays that data in a format that's useful to you, and then see what you can see. When you have a workable system down, automate or semi-automate as much of it as you can. Manually pulling data is far too time-consuming a process.
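
As a rough illustration of that pull-and-dump step (the endpoint and response shape are placeholders, not any particular vendor's API), something like this gets the data into a flat file that a simple display page or spreadsheet can read:

```python
import csv
import json
import urllib.request

def pull_and_dump(api_url, out_path):
    """Fetch a JSON array of records from one data source and write it to CSV
    for a simple display page (or spreadsheet) to consume."""
    with urllib.request.urlopen(api_url) as resp:
        records = json.load(resp)  # assumes the endpoint returns a JSON array of objects
    if not records:
        return
    with open(out_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(records[0].keys()))
        writer.writeheader()
        writer.writerows(records)

# Placeholder endpoint -- swap in whichever third-party API you're actually using.
pull_and_dump("https://api.example.com/keyword-metrics?key=YOUR_KEY", "keyword_metrics.csv")
```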
 
[Attached image: SEO-Nick_SEO-Business-Model_Simple.jpg]


Or you could just stick to the simple equation shown in the image above and then visit the SERPs for the keyword and judge the competitiveness for yourself.

There's no need to create complex equations with an API pulling data from several sources to answer the question "can I rank for this keyword?"

The answer can be obtained with much simpler questions such as:
1.) "is my SEO better than these sites on the top 10?"
2.) "do I have or can I create a site that are better than the sites in the top 10?"

If yes, rankings will be earned by:
1.) better optimization
2.) having a better resource, which in turn results in more links when promoted

This way is about a 30-minute path, whereas the other way, building a competitiveness calculation from multiple data sources, is just analysis paralysis, as @turbin3 said.
 
I should have prefaced my comments by stating that the scale I'm usually working at is tens or hundreds of thousands of keywords, or more, often geo-segmented for each. If someone is working with a dozen or a couple dozen keywords, building a system like I described would be entirely unnecessary.
 