Is DA a good metric for link scouring?

I'm seeing a lot of people use Domain Authority (DA) as a metric when searching for backlinks. Like "Oh Site A's a DA50 domain and Site B's a DA60 domain... I should go for site B's link first before site A."

I'm also seeing link providers list DA (and only DA) as a metric.

What I don't get is WHY?!

It's clearly defined here https://moz.com/learn/seo/domain-authority that Domain Authority is a metric that shows how likely a site is to rank in the SERPs. So, for example, if Site Y has a DA50 and Site X has a DA60, a page on Site X should rank higher than a page on Site Y for a keyword if everything else is equal.

So, with that said, DA shouldn't be used to compare domains for potential backlinks. It should only be used to compare SERP competitiveness. (i.e., a link from a high-DA domain doesn't mean it'll pass a lot of link juice; it only means the domain has a high chance of ranking in the SERPs.)

Is this just me, or does anyone else concur?
 
Moz says they take dozens of factors into account to determine the DA of a site. So it's more so what the DA represents than the actual DA itself that people are interested in. What the DA represents is that Site B has more positive ranking signals according to Moz than Site A. Would you like a link from a website with more positive ranking signals according to Moz, or with less?
 
Moz says they take dozens of factors into account to determine the DA of a site. So it's more so what the DA represents than the actual DA itself that people are interested in. What the DA represents is that Site B has more positive ranking signals according to Moz than Site A. Would you like a link from a website with more positive ranking signals according to Moz, or with less?
Ah, got it. I myself am using TF/CF, as those metrics give you an idea of what % of the links pointing to a site are spam, something DA doesn't show.
 
Moz sucks. They have the weakest crawlers of the top players (the others being Ahrefs and Majestic). Their metrics are very easy to manipulate, even with pure spam. They also weight blog comments highly! I've seen a lot of high-DA sites with poor backlink profiles made up of a bunch of blog comments.

Just to give you a recent example: two months ago I started building links to one of my domains. At that point the domain was DA12. Over the course of a few weeks I built something like 50 links from 50 domains, some good, some not so good: a Wikipedia link, 2-3 dofollow DA80+ links, 2 guest posts on average blogs, and the rest was weak stuff (but not spammy), plus some real social signals. In that time span I didn't lose any links or referring domains.

When they made their new update, the DA of the site was 9.

I personally like to check the TF of the domain, and after that I look at the link profile via ahrefs.
 
TF is usually more accurate than DA, but not always. At the very least, TF is more resistant to spam manipulation, that's for sure. What I've liked a lot recently is Ahrefs' Domain Rank. It's the most accurate IMO.

But anyway, any metric is just an indicator of whether you should spend time on a manual backlink investigation.
 
But anyway, any metric is just an indicator of whether you should spend time on a manual backlink investigation.

Yeah, I agree as well. The question arose when I was in a job interview and the General Manager asked, "How can you tell if a link is a good link?"

I answered like you said, saying I would look at the domain's backlink profile, and he wasn't satisfied with it. He said DA was a good way to tell, which didn't make sense to me, as I'd rather go with TF/CF (like @MeEatBrains said, a more rigorous metric).

During the interview, I couldn't figure out how to tell the GM he was full of shit without disqualifying myself for the job. In the end I didn't get the job anyway (nor would I want to work at that startup), so I should have just ripped into him.

My experience with Domain Rank, Ahrefs' calculation of a domain's PageRank, is that it's always the highest metric when compared to a similar metric from another provider (i.e., Domain MozRank or DA, or Domain TF and/or CF). I also have several DA 40-50 sites that are vastly different in traffic and ranking ability.
 
I'm done with Moz metrics. Their crawlers are essentially non-existent. They might have half of a percent of what Ahrefs and Majestic crawl. They eventually catch up, sure, two years later. Ahrefs and Majestic are the only two players at this point. I'd cross-reference both versus choosing one or the other.
 
I spent a bit on an expired domain because it had a DA of 50 and a PA of 55 with good Trust Flow, so I thought it was amazing.

Fast forward to now, and it has dropped to a DA of 27 and a PA of 31.

So I based everything on Moz metrics (which was dumb), but that's what I was told was best to go with.

So what I learned is to rather go with Ahrefs or TF.

But most of this is common knowledge, I just felt like voicing my experience haha
 
@built, the thing is, ALL metrics will deflate over time if you aren't continually adding links. This is because the size of the indexable web is growing exponentially. If all things remain equal on your side, your metrics just go down down down, but so does everyone else's.

It's like what @j a m e s said above, these should be used as relative metrics, not static. At any given time, you need to be comparing DA versus DA.

For instance, a PR 3 link of yesteryear might be PR 1 now, but still have the exact same power. The more pages that are published, the more juice there is flowing and the logarithmic scale of all of these metrics has to keep sliding down.
 
@built, the thing is, ALL metrics will deflate over time if you aren't continually adding links. This is because the size of the indexable web is growing exponentially. If all things remain equal on your side, your metrics just go down down down, but so does everyone else's.

It's like what @j a m e s said above, these should be used as relative metrics, not static. At any given time, you need to be comparing DA versus DA.

For instance, a PR 3 link of yesteryear might be PR 1 now, but still have the exact same power. The more pages that are published, the more juice there is flowing and the logarithmic scale of all of these metrics has to keep sliding down.
So does this mean that competition gets tougher and tougher at the same time?
 
[Image: two graphs of number of pages vs. juice; the top has a linear y-axis with an exponential curve, the bottom a logarithmic y-axis that renders the curve linear]

Look at the top graph first: it has a linear y-axis (juice) and an exponential curve (the number of pages at that level of juice). (Forget the x-axis altogether; I didn't photo-chop it out.)

It just means that you have more and more pages being created at the bottom of the y-axis (at zero). They are the source of all page rank and it flows upwards.

They will link to each other and to pre-existing pages, constantly redistributing the page rank through the link graph that is growing and emphasizing different things.

So what happens is, the number of PR n/a pages grows and they start linking to each other, causing the number of PR 1 pages to grow, and so forth. They still link to pre-existing pages as well, but more of the growing amount of juice in existence is being created and added to newly created pages. However, the bulk of it ends up flowing to "popular pages" which still link to even more popular pages. So you get this squeezing of the middle class, basically.

So in the top graph above, you have a linear y-axis and an exponential curve. But the way these metrics work is that the y-axis itself is made logarithmic, which renders the curve linear, like the bottom graph. Instead of 1, 10, 100, 1,000, 10,000, they just relabel it 1, 2, 3, 4, 5, but it represents the same exponential growth (these numbers actually represent the exponents).

0 = 10^n/a
1 = 10^0
10 = 10^1
100 = 10^2
1,000 = 10^3

And so on. So the PR is actually the number furthest to the right, the exponent. What happens is you get more and more pages being created at the n/a level and maybe growing to 1 and 2, but most of those don't link to each other; they link to pre-existing pages, and usually the ones already much higher up the scale at 7, 8, 9, 10. So 3, 4, 5, 6 start getting squeezed out and lowered, not in power but just in LABEL. They still hold the same power. But the top dogs start getting more and more power with the same label.

Middle = same power with lower label
Top = more power with same label
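
If it helps, here's a rough Python sketch of that relabeling. The base of 10 and the raw "juice" numbers are just illustrative assumptions; nobody outside Google knows the real scale:

```python
import math

def pr_label(juice, base=10):
    """Map a raw 'juice' value onto a logarithmic label (illustrative only)."""
    if juice < 1:
        return None  # the "n/a" bucket
    return int(math.log(juice, base))

# The same page, with the exact same raw juice, gets a lower label
# once the scale slides (here simulated by stretching the base):
print(pr_label(1_000))      # 3
print(pr_label(1_000, 20))  # 2 -- same power, lower label
```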

It's confusing. It's very much like how the economy works. Get in early and be visible, get clout, and continue to reap the attention from the newcomers (pages being published, consumers growing up and getting jobs), and you keep growing. The middle guys who are doing just enough to be above the lower rungs, but nothing special enough to boost to the top, have to keep working harder and longer to make the leap. They have visibility, but not enough.

Marketing = gaining visibility.

This is the problem Google has. How do they lower the power of the top dogs so that the middle people can get some love too, without allowing entry for the bottom-rung spammers to get seen? Their answer was focus less on page rank and more on branding, which spammers don't take the time to do.

This is why we advocate building brands and focusing on one project. How do you expect to get enough attention to 10 sites when it's hard enough to pop one over the threshold?
 
To add on to what Ryuzaki said, PageRank is calculated in iterations. In the first iteration, GoogleBot is dropped randomly across the web and crawls 6.66 steps out. When it's done, it gets dropped randomly again and crawls for another 6.66 steps and so forth. The more often GoogleBot crosses its path, the higher that page's PageRank is (or site's PageRank if domain PageRank is being calculated). So, you can say that PageRank is the probability of someone randomly stumbling on a page when surfing the web.

The interesting thing is that, on the 2nd, 3rd, 4th, and all future iterations of the calculation, GoogleBot is dropped based upon how likely one is to come across a page when surfing the web! Yup! On the second and greater iterations, GoogleBot is dropped based upon PageRank. The higher a page's PageRank, the higher the likelihood it'll be the starting point of a crawl. This further increases the "rich get richer" problem. It also gives good motivation to take action now, as the first comer's advantage is pretty awesome.
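
For anyone curious what that iteration looks like on paper, here's a toy power-iteration sketch in Python over a made-up four-page link graph. It's the textbook version of the idea, not Google's actual implementation:

```python
# Toy PageRank power iteration over a tiny hypothetical link graph.
links = {               # page -> pages it links out to
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
    "d": ["c"],
}
damping = 0.85
pr = {p: 1 / len(links) for p in links}   # iteration 0: every page weighted equally

for _ in range(50):                        # later iterations: weight by current PR
    new_pr = {p: (1 - damping) / len(links) for p in links}
    for page, outlinks in links.items():
        share = damping * pr[page] / len(outlinks)
        for target in outlinks:
            new_pr[target] += share        # each page passes its juice to what it links to
    pr = new_pr

print(pr)   # "c" ends up highest, because everything eventually points at it
```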
 
It's not just based on the probability of being encountered from a random starting place. That doesn't take into account the juice of the current link node and how much it passes to the next node. This random-drop concept with so few steps doesn't take into account the fact that there is a literal link graph.

You can receive a PR8 link (page, not domain) and become PR6, for instance, with one link. The probability of being encountered is very low, even if the PR8 page has a high probability of being encountered. What you're saying fundamentally makes sense: in the beginning it's about the number of votes, but after the first iteration (which happened in or before 1998) it's been about who votes for you.

Regarding crawling, they may not take more than 6 steps into a single domain (which is why sitemaps are a good idea as well as interlinking) if they don't encounter an external link, but the spiders don't just stop and land in some random place either. They try to sew up all intersections, which is why it's a web. It can't be one bot crawling a linear path either or any number crawling linearly. Each one likely spawns off another at every external link and will crawl for an infinite number of steps unless it hits a dead end or too many internal link steps without finding an exit path.

Not only do they start crawling from high PageRank pages and high TrustRank pages, but also from the bottom-most rung too... like Pingomatic and Google Analytics data. You'd have to crawl from the bottom rung more than any other or you end up in a circle-jerk with no way out. Big sites don't link to small sites very often. The big sites are the end of the road.

[Image: diagram of a one-way link graph; every crawl path flows in toward the biggest sites]


All paths lead to Rome. You can't crawl backwards.

Every single metric EXCEPT Trust Flow works this way. Moz, Majestic, Ahrefs, and all of them do this same thing, and the reason some are more accurate than others depends entirely on the resources they are willing to spend to spider, build a link graph, and keep a giant index.

Trust Flow (and of course, this is all highly speculative but logical) starts crawling not from high page rank pages but from the most authoritative and trustworthy pages possible, such as government, educational, scientific journals, etc. Then they crawl out and measure the number of steps required to find your site, and again apply a logarithmic calculation to it. Citation Flow is Majestic's version of Page Rank. Trust Flow is their version of Google's Trust Rank.
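
If Trust Flow really does work that way, the core of it would just be a breadth-first search outward from a hand-picked seed set. A toy Python sketch (the seed list and link graph here are completely made up):

```python
from collections import deque

# Hypothetical link graph and trusted seed pages (gov/edu/journal-type sites).
graph = {
    "nih.gov": ["healthblog.com", "university.edu"],
    "university.edu": ["healthblog.com", "random-pbn.info"],
    "healthblog.com": ["my-site.com"],
    "random-pbn.info": ["my-site.com"],
    "my-site.com": [],
}
seeds = ["nih.gov", "university.edu"]

def steps_from_trust(graph, seeds):
    """BFS outward from the trusted seeds; fewer hops = more 'trust'."""
    dist = {s: 0 for s in seeds}
    queue = deque(seeds)
    while queue:
        page = queue.popleft()
        for target in graph.get(page, []):
            if target not in dist:
                dist[target] = dist[page] + 1
                queue.append(target)
    return dist

print(steps_from_trust(graph, seeds))
# my-site.com sits 2 hops from the seeds; a trust-style score would then be
# some decaying (e.g. logarithmic) function of that distance.
```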

They tell two different stories which is why they aren't comparable. We know Google also wants to measure Author Rank, but is flip-flopping on it with Google+. I'm sure Author Rank also is closely related to some kind of Social Rank too, based on social signals.
 
I can't believe people still base decisions from just a single metric.

Depending on what the intent and goal(s) are, it's typically a good idea to look at multiple metrics, when assessing the quality, stability, relevance, and safety of a particular domain or page.
 
I can't believe people still base decisions from just a single metric.

Depending on what the intent and goal(s) are, it's typically a good idea to look at multiple metrics, when assessing the quality, stability, relevance, and safety of a particular domain or page.

This!
Even though SEMrush organic traffic data is currently my favorite, since it's the best tool for filtering out penalized domains and a direct proxy for the value Google puts on that domain/page, it's not a panacea either, since it can miss poorly optimized domains/pages that still have a lot of juice.

TBH, I don't use any metrics like DA/TF and all that crap (saw someone mentioning Ahrefs Domain Rank - LMAO!). Just use direct ranking data and manually look over the backlink profile. Once you get some experience, you'll see within 30 seconds whether a link on that domain is worth it.

Just a case example: authorstream.com was once a great domain with 200k monthly traffic (mid-2010). Even though the 3rd-party metrics (DA 68, TF 60) are still great, traffic dropped to 7k. Why? Because Google penalized the domain. If you only checked these crappy 3rd-party metrics, or even the backlink profile, you'd never notice. SEMrush FTW!
 
I can't believe people still base decisions from just a single metric.

Depending on what the intent and goal(s) are, it's typically a good idea to look at multiple metrics, when assessing the quality, stability, relevance, and safety of a particular domain or page.

I fully agree with @turbin3. I use Ahrefs and Majestic, but if I am building links to a money site I 100% manually look at the backlink profile of a domain before I build/pay for a thing. Some things I look for in a good link:
  • Domain Age
  • Link Velocity
  • Anchor Text
  • Topic Relevance
  • On-Site SEO
  • Trust Signals
Also take a look at the site, does it look like a spammy PBN that has a ton of random posts that are not niche related? Does the site have all the "Big Brand" pages (Privacy policy, ToS, Contact Page, etc)?
 
I fully agree with @turbin3. I use Ahrefs and Majestic, but if I am building links to a money site I 100% manually look at the backlink profile of a domain before I build/pay for a thing. Some things I look for in a good link:
  • Domain Age
  • Link Velocity
  • Anchor Text
  • Topic Relevance
  • On-Site SEO
  • Trust Signals
Also take a look at the site, does it look like a spammy PBN that has a ton of random posts that are not niche related? Does the site have all the "Big Brand" pages (Privacy policy, ToS, Contact Page, etc)?
Question: if you had to choose between Ahrefs and Majestic, which would you go with? Guess that kind of goes against what @turbin3 said about going with one metric, but these tools are expensive lol
 
Question: if you had to choose between Ahrefs and Majestic, which would you go with? Guess that kind of goes against what @turbin3 said about going with one metric, but these tools are expensive lol

Ahrefs is way better - especially in picking up spammy backlinks from tools like GSA SER & Xrumer. In fact Majestic picks up only 10-20% of these links compared to Ahrefs.
 
Ahrefs is way better - especially in picking up spammy backlinks from tools like GSA SER & Xrumer. In fact Majestic picks up only 10-20% of these links compared to Ahrefs.

Wouldn't that make Majestic better? When you're looking up a competitor's backlinks, you don't want to spend all day sifting out forum links.

It depends on what you use it for I'm guessing.

(edit: I also think that Majestic didn't include those links in their index for a reason... but that's a total guess and I might be wrong.)
 
@Philip J. Fry You want to see the spammy, low-quality links if you're trying to determine whether you should aim for a link from that site or not. Say you're looking at BobsBroomGallery.com and PhilsBroomWorld.org. Both have similar metrics and similar-looking backlink profiles in Ahrefs. You open up Phil's site in Majestic and see that Phil has 1,000 spammy GSA links in addition to everything else. That's good to know, ya? Also, maybe those links are the reason your competitor is ranking, and Ahrefs is just going to keep you blind to that if they aren't showing them. It would be nice to know if your competitors are ranking due to a strong, long-term backlink profile or due to spam that might or might not keep them there for very long.
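
One cheap way to do that cross-check, if you export the referring-domain lists from both tools as CSVs (the file names and column name below are made up, adjust them to whatever the exports actually look like), is a simple set comparison in Python:

```python
import csv

def referring_domains(path, column="domain"):
    """Read one column of a CSV export into a set of lowercased domains."""
    with open(path, newline="") as f:
        return {row[column].strip().lower() for row in csv.DictReader(f)}

ahrefs = referring_domains("ahrefs_export.csv")
majestic = referring_domains("majestic_export.csv")

print("Only Ahrefs sees:", sorted(ahrefs - majestic)[:20])
print("Only Majestic sees:", sorted(majestic - ahrefs)[:20])
print("Both agree on:", len(ahrefs & majestic), "domains")
```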
 
Question: if you had to choose between Ahrefs and Majestic, which would you go with? Guess that kind of goes against what @turbin3 said about going with one metric, but these tools are expensive lol

The answer is, it depends. It depends a bit on what your intent is, your niche, etc. For example, for me, I'm typically not dealing in affiliate niches with my "day job", and focused more on big brand, enterprise-level stuff. For that, Majestic and Link Research Tools hands down; Majestic if I can only pick one. LRT is pricey, and may not make sense for most people doing affiliate stuff, especially if they're small fry niches. If someone is focused on a niche/business that's more SEM-focused, maybe SEMrush might make the most sense. As they say, there's not one way, but many ways.
 
I don't get this concept of...
  • Site A - 100 good links, 100 spam links
  • Site B - 100 good links
"I'd rather have a link from Site B."

I'd rather have a link from both. Those 100 good links are shielding the negative effect of the spam links and possibly turning the spam into links that give a positive effect. And that should be your goal too. Get enough good links that direct spam doesn't even matter and ends up boosting you until Google deflates it. And when they deflate it, you have enough trust that it doesn't hurt you.

Build up enough trust and authority and Google doesn't care what the rest of the internet slings at you. They know that all kinds of automation and stupidity is going on. It has to come at you at huge volumes or be "sneaky" like a PBN to hurt you. Who's going to build a PBN in order to let it get caught and hurt your site? Nobody.

The best way to get this trust and authority is to do your on-page SEO and forget about off-page for the most part. Just get out there and be a marketer. And if you do that right, you won't even care about Google because you'll be making too much money from social networks, direct traffic, and the rest. That's when Google starts to reward you. They actually are good at rewarding the sites worth rewarding. The SEO battle has always been "how do I trick Google."

The question should be "how do I get Google to reward me based on actual merit." The best way to do that is to forget about Google, unless it's Adwords.

Of course that's pretty black and white. I'm not saying that if you have a #4 ranking you shouldn't try to bump it to #3. But an off-page-SEO-focused lifestyle is becoming increasingly less valuable, and frankly a bit goofy.

Here's what happened with a project recently when I stopped building links and started earning links:

[Image: traffic graph climbing after the switch from building links to earning them]


Instead of trying to sneak my way up the SERPs for visibility, I just went and made myself visible where the people are at. You can't just drop your link on Page 1 of Google. But you CAN do that on Facebook, Reddit, Pinterest, Forums, etc. And that's when people started linking to me in droves. Instead of making 100 posts, all 500 words, blah blah, I made one optimized, killer post over the course of a few days, then I made it visible.

Marketing is King.
Content is Queen.
Social is all 4 Aces.
Money is God.
Links are Jacks.
SEO's are Jokers.

To get back to the main point, lol, if someone's willing to link to me, I want it. If it's organic, I want it. I don't care what domain it ends up on. Continue down this path and you're shielded from whether or not there are a few thousand spam links on a site that links to you. Matter of fact, you're shielded from that anyways, as long as it's not some silly network or tiered Web 2.0 link wheel.
 

I don't usually like to generalize or make absolute statements too much, but what Ryuzaki said is the goddamn truth.

I'll say from personal experience, having helped major international and even enterprise-level brands with penalty removal efforts, link building strategies, etc., there is a point of substantially diminishing returns with some old and rigid ways of thinking about links and link building. I have quite literally, manually, looked through hundreds of millions of links and metrics for them. I've done it to the point where my eyes bleed, and the metrics and the sense of the nature of a link have been incorporated into my subconscious. I'm no one special (just stubborn, persistent, or too stupid to quit lol), and it's not my intent to say that, just to preface this. Despite all of that, and despite gaining the ability to glance at spreadsheets of rows of metrics and subconsciously determine value, risk, etc. like it's second nature... I wish I had spent a more significant amount of that time and experience actually MARKETING. I wish I had spent more of that time learning to understand my target audience, their wants, their dislikes, the messaging that appeals to them, the content they prefer.

There is still a time and place for off-site SEO, in terms of link prospecting and building. That being said, things are quickly getting to the point where, if you're going to spend much time on that, you should spend it on developing a self-sustaining system that will auto/semi-automate all of the manual labor. For example, an enterprising individual might decide to learn more about SVMs (support vector machines) and machine learning in general, test a lot, and eventually come up with a method of identifying the types of links they want. After that:
  • Dump in seed lists
  • Press button
  • Collect links
Plug those links into your favorite outreach program, dump them in a list for VA's, or do whatever you do to scale all of that efficiently. Take it a step further, integrate your SVM setup with third party APIs (Majestic or whoever else) to regularly dump whatever types of links you go after on a preconfigured schedule, and automate the rest of the process. Then you can spend all of that saved time on more important endeavors, like market research, competitor research, content strategy, etc.
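
As a very rough sketch of what the classification piece might look like with scikit-learn; the features, CSV layout, and labels here are entirely hypothetical, and the real work is building a labelled training set you trust:

```python
import csv
from sklearn import svm
from sklearn.model_selection import train_test_split

# Hypothetical training data: one row per prospect domain, hand-labelled 1 (want) / 0 (skip).
# Columns assumed: tf, cf, ref_domains, topical_match, label
with open("labelled_prospects.csv", newline="") as f:
    rows = list(csv.DictReader(f))

X = [[float(r["tf"]), float(r["cf"]), float(r["ref_domains"]), float(r["topical_match"])]
     for r in rows]
y = [int(r["label"]) for r in rows]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

clf = svm.SVC(kernel="rbf")          # the SVM making the "is this a link I want?" call
clf.fit(X_train, y_train)
print("Held-out accuracy:", clf.score(X_test, y_test))

# From here: feed in fresh seed-list metrics (e.g. pulled from a third-party API),
# keep the domains the model flags, and hand them off to your outreach process.
```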
 
Question: if you had to choose between Ahrefs and Majestic, which would you go with? Guess that kind of goes against what @turbin3 said about going with one metric, but these tools are expensive lol
To be honest, I can't really say. Guess it's about time for @CCarter to come show someone how it's done :wink:
 
Without reading all of the other commentary: the reason you are seeing DA is that PageRank has not been updated and most likely never will be again, so the newest and closest substitute we can look at is Domain Authority from Moz. Since you're from the Future"rama", maybe you can help develop the next best solution?
 