I have more conversations about SEO metrics than I’d like to.
Not because I don’t love talking about SEO, but because there are so many half-baked metrics and, even more so, half-baked reasons for using them…
“But my time on site is so high”
One conversation I run into constantly has to do with top-level measures of domain/page competitiveness and authority.
Call me old school, but I do still like to know domain and page PageRank (PR)…
Because I think they are still relevant when used as a barometer.
Even though PR hasn’t been updated since December 6, 2013, I think it still offers some direction for getting a general sense of a site’s established authority.
It’s a relative measure that can be used as a directional metric. It’s not a pure-play metric that you should be getting hung up on, and it is ESPECIALLY not a metric your clients should be obsessed with – or worse, track month over month as a signal of effective SEO.
Table of Contents
- What Is a Barometer Metric
- Where There’s Ambiguity
- Another Weakness
- Using The Right Tool For The Job
- How I Use Domain Authority
- Understanding Domain Authority
What Is a Barometer Metric
- An instrument for measuring atmospheric pressure, used especially in weather forecasting.
- Something that registers or responds to fluctuations; an indicator
For the purposes of this discussion I’m speaking specifically to definition #2: something that registers or responds to fluctuations; an indicator.
This is the important distinction.
This is also where Domain Authority is powerful.
As an indicator it is great at providing relative measures to assess both rank potential and downstream link equity.
Where There’s Ambiguity
The idea of using DA to gauge continuous improvement and “effectiveness of SEO” in the short term is a recipe for disaster – where disaster means simultaneous frustration on both sides of the SEO table, for clients and SEO providers alike.
Domain Authority is updated, at best, on a monthly basis. Furthermore, it’s really important to understand what the Mozscape index contains: Mozscape is a collection of URLs that MOZ crawls and designates as fitting to be included in its index.
It IS a representative sample of trusted URLs, chosen by MOZ.
It IS NOT a comprehensive collection of all URLs that Google trusts and uses to assess link value.
For this reason Mozscape link data should not be used as the sole source to:
- Track progress on link building campaigns.
- Report on month over month changes to SERP landscapes.
- Assess link velocity and determine rank potential timelines.
What’s actually been found is that none of the link-graph tracking tools is sufficient on its own.
But when used in concert, they can start to give you a more accurate representation of what’s going on.
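To make the “used in concert” point concrete, here’s a minimal Python sketch that unions backlink exports from several tools and reports each index’s share of the combined link graph. The sample rows and the normalization are hypothetical illustrations, not any tool’s real export format.

```python
# Hedged sketch: estimating combined link coverage across indexes.
# The sample rows below are hypothetical; real exports from Moz,
# Majestic, and Ahrefs each use their own columns and formats.

def load_links(rows):
    """Normalize (source_url, target_url) rows into a deduplicated set."""
    return {(src.lower().rstrip("/"), tgt.lower().rstrip("/")) for src, tgt in rows}

moz = load_links([("http://a.com/post", "http://site.com"),
                  ("http://b.com/page", "http://site.com")])
majestic = load_links([("http://b.com/page", "http://site.com"),
                       ("http://c.com/article", "http://site.com")])
ahrefs = load_links([("http://c.com/article", "http://site.com"),
                     ("http://d.com/review", "http://site.com")])

# The union approximates the "true" link graph better than any one index.
combined = moz | majestic | ahrefs
for name, index in [("Moz", moz), ("Majestic", majestic), ("Ahrefs", ahrefs)]:
    print(f"{name} covers {len(index) / len(combined):.0%} of the combined link graph")
```

Even in this toy example, each index only sees half of the combined picture – which is exactly why relying on a single source skews your reporting.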
As reported in this sample study by Alexander Albuquerque, Mozscape had less than 10% of the total identified links when compared to Majestic and Ahrefs.
This study looked at link data from only 1,000 websites, looking at a max of 50,000 links per site for a total sample of 150,000,000 links – and while that may sound like a lot, relative to the internet, it’s not.
However, in my own personal experience I’ve seen similar patterns with MOZ missing the boat on reporting on large populations of links.
This really only becomes an issue if you’re leaning on MOZ to track progress on your link campaigns: since MOZ has such a small sample of sites, if you’re in a tangential or niche vertical not currently on their radar, new (and often powerful and important) links will not be counted or seen.
MOZ, while certainly not a static index, is not truly dynamic either, in that updates don’t happen in real time (or even near real time, for that matter).
So if you’re using MOZ for rank tracking you may be missing the boat on accurately reporting the true fluctuations that are happening in the SERPs.
Furthermore, because Mozscape does not crawl links outside its index, it cannot be used to track link velocity and decay – critical measures in assessing and scoring rank potential for target keywords.
Using The Right Tool For The Job
Building organic rankings is a process I often analogize as being very similar to constructing a building; you have the foundation, the supports, scaffolding, and then all the floors on top.
In the same way a hammer and nails wouldn’t be useful for pouring your concrete foundation, you need to use the right data to design and operate your SEO campaigns.
Once you’ve completed your keyword research and the process of compiling your competitive data to carve out your priority list – here’s how I recommend moving forward:
- Use a tool like Linkody to track new linking domains to your link building campaign targets as they come in.
- Use Serpwoo to track SERP volatility and gain insight into which rankings have the greatest day-over-day and week-over-week fluctuations. This can often operate in parallel with your other research to quickly, visually aid you in identifying opportunities to crack into new SERPs.
- Use Ahrefs to track ongoing link velocity and keep tabs on new links as they come in. As opposed to Linkody, this is a report I would run weekly and monthly for macro-level updates on link building effectiveness – and to show new LRDs as they’re acquired.
The reason I recommend doing this in conjunction with Linkody is that Ahrefs will help you build a sense of your “net new links,” whereas Linkody shows gross figures on new/lost links – some of which may drop off or get removed and never count toward your total rolling link equity.
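The gross-versus-net distinction is just set arithmetic on two LRD snapshots. Here’s a minimal sketch, with made-up domain lists standing in for the exports you’d pull from your link tool:

```python
# Hedged sketch: "net new" linking root domains between two snapshots.
# last_week / this_week are hypothetical LRD lists, not real exports.
last_week = {"a.com", "b.com", "c.com"}
this_week = {"a.com", "c.com", "d.com", "e.com"}

gained = this_week - last_week              # newly acquired LRDs (gross new)
lost = last_week - this_week                # LRDs that dropped off
net_new = len(this_week) - len(last_week)   # rolling change in total LRDs

print(f"gained={sorted(gained)} lost={sorted(lost)} net={net_new:+d}")
```

Two gained and one lost nets out to +1 – reporting only the gross “gained” figure would overstate progress.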
With all that said, it might sound like I’m hating on DA – which I promise you is not the case.
I still use DA on a daily basis, but I use it for what it’s good for; directional measurement and planning.
For tracking against the following activities DA and PA are my preferred metrics:
- Snapshot quantification of SERP-specific rank potential
- Top-level qualification of new link opportunities
- Initial evaluation of websites for acquisition
Snapshot Quantification of Rank Potential
For this example I’m going to dissect the U.S. SERP for a product query. This is non-personalized (logged out), incognito, and includes 3 PLAs and then image results, both of which I’ve removed for the purposes of this discussion.
The keyword is 22″ wheel spacers.
OK, so lots to talk about in these results – I intentionally picked a SERP that contains a curious occurrence I’ve been seeing in more and more product results, where the first option is the product on Amazon.
So if you examine the domain authority for each of these 9 results, you’ll notice that the lowest is DA 17 in position #7 (which we’ll come back to in a moment), but that there’s a DA 18 site ranking in position #3, right after YouTube.
If you look a bit closer you’ll notice that the site that’s ranking – which, based on DA alone, shouldn’t be – is the manufacturer’s site for the product listed on Amazon; more so, the domain is the manufacturer’s brand (ICHIBA).
I’ve been seeing this more and more on product SERPs, where it seems Google is giving preference to manufacturers’ product pages even when they are comparatively weaker.
If you’re wondering what those percentages are that I’ve listed for each result they are the computed “link diversity ratio,” or the percentage of links that come from unique linking root domains.
Often when I see sites ranking with much lower DA, they tend to have a much stronger ratio of linking root domains versus their competitors.
Both domains in this SERP with the lowest DA have the highest diversity ratios.
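The diversity ratio itself is simple to compute from a backlink export: count unique linking root domains and divide by total links. A minimal sketch, with hypothetical backlink URLs as sample data:

```python
from urllib.parse import urlparse

def diversity_ratio(links):
    """Share of links that come from unique linking root domains.
    `links` is a list of backlink source URLs (hypothetical sample data)."""
    domains = [urlparse(u).netloc.removeprefix("www.") for u in links]
    return len(set(domains)) / len(domains)

backlinks = [
    "https://www.forum-a.com/thread/1",  # same root domain as the next link
    "https://forum-a.com/thread/2",
    "https://blog-b.com/review",
    "https://news-c.com/story",
]
print(f"{diversity_ratio(backlinks):.0%}")  # 3 unique domains across 4 links
```

A site with 100 links from 90 domains will usually outperform one with 100 links from 10 domains, which is what these ratios surface at a glance. (Note this naive sketch treats subdomains other than `www` as separate domains; a production version would parse registrable roots properly.)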
What also strikes me about this SERP is that it seems Google is still testing the keyword intent. It does show PLAs, so G does consider this a commercial query; however, considering there is a YouTube result and 3 forum results, it seems they are still testing the potential for informational intent on this keyword.
Again, just keep in mind that the link counts we’re seeing are those driving the calculation of these sites’ DA/PA, since they are only links from within the Mozscape index.
Based on all the results above I think the rank potential for this SERP is position 3 or 4; if you had a site with a DA18+ and were willing to build even 4 links to the page I think you could easily crack this SERP.
Top-Level Qualification of New Link Opportunities
Continuing the point that domain authority is extremely useful as a barometer: in addition to flow-through metrics like Citation Flow (CF), Trust Flow (TF), and Topical Trust Flow (TTF), we still run our link building campaigns using DA as a qualification metric.
So, for instance, if we’re designing a new link building campaign for an ecommerce client with the goal of increasing net new linking root domains, the first qualifier we look at is a base DA – likely 30+.
This helps us put a value on the links and time we will spend building our target list, as well as create priority buckets for which sites and contacts should get a bit of extra love during the outreach phase.
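The qualification step is essentially a bucketing pass over your prospect list. Here’s a minimal sketch of that triage; the domains, DA values, and the DA 50 cutoff for the “priority” bucket are all hypothetical, with DA 30 as the base qualifier from the example above:

```python
# Hedged sketch: sorting outreach prospects into priority buckets by DA.
# The prospect list and the DA 50 "priority" cutoff are hypothetical;
# DA 30 is the base qualifier used in the example above.
prospects = [("alpha.com", 55), ("beta.com", 38), ("gamma.com", 31),
             ("delta.com", 24), ("epsilon.com", 12)]

buckets = {"priority": [], "standard": [], "disqualified": []}
for domain, da in prospects:
    if da >= 50:
        buckets["priority"].append(domain)      # gets extra love during outreach
    elif da >= 30:
        buckets["standard"].append(domain)      # meets the base DA qualifier
    else:
        buckets["disqualified"].append(domain)  # below the DA 30 floor

print(buckets)
```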
Initial Evaluation of Websites for Acquisition
When buying websites for the purposes of SEO there are a few considerations, depending on your plans for the site. In general the most common uses are:
- Operate the site as a stand-alone entity, maintaining its existing content, rankings, and audience,
- Fold in the website’s content, redirect the domain, and absorb all current rankings and traffic, or
- Redirect the root domain to an existing website to absorb the full range of its link and trust signals.
For the purposes of this example I’m looking at scenario #3.
MOZ claims that a 301 redirect from one domain to another passes ~90% of the redirecting site’s link equity.
But how does this affect net domain authority?
In my personal experience, I tend to see 10% to 12% of the redirected site’s domain authority transfer to the destination site; so if you’re redirecting in a DA 45 site, you might expect to see a DA boost of ~5 points.
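That rule of thumb is easy to sanity-check with arithmetic. A minimal sketch, with the caveat that the 10–12% transfer rates are my observational estimates, not any documented Moz formula:

```python
# Hedged sketch of the rule of thumb above: assume 10-12% of the
# redirected site's DA carries over to the destination. These rates
# are observational estimates, not a documented Moz formula.
def estimated_da_boost(redirected_da, low=0.10, high=0.12):
    """Return the (low, high) estimated DA gain for the destination site."""
    return redirected_da * low, redirected_da * high

lo, hi = estimated_da_boost(45)
print(f"Redirecting in a DA 45 site: expect roughly +{lo:.1f} to +{hi:.1f} DA")
```

For a DA 45 redirect that works out to a gain in the 4.5–5.4 range, consistent with the ~5-point boost above.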
DA is a logarithmic measure between 0 and 100, so one way to conceptualize it is in terms of a percentage scale, 0% to 100%, where 100% represents the most trusted websites on the internet, like Facebook, Amazon, and of course, Google.
The logarithmic nature means it’s going to get harder to increase your DA as it rises, and that these increases become exponentially harder as you move through the curve.
So it’s going to be easier, in terms of time, links, and trust, to move from a DA of 10 to 20 than it will be to go from 20 to 30.
Domain authority is a powerful, directional measure of trust and authority – but it cannot be used in a vacuum. There are other link strength and contextual signals that need to be considered to get a full picture of a domain’s ability to move the needle, rank on its own, or prop up other pages for rankings.