Forrester Research projects U.S. consumers will spend $327 billion online in 2016. With Google seeing approximately 7.1 billion searches per day globally* (across mobile, desktop, and tablet), and the U.S. representing roughly 10% of the world's internet users, we can estimate that every single U.S. query is worth about $1.26.
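The arithmetic behind that $1.26 figure is worth making explicit. A minimal sketch, using only the figures stated above (the spend, search volume, and U.S. share are the article's numbers, not mine):

```python
# Back-of-the-envelope value per U.S. search query, using the
# article's figures: $327B annual U.S. online spend, 7.1B Google
# searches per day globally, U.S. ~10% of the world's internet users.
us_online_spend = 327e9           # dollars per year
global_searches_per_day = 7.1e9
us_share = 0.10

us_searches_per_year = global_searches_per_day * us_share * 365
value_per_query = us_online_spend / us_searches_per_year
print(round(value_per_query, 2))  # -> 1.26
```

The sensitivity to the 10% assumption is obvious from the formula: halve the U.S. share and the per-query value doubles.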
*Update: thank you Rand Fishkin for pointing out the original stat was outdated.
A recent study by BrightEdge found that SEO drives as much as 51% of the traffic to B2B and B2C pages, reinforcing that SEO is far from dead – and continues to offer a long-term, sustainable ROI channel.
So, why are we talking about SEO rank potential? Regardless of who you need to pitch SEO to – your boss, your client, or a business partner – it will all come down to one question: how much time and money will this cost me?
Prior to launching any SEO campaign, you need to know the resources needed to achieve your business goals. That way you can realistically answer the question.
It’s generally a bad idea to pitch an SEO campaign without first researching the keywords you’re going to target and understanding who you’re competing with. If you bet the farm on unattainable keywords, you’re gonna have a bad time.
So What is Rank Potential?
Rank potential is an analytical approach to understanding where a given webpage can realistically rank in organic search, with respect to two axes of consideration: time and investment*.
*that’s my definition
It’s not realistic to project that you’re going to outrank a Wikipedia page for an informational query, or a Fortune 500 brand for their brand or branded product name – at least not without significant investment, if ever.
My approach is to analyze a page’s rank potential based on the qualitative SEO metrics of the current top 10 ranking URLs (what I will refer to as search engine results page 1, or SERP 1).
The metrics I analyze are:
- Number of Links
- Number of Linking Root Domains
- Trust Flow
- Citation Flow
- Domain Authority
- Page Authority
In addition to this core set, I also evaluate additional relative measures of authority, including Domain Age, Organic Difficulty, Link Strength, Brand Footprint, and Social Media Impact.
Now, I promise this is far from a beginner post, but to ensure you’re thinking of the base metrics the same way I do, I’m going to run through a quick-and-dirty description of each from my perspective.
My Perception of the Metrics
Number of Links
More important than the pure number of links, this metric is used in conjunction with the number of unique linking root domains to determine the link diversity ratio (LDR). Organically authoritative websites have high diversity ratios – links from many unique, authoritative root domains – versus gobs of links from the same 5 websites, likely owned by the same person.
In addition to the number of links as an input to LDR, it is also important to look at link velocity. If a site is picking up tons of links very quickly, there should be an obvious cause such as press coverage, a product launch, a new partnership, or a positive mention in a very large publication. If not, suspect that darker activities are afoot.
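One rough way to operationalize that velocity check is to flag any month whose new-link count blows past a multiple of the trailing average. This is my own illustrative sketch (the 3x spike factor is an assumption, not a figure from the text):

```python
# Hypothetical link-velocity check: flag months where new links
# spike far above the trailing average -- growth that needs an
# explanation (press, launch, partnership) or is otherwise suspect.
from statistics import mean

def velocity_flags(monthly_new_links, spike_factor=3.0):
    """Return indices of months whose link gains exceed
    spike_factor x the average of all preceding months."""
    flags = []
    for i in range(1, len(monthly_new_links)):
        baseline = mean(monthly_new_links[:i])
        if baseline and monthly_new_links[i] > spike_factor * baseline:
            flags.append(i)
    return flags

print(velocity_flags([40, 35, 50, 600, 45]))  # -> [3], the 600-link month
```

A spike on its own proves nothing; it just tells you which months deserve a closer look at where the links came from.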
Number of Linking Root Domains
As mentioned above, this is generally a sound measure of the organic authority of a website. Like everything else in SEO there are always exceptions to the rule, but websites that maintain an organic link profile should generally have anywhere from a 10–30% diversity ratio – which means 1 linking root domain for every 3.3–10 indexed links.
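The ratio itself is simple division; a minimal sketch using the 10–30% range given above (the function names are mine, and the range check is a heuristic, not a hard rule):

```python
# Link diversity ratio (LDR): unique linking root domains divided
# by total indexed links. The 10-30% band is the rough "organic"
# range described in the text.
def diversity_ratio(total_links, linking_root_domains):
    return linking_root_domains / total_links

def looks_organic(total_links, linking_root_domains, low=0.10, high=0.30):
    return low <= diversity_ratio(total_links, linking_root_domains) <= high

print(looks_organic(10_000, 2_000))  # 20% ratio -> True
print(looks_organic(10_000, 150))    # 1.5% ratio -> False, smells like a network
```

A 1.5% ratio (one domain per ~67 links) is exactly the "gobs of links from the same 5 websites" pattern described earlier.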
Trust Flow
This metric goes hand in hand with the next one, Citation Flow (CF), but for the purposes of this post I will describe it on its own. This measure, like CF, was developed by Majestic, and is a relative measure of how trustworthy a link is. Majestic has built a manually reviewed index of trusted websites and applies a general rule derived from manual analysis of a representative sample of the URLs included. The rule is based on their finding that:
“Trustworthy sites tend to link to trustworthy neighbors.” – Dixon Jones, Founder of Majestic SEO
Trust Flow (TF) is scored on a logarithmic scale from 0 to 100. In my experience, you should stay away from links with a TF under 25.
Citation Flow
Citation Flow is a predictive measure of how much influence a given URL is likely to pass on to the links it points to.
With specific respect to link juice, URLs with higher CF are more likely to pass more of that influence downstream to the URLs they link out to.
The practical application: links from pages with higher CF send a greater positive signal than their weaker alternatives (generally, anything below 15 is suspect).
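Taken together, the two thresholds above (TF under 25, CF under 15) make a simple link-prospect filter. A sketch under those assumptions – the dict fields are illustrative, not Majestic's actual API response format:

```python
# Hypothetical link-prospect filter using the thresholds from the
# text: Trust Flow under 25 or Citation Flow under 15 is suspect.
SUSPECT_TF = 25
SUSPECT_CF = 15

def worth_pursuing(link):
    return (link["trust_flow"] >= SUSPECT_TF
            and link["citation_flow"] >= SUSPECT_CF)

prospects = [
    {"url": "https://example.com/a", "trust_flow": 40, "citation_flow": 38},
    {"url": "https://example.com/b", "trust_flow": 12, "citation_flow": 50},
    {"url": "https://example.com/c", "trust_flow": 30, "citation_flow": 9},
]
keep = [p["url"] for p in prospects if worth_pursuing(p)]
print(keep)  # only example.com/a clears both bars
```

Note the second prospect fails despite a strong CF of 50 – high influence from an untrusted page is precisely the pattern the TF floor is meant to screen out.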
Domain Authority
Domain Authority is an SEO KPI created by Moz, representing a relative measure of a domain’s authoritative value on a logarithmic scale from 1 to 100. It is generally very accurate as a barometer for how *powerful* a domain is, in consideration of that domain’s link profile and citation flow. One limiting factor is that it is calculated from links contained within Moz’s web index, Mozscape.
Page Authority
The child of Domain Authority, Page Authority uses the same scale, but instead of scoring the authority of the entire domain, it is scored at the individual URL level. It is a good indication of how powerful a specific URL is within a given domain, and is a great second-tier barometer for gauging how difficult it will be to outrank a page.
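These two scores are the backbone of the SERP 1 comparison this post builds toward: stack your candidate page's DA/PA against the averages of the pages currently ranking. A rough sketch – the numbers and the simple averaging are illustrative assumptions, not a scoring model from the text:

```python
# Illustrative rank-potential comparison: how far does a candidate
# page trail the Domain/Page Authority averages of current SERP 1?
from statistics import mean

def authority_gap(candidate, serp1):
    """Positive gaps mean the candidate trails the page-1 average."""
    return {
        "da_gap": round(mean(p["da"] for p in serp1) - candidate["da"], 1),
        "pa_gap": round(mean(p["pa"] for p in serp1) - candidate["pa"], 1),
    }

serp1 = [{"da": 92, "pa": 70}, {"da": 88, "pa": 65}, {"da": 95, "pa": 80}]
gaps = authority_gap({"da": 35, "pa": 28}, serp1)
print(gaps)  # DA gap ~56.7, PA gap ~43.7 -- a long, expensive climb
```

Because both scales are logarithmic, a 50-point gap is far worse than it looks linearly – which is exactly the "unattainable keywords" trap from the intro.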
For Additional Consideration
Domain Age
This one is very obvious – it’s exactly what you think it is. How old is the domain, as in how long has it been registered? What is the history of the domain and its indexed URLs and links? How many times has it changed owners (WhoIs), IPs, or nameservers?
The big search engines use some or all of these signals as trust indicators. In addition, it is speculated that websites less than ~6 months old (think freshly registered domains) are likely to experience what is often referred to as Google Jail if they try to acquire links too quickly.
While Matt Cutts has hinted that domain age may not be a big factor, older, more established, and more trusted domains are going to have an advantage when it comes to ranking.
Brand Footprint
What I am specifically talking about here is a general sentiment analysis that can be quickly and manually run on a website’s brand name. In my experience, certain search verticals use variations of Google’s ranking algorithms to serve and rank different kinds of results.
In my very humble opinion, this is why you will sometimes see search results with a lot of rich snippets, packs of video results, and even results from review or complaint websites. In these instances it is important to consider the diversity of the results and think about the experience G is trying to provide.
If a brand (or entity) has swaths of negative press around it from specific kinds of review websites, then regardless of how weak those URLs may be, they may be harder to unhinge and outrank. I believe this also has a lot to do with the idea behind time to long click, or G’s projected measure of quality / user satisfaction.
Social Media Impact
I’ve saved the most variable metric for last – it’s also the hardest to define. What I’m looking at here, more than anything else, is what the social properties for the brand look like: how often are they updated? How popular are they? Does the brand own at least 80% of its brand SERP?
If not, what other kinds of websites are ranking? Are there competitor sites in their brand SERP? (That’s generally a good sign – it means that G does not yet see them as the more established entity for that keyword.)
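The 80% brand-SERP ownership check above is easy to run programmatically once you have the top 10 URLs for the brand query. A minimal sketch – the domains are made up, and treating "ownership" as a simple domain-suffix match is my simplification:

```python
# Quick brand-SERP ownership check: what share of the top 10
# results for a brand query live on the brand's own domain?
# The 80% threshold is the figure suggested in the text.
from urllib.parse import urlparse

def brand_serp_share(result_urls, brand_domain):
    owned = sum(1 for url in result_urls
                if urlparse(url).netloc.endswith(brand_domain))
    return owned / len(result_urls)

serp = ["https://acme.example/page%d" % i for i in range(7)] + [
    "https://reviews.example/acme",
    "https://competitor.example/",
    "https://news.example/acme-story",
]
share = brand_serp_share(serp, "acme.example")
print(share)          # 0.7 -- the brand owns 7 of 10 results
print(share >= 0.80)  # False: brand SERP not yet locked down
```

In a real audit you would also classify the non-owned results (review site? competitor? press?), since each type implies a different remediation.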
Getting Realistic with SEO
What better way to crush the dreams of all the aspiring SEOs out there than to break down just how realistic (also read: expensive) it would be to rank on some of the most coveted enterprise SERPs.
For this I’m going to analyze SERP 1 for keywords in the following 3 verticals:
- credit cards
- used cars
- home loans
For each of these verticals I’m going to run a keyword discovery report, select the 3 keywords with the highest search volume (which are not necessarily the seed keywords), and then analyze the rank potential of the SERPs based on the metrics listed above. It’s going to be a relatively rough breakdown, but my goal is to illustrate the time, money, and resources you need to invest if you’re going to crack big-money SEO.
The Keyword Discovery Process
The fastest way to generate solid keyword ideas while getting all the important metrics you need is to use a tool that does the heavy lifting for you. Fire up whatever your keyword tool of choice is; for this post I’ll be using Term Explorer, mostly because it gives me practically unlimited relevant suggestions (up to 90,000), but also because it provides all the search volume and competitive data I need to get started.
Time to Select Our Analysis Pool
Based on the above results, I’ve selected the following 9 SERPs to review for rank potential. I’m going to dissect one of the most coveted terms in search (credit cards) and then select one term from each of the other keyword sets to analyze.