
Google is obsessed with the quality of its search results. They hire world-class engineers, tweak complex algorithms, and enlist remote search engine evaluators to improve them. Google has the most robust search engine on the planet, and it’s getting better every day.
How does Google deliver such splendid results? Clearly, Google considers more than the content on the page as a relevancy signal. New websites don’t top organic search for ultra-competitive keywords, even if they have amazing content. On the other hand, Google doesn’t only rank websites (domains) either. It’s not possible to deliver reasonable search results without evaluating at the page level.
Pages and domains make up the bulk of relevancy signals. Compared to these two behemoths, subdomains are niche. So, we’ll touch on the limited impact of subdomains toward the end.
Google uses over 200 factors to determine search engine results. Each one of these factors is called a relevancy signal. Some of these factors have a large impact, and some have a minuscule impact. For some search results, Google may not actively apply all 200 factors. The exact algorithm is their secret sauce, and only high-level employees have access to it.
Most of these factors relate to pages, but many signals occur on a domain or website level too. Let’s take a look at a few examples from the Backlinko list. These aren’t necessarily the weightiest factors. We’re simply looking to see if certain signals on the list relate to pages or domains.
Keyword Order- Searches for “animals that live in cold weather” and “animals that live in cold” are virtually identical. Yet the same article shows up as the number one result for the first search and only the number two result for the second. In other words, the two queries bring up different number one websites. As you might expect, keyword order has a large effect on search engine results.
URL Length- Search Engine Journal notes that a drawn-out URL can hurt the visibility of a single web page.
Bounce Rate- Pages that everybody bounces from quickly aren’t useful. Google takes this into consideration when ranking pages.
Google Caffeine- Google prefers pages that are recently updated. The idea is info from 2016 is more relevant than info from 2006.
Number of Comments- Pages that have lots of comments are looked upon as more relevant.
Site Speed- If a page is overloaded with heavy images, Google will downgrade it. This also ties into bounce rate. If a page takes too long to load, people are going to leave.
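If you want a quick, rough read on how a page responds, a few lines of script can time a request. This is only a minimal sketch with a placeholder URL, and it measures server response plus HTML download rather than the full rendering time a visitor experiences; a dedicated tool like PageSpeed Insights gives a more complete picture.

```python
import time
import urllib.request

# Placeholder URL -- swap in a page from your own site.
URL = "https://www.example.com/"

start = time.perf_counter()
with urllib.request.urlopen(URL, timeout=10) as response:
    body = response.read()  # download the full HTML document
elapsed = time.perf_counter() - start

# Note: this captures response + download time only, not render time.
print(f"Fetched {len(body)} bytes in {elapsed:.2f} seconds")
```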
Some SEOs like to say Google ranks pages, not websites. Ultimately, this is a moot point. If you have a well-respected site, it’s easier for your pages to rank.
Let’s say you have a marketing blog. You and Kissmetrics both write an awesome marketing piece, trying to rank for the same keywords. They are probably going to rank higher than you, even if your article quality is the same. Actually, sites with a high PageRank may outrank you even if your article is better.
This is not to say websites with a high PageRank are unbeatable. The point is that an esteemed or valuable domain will give you an advantage. Here are some examples.
Domain Registration Length- “Valuable (legitimate) domains are often paid for several years in advance, while doorway (illegitimate) domains rarely are used for more than a year. Therefore, the date when a domain expires in the future can be used as a factor in predicting the legitimacy of a domain.”
Domain History- A site with a volatile domain history gets extra scrutiny. For instance, five different domain owners in the past six months will raise eyebrows at Google. Additionally, Google records the names of known spammers. If they register a new domain, their sites will be examined closely.
Country-Code Domain- If your website uses a country-code domain (such as .au for Australia), it is easier to rank in that country. On the flip side, it is harder to rank on a global scale.
Site Speed- Yes, site speed works at the website level as well. If you have slow hosting sitewide, your entire website will be penalized.
Sitemap- A sitemap helps Google index all of your pages, so it works at the domain or website level (see the sketch just after this list). Brandon from SEO Beast says sitemaps do two things for a site.
Site Reputation- Backlinko says Google uses website and customer reviews as a way to rank your website. If your brand is associated with scams, fraud, or poor customer service, that factors into your ranking.
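To make the sitemap point concrete, here is a minimal sketch that writes a bare-bones sitemap.xml for a handful of hypothetical URLs. In practice, most content management systems or SEO plugins generate this file automatically from your real page list.

```python
from xml.sax.saxutils import escape

# Hypothetical page URLs -- in practice these come from your CMS or a site crawl.
pages = [
    "https://www.example.com/",
    "https://www.example.com/services/",
    "https://www.example.com/blog/keyword-order/",
]

entries = "\n".join(f"  <url><loc>{escape(url)}</loc></url>" for url in pages)

sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{entries}\n"
    "</urlset>\n"
)

# Save the file at the site root so Google can fetch it at /sitemap.xml.
with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write(sitemap)

print(sitemap)
```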
Subdomains are relevant to any discussion about SEO. Hostgator gives an excellent definition of subdomains, if you’re unfamiliar with them.
“A subdomain is a second website, with its own unique content, but there is no new domain name. Instead, you use an existing domain name and change the www to another name. The subdomain name looks like forums.domain.com, help.domain.com, help2.domain.com (assuming you already host domain.com).”
In previous years, subdomains of the same main domain were treated as completely different properties. In other words, you could use white-hat SEO techniques on your main domain and black-hat techniques on a subdomain. Now that strategy is ineffective. Brandon from SEO Beast weighs in with an exclusive quote:
“Nowadays, Google is capable of understanding that multiple subdomains from the same main domain are not actually completely different properties. This can be advantageous to ethical SEO, as subdomain value derived from the main domain is more likely preserved. But, this can be disadvantageous to unethical SEO as webmasters have less chance of promoting a subdomain or two without potentially negatively affecting their main domain.”
Black hat SEOs keep their dubious tactics far away from their primary domains. It’s too risky. In the past, however, domains and subdomains were treated separately: even if a subdomain was penalized, the main domain was not. Since subdomains are free, that made them a cheaper option than buying multiple domains. Nowadays that strategy doesn’t work. If a subdomain is compromised, the main domain is as well.
Subdomains are still useful in certain cases, yet subfolders are typically the better choice. Brandon adds:
“Google is indifferent to either [subfolders and subdomains] and the optimal decision involves context. The pros and cons to both subdomains and subfolders are highly dependent upon the website structure and intent, but I would advocate subfolders unless subdomains truly make sense.”
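To make the structural difference concrete, here is a small sketch that splits two hypothetical URLs into host and path. A subdomain changes the host itself (blog.example.com), while a subfolder keeps the main host and only changes the path (example.com/blog).

```python
from urllib.parse import urlparse

# Hypothetical URLs illustrating the two structures.
urls = [
    "https://blog.example.com/seo-tips/",      # subdomain: a separate host under example.com
    "https://www.example.com/blog/seo-tips/",  # subfolder: a path on the main host
]

for url in urls:
    parts = urlparse(url)
    print(f"{url}\n  host: {parts.hostname}\n  path: {parts.path}")
```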
You’ll hear people say, “Google ranks pages, not websites.” Looking at it that way isn’t helpful. Google uses over 200 factors in their complicated algorithm. Most of those factors relate to pages, some relate to domains, and a few relate to subdomains. It’s vital to remember this is a moving target: if black hat SEOs exploit a relevancy signal, Google will alter its importance. Their algorithm is constantly being tweaked and redefined.
What will remain constant for the indefinite future is that Google will use a mixture of page signals, domain signals, and subdomain signals to rank websites. If you need help ranking higher in organic search, contact us here at Brandignity.
Have you ever used subdomains for SEO purposes? Leave a comment below about your experience.