Google is obsessed with search engine results. It hires world-class engineers, tweaks complex algorithms, and recruits remote search engine evaluators to improve them. Google has the most robust search engine on the planet, and it’s getting better every day.
How does Google deliver such splendid results? Clearly, Google considers more than the content on the page as a relevancy signal. New websites don’t top organic search for ultra-competitive keywords, even if they have amazing content. On the other hand, Google doesn’t only rank websites (domains) either. It’s not possible to deliver reasonable search results without evaluating at the page level.
Pages and domains make up the bulk of relevancy signals. Compared to these two behemoths, subdomains are niche. So, we’ll touch on the limited impact of subdomains toward the end.
Google uses over 200 factors to determine search engine results. Each one of these factors is called a relevancy signal. Some of these factors have a large impact, and some have a minuscule impact. For some search results, Google may not actively apply all 200 factors. The exact algorithm is their secret sauce, and only high-level employees have access to it.
Most of these factors relate to pages, but many signals occur on a domain or website level too. Let’s take a look at a few examples from the Backlinko list. These aren’t necessarily the weightiest factors. We’re simply looking to see if certain signals on the list relate to pages or domains.
Those two searches are virtually identical. Yet, the highlighted article shows up as the number one article in the first search and the number two article for the second search. In other words, “Animals that live in cold weather” brings up a different number one website than “animals that live in cold.” As you might expect, keyword order has a large effect on search engine results.
URL Length- Search Engine Journal notes that a drawn-out URL can hurt the visibility of a single web page.
Bounce Rate- Pages that everybody bounces from quickly aren’t useful. Google takes this into consideration when ranking pages.
Google Caffeine- Google prefers pages that are recently updated. The idea is info from 2016 is more relevant than info from 2006.
Number of Comments- Pages that have lots of comments are looked upon as more relevant.
Site Speed- If one page is overloaded with dense images, Google will downgrade that page. This also ties into bounce rate. If a page takes too long to load, people are going to leave.
Some SEOs like to say Google ranks pages, not websites. Ultimately, this is a moot point. If you have a well-respected site, it’s easier for your pages to rank.
Let’s say you have a marketing blog. You and Kissmetrics both write an awesome marketing piece, trying to rank for the same keywords. They are probably going to rank higher than you, even if your article quality is the same. Actually, sites with a high PageRank may outrank you even if your article is better.
This is not to say websites with a high PageRank are unbeatable. The point is that an esteemed or valuable domain will give you an advantage. Here are some examples.
“Valuable (legitimate) domains are often paid for several years in advance, while doorway (illegitimate) domains rarely are used for more than a year. Therefore, the date when a domain expires in the future can be used as a factor in predicting the legitimacy of a domain”.
Domain History- A site with a volatile domain history is scrutinized. For instance, five different domain owners in the past six months will raise eyebrows at Google. Additionally, Google records the names of known spammers. If they register a domain, their sites will be closely examined.
Country Level Domain- If your website uses a country-code top-level domain (such as .au for Australia), it is easier to rank in that country. On the flip side, it is harder to rank on a global scale.
Site Speed- Yes, site speed works at the website level as well. If you have slow hosting sitewide, your entire website will be penalized.
Sitemap- A sitemap helps Google index all your pages. Therefore, it works at the domain or website level. Brandon from SEO Beast says sitemaps do two things:
Site Reputation- Backlinko says Google uses website and customer reviews as a way to rank your website. If you’re associated with scams, fraud, or poor customer service, that factors into your ranking.
Subdomains are relevant to any discussion about SEO. Hostgator gives an excellent definition of subdomains, if you’re unfamiliar with them.
“A subdomain is a second website, with its own unique content, but there is no new domain name. Instead, you use an existing domain name and change the www to another name. The subdomain name looks like forums.domain.com, help.domain.com, help2.domain.com (assuming you already host domain.com).”
In previous years, subdomains of the same main domain were treated as completely different properties. In other words, you could use white-hat SEO techniques on your main domain and black-hat techniques on your subdomain. Now that strategy is ineffective. Brandon from SEO Beast weighs in with an exclusive quote:
“Nowadays, Google is capable of understanding that multiple subdomains from the same main domain are not actually completely different properties. This can be advantageous to ethical SEO, as subdomain value derived from the main domain is more likely preserved. But, this can be disadvantageous to unethical SEO as webmasters have less chance of promoting a subdomain or two without potentially negatively affecting their main domain.”
Black-hat SEOs keep their dubious tactics far away from their primary domain. It’s too risky. However, domains and subdomains used to be treated separately: even if a subdomain was penalized, the main domain was not. Subdomains are free, so that was a cheaper solution than buying multiple domains. Nowadays that strategy doesn’t work. If the subdomain is compromised, the main domain is as well.
“Google is indifferent to either [subfolders and subdomains] and the optimal decision involves context. The pros and cons to both subdomains and subfolders are highly dependent upon the website structure and intent, but I would advocate subfolders unless subdomains truly make sense.”
You’ll hear people say, “Google ranks pages, not websites.” Looking at it that way isn’t helpful. Google uses over 200 factors in its complicated algorithm. Most of the factors relate to pages. Some of them relate to domains. A few relate to subdomains. It’s vital to remember this is a moving target. If black-hat SEOs exploit a relevancy signal, Google will alter its importance. The algorithm is constantly being tweaked and redefined.
What will remain constant for the indefinite future is that Google will use a mixture of page signals, domain signals, and subdomain signals to rank websites. If you need help ranking higher in organic search, contact us here at Brandignity.
Have you ever used subdomains for SEO purposes? Leave a comment below about your experience.
When it comes to SEO, there are two types of factors: on-page and off-page. The former is plain to see, but the latter is hidden beneath the surface. Both are equally important, and both offer invisible elements that you may not be aware of. How can you alter something you can’t see?
The answer lies in understanding how these invisible factors affect your rankings and what you can do to influence them. Today I’ll show 10 hidden factors that can hurt your rankings, and how to avoid them.
Content professionals and marketers are all aware of SEO factors like links, keywords, and content marketing. On-page factors like these are hard to forget because they’re staring you in the face. It’s the ones you don’t see that can do the most damage when starting your blog.
Just like there are positive factors, there are also negative ones that can cause you to lose rankings or worse, earn yourself a penalty from Google. In the end, we all want an SEO friendly website, so here are 10 things to avoid:
While building backlinks isn’t easy, paying for them is absolutely out of the question. Buying links has been a major no-no in Google’s eyes since the dawn of the Penguin update. Your backlinks should come from relevant and credible sources.
How to Avoid This:
When you’re building backlinks, target websites that are within your niche. Earn them through outreach or guest blogging and vary the anchor text you use and the pages you link to for a natural backlink profile.
Duplicate content is something that I myself feared like the plague when I first started working on blogs. I was terrified that something would sound like something else and bring down all the work I had done. In truth, Google’s duplicate content guidelines aren’t the terrifying declaration some would have you believe.
Duplicate content is something you should be aware of, but not something to fear. You should be mindful of the ways that it can happen off the page:
Here’s a great site to help you find duplicate content.
Don’t make the mistakes above, and keep track of where your content is and how it’s displayed. Repetition is your enemy, so don’t let it get out of hand as you spread, syndicate, and share your content.
Your website’s organization is a huge factor. While it may look nice and pretty for people, you should also consider the spiders that will be crawling your content looking for how it all fits together.
The number of internal links, where they point, and how they flow all add up to optimized internal linking, or a lack thereof.
How Can I Fix This?
Start by linking relevant pages together. This will help categorize and enrich the experience for users who are looking for more information on a specific subject. A more advanced option is to create static silo pages that direct users from your homepage to a silo, then to a category, and ultimately to a post.
This flow sends link juice through all levels of your domain, thus benefiting the website as a whole.
Broken link building has been a great way to earn backlinks since the dawn of SEO, but what about the broken links on your blog? Are they going to earn you penalties from Google? Here’s the breakdown:
404s are a perfectly normal part of the web; the Internet is always changing, new content is born, old content dies, and when it dies it (ideally) returns a 404 HTTP response code. Search engines are aware of this; we have 404 errors on our own sites, as you can see above, and we find them all over the web. In fact, we actually prefer that, when you get rid of a page on your site, you make sure that it returns a proper 404 or 410 response code (rather than a “soft 404”).
In the end, a 404 page will keep you from getting direct penalties, but what about the users? How would you feel if you thought you had found the perfect answer to your question, only to be hit with a 404 page? Not good, I imagine. You’d probably never go back to that site.
According to Sacramento Design Network, poor experiences and customer service like this can cost you up to 85% of your business!
What Should I Do?
While Google isn’t hitting your rankings for this, you’re missing out on opportunities for additional ranking by not fixing these broken links with better, more relevant content, or some kind of redirect. Take the time to fix these for yourself and for others to boost your off-page SEO.
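Finding those broken links can be automated. Below is a minimal, standard-library-only sketch: it assumes you already have a list of URLs to check (a real audit would crawl the site), and it treats 404, 410, and 5xx responses as broken. The timeout and status classification are illustrative choices, not a production crawler.

```python
import urllib.error
import urllib.request

def is_broken(status: int) -> bool:
    """Treat 404/410 (page gone) and 5xx (server error) responses as broken."""
    return status in (404, 410) or status >= 500

def check_status(url: str, timeout: int = 10) -> int:
    """Return the HTTP status code for a URL (follows redirects)."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code  # urlopen raises on 4xx/5xx; the code is on the error

def find_broken(urls):
    """Yield (url, status) for every link that looks broken."""
    for url in urls:
        status = check_status(url)
        if is_broken(status):
            yield url, status
```

Run `find_broken()` over your internal links periodically, then fix or redirect anything it flags.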
The images you upload to your content are an absolute SEO factor, but did you know that they have their own off-page elements? If you upload an image with a string of numbers for a file name and a blank alt tag, that picture isn’t doing your SEO any favors.
How to Fix This?
To help Google better understand your images, provide descriptive file names, titles, and alt tags when you place them in your content.
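Auditing this by hand gets tedious, so here is a small standard-library sketch that flags `<img>` tags with missing or empty `alt` attributes. The sample HTML and the `AltAudit` class name are hypothetical, for illustration only.

```python
from html.parser import HTMLParser

class AltAudit(HTMLParser):
    """Collect the src of every <img> that lacks useful alt text."""

    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        attrs = dict(attrs)  # attrs arrives as a list of (name, value) pairs
        if not attrs.get("alt", "").strip():
            self.missing_alt.append(attrs.get("src", "(no src)"))

html = '<img src="IMG_0042.jpg"><img src="team-photo.jpg" alt="Our marketing team">'
audit = AltAudit()
audit.feed(html)
print(audit.missing_alt)  # → ['IMG_0042.jpg']
```

Feed it your rendered page HTML and rewrite anything that lands in `missing_alt`.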
When you publish a post and it’s indexed by Google, the meta description by default is a chunk of the opening text. This doesn’t do a very good job of summarizing what’s on the page for the user or for Google.
What Should I Do?
In terms of the user experience and click-through rates, having a custom meta description will allow you to better showcase the content present on the page.
Including your keywords in the description will also showcase a good summary of what the content is supposed to be about.
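As a rough sanity check, a custom description can be validated for three things: it exists, it fits within the roughly 160 characters Google typically displays (the real cut-off is by pixel width, so treat that number as an approximation), and it contains the target keyword. The helper and sample description below are hypothetical:

```python
def check_description(desc: str, keyword: str, limit: int = 160) -> dict:
    """Flag common meta-description problems: empty, too long, keyword missing."""
    desc = desc.strip()
    return {
        "non_empty": len(desc) > 0,
        "fits_limit": len(desc) <= limit,
        "has_keyword": keyword.lower() in desc.lower(),
    }

result = check_description(
    "Learn 10 hidden SEO factors that quietly hurt your rankings, and how to fix them.",
    "SEO",
)
print(result)  # all three checks pass for this description
```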
Keyword stuffing refers to an overabundance of keywords on a page, whether it’s one repeating phrase or simply too many keywords crammed into a piece of content to the point where it’s no longer readable.
The exact threshold of what Google decides is too much is a subject of continued debate.
How Do I Fix This?
Boil down your keywords to the ones that you need and the ones that are most relevant to the topic at hand. Stick to these and use them sparingly to avoid any issues.
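One way to keep usage “sparing” is to measure what share of a page’s words are the target keyword. As noted above, there is no official Google threshold, so the function below is only a rough sanity check with illustrative inputs:

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Fraction of words in text that exactly match the keyword."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return hits / len(words)

text = "SEO tips: our SEO guide covers SEO basics and more SEO advice."
print(round(keyword_density(text, "SEO"), 2))  # → 0.33, clearly stuffed
```

A third of the words being one keyword, as in the sample, is the kind of repetition to boil down.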
The speed at which your website loads isn’t just something that affects your conversion rate. As it turns out, website speed is also a ranking factor. A slow website can spell doom for your rankings, and you may not even know it’s happening.
What to do?
Check your website’s speed with a tool like Pingdom, and if you find it lacking, start looking for ways to optimize it through various tactics like optimizing images and utilizing content delivery networks.
An XML sitemap helps Google discover your pages and understand your site’s structure. Without one in place, you could be losing out on valuable rankings.
Luckily There’s a Quick Fix:
You can use an XML sitemap generator to create the file. Once you’ve done this, head over to your Google Webmaster account to submit the sitemap directly to Google.
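If you’d rather script it than use a generator, a minimal sitemap can be built with the standard library. The URLs below are placeholders; a real generator would pull them from your CMS or a crawl, and this sketch emits only `<loc>` entries, omitting optional fields like `<lastmod>`:

```python
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Return sitemap XML (as a string) for an iterable of absolute URLs."""
    ET.register_namespace("", NS)  # serialize with a default xmlns, no prefix
    urlset = ET.Element(f"{{{NS}}}urlset")
    for url in urls:
        entry = ET.SubElement(urlset, f"{{{NS}}}url")
        ET.SubElement(entry, f"{{{NS}}}loc").text = url
    return ET.tostring(urlset, encoding="unicode")

print(build_sitemap(["https://example.com/", "https://example.com/blog/"]))
```

Save the output as `sitemap.xml` at your site root, then submit it in your Google Webmaster (Search Console) account as described above.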
Most comment systems will take care of spam for you, but it’s easy to let these kinds of things build up. These start to work against you fairly quickly.
Make sure you’re moderating your comments for spam. If you don’t have the time or the right program to maintain this, it’s best to disable comments altogether.
SEO is composed of both visible and invisible elements. Knowing what they are and how to avoid mistakes will skyrocket your SERP rankings. What invisible factors were you forgetting about? How do you ensure you’re not letting these things hurt your rankings? Let us know in the comments!
Carrie Davidson is a seasoned blogger who has helped launch numerous blogs in her online career. She is an expert in crafting excellent posts with great content and powerful headlines. You can find her online @carriedavidons1
Most site owners, webmasters and bloggers know how challenging it can be to ensure that they are only using white hat SEO methods while also strictly adhering to the current Google guidelines and recommendations. However, there is an extremely handy resource available as a PDF document from Google (thanks, Google!), which clearly lays out what is required from website owners, bloggers and webmasters to ensure that their sites are fully compliant – and remain so.
The guide’s table of contents alone is pretty awesome, and it shows how comprehensive the coverage is.
When going through this handy SEO guide, readers will see that it has been divided into sections according to the different aspects that need to be addressed. The first section provides a general overview of the current Google guidelines, covering topics such as basic internet safety, the purpose of search quality rating, how raters must represent users in the locale they are rating for, basic browser requirements and ad-blocking extensions. Virtually every aspect of site content quality and SEO practice is covered in a lot of detail.
Here’s a breakdown:
This part covers numerous topics pertaining to the quality of page content and whether the content in question serves its intended purpose. An entire section is devoted to the overall purpose and quality of a webpage in that each page on a website must have a purpose. Are the pages helpful to site visitors, or have they been set up purely with the intention of making money? Has a specific page been set up purely to harvest visitor information in order for it to be used for malicious intent (hacking)? (Hint: if any page has been set up with the intention of harming end users or solely to make money and not provide useful information, it will immediately be classified as low quality).
There has been much discussion over the past year or two about ensuring that websites are mobile-friendly, as more and more smartphone users are accessing the internet by means of these devices. One of the best ways to ensure this is to use a website template that has been tested and deemed to be fully responsive, as this will enable it to be fully functional on screens of all sizes. This aspect of web design has become such an important aspect of SEO that 20 pages of this guide have been dedicated to it. As a result, it is recommended that webmasters and site owners test the sites from as many mobile devices as possible before allowing them to ‘go live’ and be viewed online.
When it comes to ensuring that a site is responsive, the rating scale runs from Fully Meets (FullyM) down to Fails to Meet (FailsM). Very few sites receive the FullyM rating, because a site would have to cater to absolutely every mobile user and platform (which is almost impossible). A site rated Highly Meets is of great help to most mobile users. A Moderately Meets rating means a site is helpful to some mobile users, but others still require additional information. Slightly Meets means a site is helpful to fewer users, and most need more information, while FailsM means it does not meet the needs of any mobile users.
Many other rating factors are taken into consideration when sites are evaluated for adherence to the Google guidelines. These include, but are not limited to, the foreign-language, didn’t-load, hard-to-use and illegal-images categories. Sites that are not displayed in English or that have poor translations, pages that don’t load or that are too difficult to navigate, and sites containing illegal images will receive low rankings. Illegal images are usually copyrighted images or other photos that belong to someone else, while sites defined as difficult to use are usually those that have been poorly laid out.
Webmasters who may have been tearing their hair out trying to ensure that their sites are fully compliant with current white hat SEO practices and the required Google guidelines will benefit tremendously when they read through and apply the information contained in this handy step-by-step guide.