

150+ Google Ranking Factors that work and are easy to implement.

In this guide, I am going to cover hundreds of Google ranking factors and how they apply to your website. I will not only list out the signals, but I will also include resources to help you diagnose your website. There is a ton of juicy content below.

If you would like to skip ahead to the Google ranking factors that apply to your website, then please use the contents below to jump to the relevant section.

 

Domain Factors


1. Domain Age & Registration.

Websites that have existed for a long time, have not swapped ownership frequently, and have an expiry date well into the future all send trustworthy signals to Google. Spammers often only register their domain for a single year, so it’s worth purchasing 2 – 3 years on your domain. This is convenient long-term and also builds trust in the domain.

WHOIS Important Dates

 

2. Domain Name.

The name of the root domain is considered a ranking factor. There’s a lot of benefit to people who choose exact or partial-match keywords as the domain name. However, I would personally suggest picking a branded name instead.

There are also a lot of benefits for big brands; you can read more about these further below.

Domain Name Breakdown

 

3. Domain History.

The history of a domain is the activity that happened on the site before you owned it. Google has said in recent years that they will reset backlinks to expired domains. This would increase the number of available domains that are not negatively affected, but stop people from building Private Blog Networks. However, recent studies suggest that Google has not done this yet. Since it makes sense for Google to move in this direction, it is hard to say whether this will still be a ranking factor in the future.

If you would like to view the history of your website, you can use the Wayback Machine. This lets you see if the domain was used for SEO purposes; you can then cross-reference with AHREFS, Majestic and SEMRush to see if the site received a penalty.

Wayback Machine Internet Archive

 

4. Public vs. Private WhoIs.

Having a public WhoIs leaves your contact details open to spam, whilst a private one hides ownership information from Google. This has been complicated by the GDPR, so this signal may be removed or adapted in the near future.

WHOIS Contact Information

 

 

Page Level Factors


5. Page Title Tags.

Including a target keyword in the title can help Google to determine what your content is relevant to. The best practice is a single page title of 40 – 65 characters that includes your target keyword as well as relevant terms.
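As a sketch of what that looks like in the HTML head – the brand and wording here are hypothetical, and the length sits within that range:

<title>20 Paleo Chicken Recipes for Quick Dinners | YourBrand</title>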

Here are some examples of websites to do with paleo diets. They all include the word “chicken” and lots of content about paleo diets and recipes. There’s room for a lot of variation in a successful page title, so don’t feel trapped.

Page Titles in the SERPs

 

6. H1, H2, H3 Tag Optimisation.

The three most important heading tags are H1, H2 and H3. The strongest weighting seems to come from the presence of H1 and H2 tags. However, there are tons of rules on how to optimise them properly. In general, you should follow these tips (a markup sketch follows the list):

  • Single H1 Tag per page, optimised for your target keyword.
  • H1 Tag is not surrounding the logo, but is actual text on the page.
  • Page Titles should be different from the H1 Tag, try to be creative.
  • Multiple H2 Tags are perfectly fine, try to avoid duplicates.
  • Include relevant keywords in your H2 tags, but don’t be repetitive.
  • H3 Tags are for headings inside each H2 tag. For example, all of these headings are H3, inside the heading Page Level Factors.
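Putting those tips together, the heading markup for this very article might look something like this:

<h1>150+ Google Ranking Factors</h1>
<h2>Page Level Factors</h2>
<h3>H1, H2, H3 Tag Optimisation</h3>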

Google Heading Optimisation

 

7. Meta Description Tags.

The meta description has not been a direct ranking factor for a long time, so it is not something you ultimately need to worry about. However, many SEOs would argue that meta descriptions can still be optimised to improve rankings.

Since Click-through Rate, Bounce Rate, and Dwell Time are often considered ranking factors, that argument makes sense. It’s really as simple as these two checks:

If your meta description encourages users to click your link – great!

If your meta description is accurate to what your page will achieve – great!

Then just throw in a keyword that you’re targeting, and you will rank in no time at all. This is one of the most overthought Google ranking factors. It’s really that simple!
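In the HTML, the meta description is a single tag in the page’s <head> – the wording below is just an illustration:

<meta name="description" content="Learn 10 simple tips to save time every day, with free advice you can put into practice immediately.">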

meta descriptions from google search for asda.com

8. Relevant Content.

The content on a page should be relevant to the purpose of that page. If your page title, heading, and meta description tell users you will provide them with 10 tips to save time, that should be the core focus of the page. It’s fine to include affiliate links here, but providing 10 products is different from 10 tips.

The way content is judged as relevant involves multiple processes. The most popular in information retrieval is called TF-IDF, which stands for Term Frequency–Inverse Document Frequency. The more often you include a word, the stronger the affinity.

This is offset by how often that word appears across your site, your niche, and the web. The more common a word is on the internet, the less power it holds. This eliminates basic words such as “the” or “and” from the equation.
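As a rough sketch of the textbook formula – where tf(t, d) is how often term t appears in document d, N is the total number of documents, and df(t) is the number of documents containing t:

tf-idf(t, d) = tf(t, d) × log( N / df(t) )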

9. Keyword Density.

To be clear, including your keyword in the document is not the same as keyword stuffing. If your page is about care services, you should be talking about the care you provide. This will naturally lead to referencing care throughout the document.

This tactic differs from keyword stuffing. Instead of looking to replace pronouns with your keyword, we’re looking to just stay on topic. This should be completely natural, and if you write for your audience it will be no problem.

I personally use the Multi-highlight Chrome extension to help with this. It can quickly highlight your terms within the page.

10. Appropriate Content Length.

You have heard that Content is King, but it’s important to balance an appropriate amount of content for your users. Since the content is just one of the ways that Google determines relevance, it can be supplemented with well-optimised headings, titles and links.

Looking at hairdressing salons, it is clear these websites include significantly less text. The same is true for many non-informational queries. Even your product pages and categories can have just 200 – 300 words of content on them.

Toni & Guy Hair Salon

11. Latent Semantic Indexing.

Latent Semantic Indexing is a singular value decomposition process (look it up), with the goal of grouping words that are related. However, most of the industry refers to synonyms and related queries as being the same thing as LSI.

Instead, you should consider the terms that disambiguate your keyword. For example, if your article is about apple, which type of apple are you referring to:

  • Apple, the technology giant.
  • Apple, the record label.
  • Apple, the fruit.
  • Apple City Hotel, the accommodation in Berlin.
  • Apple River, the town in Illinois.

This should come naturally, and doesn’t require any special tools. When talking about a hospital, I refer to nurses, doctors, physicians, appointments, etc… When talking about a cat, I talk about paws, whiskers, fur. These terms help to categorise your content, which makes it easier for Google to provide great results.

However, you can find out more about latent semantic indexing in my keyword research article.

12. LSI Keywords in Titles, Headings, and Descriptions.

Let’s take the example from above, and discuss Apple, the technology giant. The article I am writing is about the Apple Watch. However, both of these terms have broad semantic meanings. Apple could refer to many different things, and watch can be a noun or a verb.

Therefore, to help us disambiguate our article, we could change our title, headings or tags to include “time”. Consider the following meta description:

Telling you the time is just one of the many great features of the new Apple Watch. Find out how this single product will revolutionise your daily routine.

It’s a solid meta description that is short and sweet. It covers the topic of the page and gives users a hint as to its purpose. But be careful not to over-promise your content.

13. Pagespeed.

A long page load time does not indicate that a site is bad; it simply means the site takes a long time to load. In fact, sites with lots of dynamic content will load slower than others.

If you have been biting your nails and tearing your hair out trying to go from 2 seconds to 1.8 seconds to beat your competition – then I’m going to save you some time.

Page Speed is a threshold ranking factor, and likely a negative one.

That means if your page loads slower than X seconds, it’s going to incur devaluation. This means that if your site isn’t slow, you’re not going to get a big boost from reducing page speed. You can see this with Pagespeed Insights.

Google marks everything with a score above 80 as being Good.

This doesn’t mean you shouldn’t try to get closer to 100, but once you’re marked as Good – you know that Google is currently happy with your site speed.

14. Duplicate Content.

A common situation: the site owner has chosen to use WordPress, but wasn’t aware of several blog features that cause duplicated content. In WordPress there are /category/, /tag/, /author/ and /archive/ pages that include links to your articles. These are great value for users reading your blog.

However, these are not good for Google crawling your website. The pages all include the same snippets of text as your main blog pages and appear as duplicated content. To avoid this, you can quite easily add a noindex,follow meta robots tag to those pages. If your site is small, then this is the only step you need to take. However, if your site is large and you have used a lot of categories and tags, you may have an issue with crawl budget. Whilst Google will not index those pages, it will still crawl them. To solve this, write the following in your robots.txt file:

User-agent: *
Disallow: /category/
Disallow: /tag/
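For reference, the noindex,follow mentioned above is a single meta robots tag in the <head> of each affected page:

<meta name="robots" content="noindex,follow">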

This can normally be detected quite easily by using Siteliner to check for internally duplicated pages. As an example, I ran a random WordPress blog from the internet through it.

Siteliner Duplicate Content Example

15. Syndicated Content.

Google claim that syndicated content is not a problem if it’s legal and beneficial. In this case, they would judge the website based on its unique selling points. However, in my experience syndicated content can cause real issues.

For news websites that syndicate content, this may be an area that they cannot avoid. However, for most websites there are options available. For example, the most common syndication that I see is Pinterest.

Instead of treating all your product and service channels as a single channel – you should differentiate them. Furthermore, you should provide unique content that is great for each channel. A great idea for Pinterest may not look the same as Facebook. That’s not to mention your audience may be different.

16. Image Optimization.

One of the cornerstones of SEO, image optimisation is often misunderstood and left alone.

In an effort to optimise every page title, heading, meta description, and backlink – people forget to ever do their images. This is partly because WordPress does not have great facilities for bulk updating your images.

The main things to remember include:

  • Alt Text – this is what the user will see if the image is broken. It is particularly useful for visually impaired people who use screen reading software. Over-optimising this is bad SEO, but it’s also bad for user engagement. If you value your users, be descriptive.
  • File Name – this is something you really need to remember. File names matter. Period. It’s really as simple as naming your file what it is, instead of some random series of numbers and letters. There’s really no excuse, because it makes your life so much easier when browsing the media library of your own website on WordPress.
  • Title Tags – an often overlooked opportunity is to add a title attribute to your images. These are the little boxes that appear when people hover over your image. Since this is visible to people, not just robots, you will definitely need to optimise it for reading.
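Putting those together, a well-optimised image might look like this (the file name and wording are hypothetical):

<img src="golden-retriever-puppy.jpg" alt="Golden retriever puppy playing in the garden" title="Our puppy playing in the garden">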

17. The Recency of Content Updates.

Some people are sceptical about this one, but it’s one of the Google ranking factors that make sense. If your website covers a topic from 2014, and a new site covers 2018 – yours will be more relevant for unique queries that include 2014. However, the more recent one may be more relevant for broad terms.

This isn’t always the case, because great articles can last many years without needing to be refreshed. So I personally lean towards saying this is a small ranking factor. If you deliver good content, and market it properly, there’s no problem.

18. The Magnitude of Content Updates.

Updating your website will show Google that your community is still active. However, not every change is the same in scope. The bigger the changes you make, the more likely Google is to evaluate the content as fresh.

It’s my opinion that content freshness isn’t a huge ranking factor for many niches. But, there are some industries that this could be helpful. I would suggest that news websites are a great example of where freshness is important.

It’s also important to note this may just be correlation. If you’re making big changes to your website, you’re likely to see changes to your rankings. Therefore, it’s hard to decouple a freshness signal from the effect of the changes themselves.

19. Keyword Prominence.

The prominence of your keyword is important in setting the purpose of your page. If the page is about support for war veterans, then that phrase is likely to be the heading at the top of the page. This is likely why many SEOs consider above-the-fold content to be so valuable.

However, it’s important to note that prominence is not the same as frequency. You can show your user and Google that the page is about a specific topic, without having to stuff the page with that term. Therefore, just focus on making your page clear on what it’s about for the user.
Keyword Prominence

20. Outbound Links.

There’s very little data suggesting that outbound links lower your website’s authority, yet people are scared to add them. It’s a fear from the early years of search, and one that generally harms websites. A well-placed outbound link can help users.

When you have a good opportunity to link towards websites, it’s almost always a great idea to do so. If you see some value in that article, it’s worth sharing a link to help them out. Before Google started using anchors as a ranking factor – this was how engineers designed the internet to work.

21. Grammar and Spelling.

This is largely considered not to be a ranking factor by most SEOs. However, I’ve personally found that well-written content performs considerably better.

Regardless of whether grammar and spelling are important for ranking, they’re important to users.

Matt Cutts points out that pages with good grammar and spelling typically rank higher. This, he claims, is correlation, not causation. But if there’s a high correlation between good content and good results, it might become a ranking factor eventually.

 

22. Helpful Supplementary Content.

Google classifies the main content (MC) as the content that serves the purpose of the page. For example, in this article, you’re reading the main content right now. Supplementary content, by contrast, includes things like the sidebar, navigation, and breadcrumbs.

It could also include related posts at the bottom of the article or tabs with more information on an eCommerce platform. This might provide information on the height, width, and colour of your product – or it could just be reviews and comments.

Whilst the main content should fit the purpose of the page, the supplementary content should help cement it. It should show off your expertise, establish you as an authority, and build trust with your users. For me, this is about providing testimonials and case studies. They showcase the results of doing what I’m telling you, so that you can trust the advice.

23. Multimedia.

The main content does not always require multimedia, and Google has stated that there’s no requirement for it. Therefore, if you have 500 words of content, you don’t need to consider having 2 – 3 images to meet a minimum requirement.

Despite this, I’ve always found that multimedia makes for a more engaging article. Written content is my staple, but I love to provide images to support the argument, and sometimes a video to give more contextual information.

Nine times out of ten, my clients could shorten their written content. It amazes me how unwilling people are to invest in video and photography to improve their website.

24. The Number of Internal Links Pointing to Page.

Internal linking is a great ranking factor. It’s one that is easy to control, can pass relevance and authority – and it’s low effort.

In simple terms, the more internal links pointed towards a page, the easier that page is to find. This means that Google will value that page as being important for your website. However, despite this, many people read that internal links are good and try to hyperlink every other word in a paragraph.

Think about it like this:

If you have 100% authority, and 10 links, that’s a 10% weighting for each of those links. If you have 100 links instead, there’s now a 1% weighting for each link.

You want to make sure that your posts are linking to as few topically relevant pages as necessary to deliver a great experience. Phew, what a mouthful. To summarise, use links sparingly and aim to help the reader.

25. Quality of Internal Links Pointing to Page.

This is somewhat a continuation of the above point. However, there’s a twist.

Whilst including internal links on your website is great, the placement is important. This is because if you include contextual links – they mean more. Both to Google and to the user. It’s really obvious, but if you’re talking about dog food and have a link to your dog treats page – it’s something your users might also be interested in reading.

Then after that I generally weigh links in this order, best to worst:

  • Sidebar
  • Top Navigation
  • Footer

I can see arguments for swapping around the sidebar and top navigation. However, my reasoning is that sidebars often include topically relevant articles. This differs from the top navigation, which is prominent but includes links to every page.

If your sidebar includes the exact same links across every page of your website, then I would say the two are equal in worth. However, for sites that take sidebar links seriously and make them useful – those links genuinely help people navigate the website.

26. Broken Links.

Being hit with a 404 page is not useful, and kinda sucks. But it generally is not a direct Google ranking factor. This is because it’s hard to really punish a website for occasional errors, especially huge websites.

However, there are 3 indirect ways that 404s can hurt your website:

  • Broken Links – if your website has backlinks pointed towards the page that is returning a 404, you’re losing authority. If the page is permanently gone and has backlinks, you need to write a custom 301 redirect to your new page.
  • Soft 404s – if Googlebot crawls your page multiple times, each time hitting an error, it will start to ignore that page. The page may still exist, but Google will treat links to it as though they were nofollow and no longer give your website credit for them.
  • User Experience – if your website has 1 or 2 errors, it’s unlikely to hurt your user experience. If you have hundreds of them, you may be getting an unnecessarily high bounce rate. These user metrics are often considered strong ranking factors.
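For the first point, a custom 301 on an Apache server can be a single line in the .htaccess file – a sketch with hypothetical paths:

Redirect 301 /old-guide/ https://www.example.com/new-guide/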

Check out my guide on how to use Screaming Frog to help clear up your broken links.
Identifying Crawl Errors in GSC

27. Reading Level.

When it comes to readability, there are three metrics that I always use. These are:

  • Gunning-Fog
  • Automated Readability
  • Flesch-Kincaid

All three of these scores are calculated slightly differently, but my favourite is the Flesch-Kincaid score. It’s based on the number of syllables and words in each sentence. Syllables can be approximated by counting the vowels in each word; two adjacent vowels, as in “sweet”, are counted as a single syllable.
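For reference, the standard Flesch-Kincaid Grade Level formula is:

Grade Level = 0.39 × (total words ÷ total sentences) + 11.8 × (total syllables ÷ total words) − 15.59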

I personally love using the Tray Readability Tool. It’s a simple and easy-to-use Chrome extension for quickly getting these metrics.
Tray Readability Tool

28. Affiliate Links.

There are good and bad affiliate sites, and it’s usually obvious which category a website falls into – mostly from its business model.

Websites such as Skyscanner or CompareTheMarket are high-quality affiliate sites. Whilst they make their money from comparing and selling services, both are so useful they are a service in their own right.

On the other hand, Amazon affiliate sites are often just glorified link building schemes, built on Fiverr content and tons of shady backlinks. This type of site has little inherent value, so it borrows it from backlinks. These websites tend to get penalised with every Google update that comes out, no matter what strategy they use.

Whilst I have no moral objections to affiliate sites, I only work with one if it adds value. That is what users want, what Google wants, and ultimately what we as business owners should want for our products too.

29. HTML errors/W3C validation.

This is a contentious point, and most people don’t agree. But hear me out before you skip this step.

Google is a crawler that attempts to understand your page’s content and establish the authority of your website. Therefore, your goals should always be to improve the readability of your website and establish your brand.

I find that fixing HTML errors and passing W3C validation is useful for this. To check your markup, you can use the W3C Validator, run by the World Wide Web Consortium. They’re the ones that set the standards for valid markup, so their tool is reliable and generally up-to-date.

When you run your website through it, you’ll no doubt find tons of micro-problems. These are tiny things that take a robot extra effort to understand. Therefore, to improve the readability of your content – remove as many HTML errors as possible.

There are no signals for valid or invalid markup, but it removes the dependency on scripts to interpret your content. In my opinion, that’s a good thing.

W3C Validation Checker

30. Page’s PageRank.

The score given by Google’s internal PageRank algorithm is going to impact your ranking. More often than not, issues with a specific keyword or topic can be traced to page-level ranking factors.

To improve your PageRank, you should seek good quality backlinks towards that page. This will improve the page’s individual ability to rank, but also improve your site-wide ranking. Whilst factors such as targeted anchor text can improve relevance – the main goal is to acquire links.

However, you should be careful not to build too many internal links. You can find out more about analysing link distribution in my Ahrefs Guide.

31. URL Length.

Your URLs are something that you definitely want to keep short. There is tons of data showing a correlation between short URLs and higher rankings. Whether this is causation or correlation doesn’t really matter.

Use around 2 – 3 words to summarise your page, preferably using your keywords. For example, this article is about Google Ranking Factors, and so the slug matches that.

Many people fall into the trap of creating long URLs that include irrelevant or specific information. This can create bad URLs such as:

  • /google-ranking-factors-in-2018/, which is bad because it will change each year.
  • /the-best-google-ranking-factors-for-ranking-in-seo/, which is also bad because it looks awful in the SERPs.

32. URL Path.

The URL path is everything that comes after the domain name. It signals to robots and users what the file is and where it is located. This is essential for the internet to work, but URL paths can be done very badly.

Here’s an example of an imaginary website targeting car parts:

http://fakecarparts.com/car-parts-blog/spare-car-parts/reviews/the-best-shop-for-spare-car-parts.html

The problem here is that the URL path is long, stuffed, and inefficient. It’s hard for the user to read the file name in the SERPs, and all you want Google to see is that the article belongs to your blog. This could have a shortened URL path such as this:

http://fakecarparts.com/blog/best-shop-for-spare-car-parts.html

33. Keyword in URL.

It’s not likely to be a surprise for people, but including your keyword in the URL is a big ranking factor. It’s absolutely huge.

Recently a new colleague asked me a simple question.

“Is it really a big ranking factor? Really?”

So I said let’s check the SERPs and see what they have to say. I picked payday loans because I had been working on a campaign there, and the below screenshot proves the point.

If you look at the below screenshot and don’t think this is a big ranking factor – the rest of the article won’t be much use to you.

34. URL Parameters.

URL parameters are often used on websites to provide information to the server. This could be a session ID number, a page number, or a search query. There are countless ways to use URL parameters to control your content.

Therefore, it’s important to provide Google with information on how each parameter works. You can do this by logging in to webmaster tools and visiting Crawl > URL Parameters.

Once you have acknowledged this is an advanced feature, you will have the option to view your URL parameters. Next, you should click edit and a pop-up box will ask for specifications. If the parameter tracks usage, such as a session ID, then you can mark it “No: Doesn’t affect page content.”

However, if the parameter does control your page content, you will be asked to provide more information. This includes whether the parameter does one of the following:

  • Sorts the content, such as by name, brand, or price.
  • Narrows the content, such as all medium sized dresses, or all red shoes.
  • Specifies the content, such as a product number or article number.
  • Translates the content, such as between English and French, or between USD and GBP.
  • Paginates the content, such as categories or multi-part articles.
  • Other, features that are not included by any of the above, and do not track usage.
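For illustration, here’s a hypothetical URL combining several parameter types – “colour” narrows the content, “sort” sorts it, and “sessionid” only tracks usage:

https://example.com/dresses?colour=red&sort=price&sessionid=51a2b3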

URL Parameters

35. Bullets and Numbered Lists.

This is an interesting one, because adding lots of bullets and numbered lists will not dramatically improve your rankings. However, they are part of a good user experience and can be really useful.

For example, if you are trying to highlight three small and simple points, then lists are for you. It’s such a great way to provide easy to digest content that is neatly formatted.

I’m not advocating that you go overboard and place lists everywhere. However, where a list is appropriate, use either bullets or numbered lists.

If your question is when a list is appropriate, it’s when you’re listing multiple items. The clue is in the title on this one.

36. Page Age.

It’s important to maintain your content and keep it fresh. However, old pages have many benefits, including the trust that they build. Therefore, consider these three tips:

  1. Links accumulate over time; this will help with PageRank and trust for your articles.
  2. Comments at the bottom are lost each time you replace an article with a new one; keep the conversation alive for longer with old content.
  3. Refinements to your user experience take time; it’s worth constantly tweaking your content.

The combination of links, comments, and constant refinements will help to create a great piece of content. There are too many things to consider to get everything right at first launch, so it’s natural to make improvements over time.

37. User-Friendly Layout.

For a long time, Google has been able to render pages the way that users will see them. This allows Google to analyse where you’re placing your images and advertisements, as well as checking how small your font is.

These elements can lead to a good or bad experience. With mobile-first indexing, it’s even more important than ever to check for mobile usability. You can do this using Google’s Mobile Friendly Testing Tool.

Another great area to check is the mobile usability score in Google Search Console. To find this, you can log in to webmaster tools and visit Search Traffic > Mobile Usability.

38. Purposeful Content.

The purpose of the content should be a fitting match to the heading and page title. If your content is about helping people find a solution to a problem, then the main content should be helpful. If it’s an image gallery, then your main content should be the images.

It’s not important to have thousands of words of content on every page. It is, however, important to provide purposeful content that matches what you’ve told the user you will help them with.

An example of this is a page presented as a finance calculator. The content could simply be the calculator, with some supplementary content. However, if you’ve just provided an Amazon link to a calculator – this is not what the user expected. Therefore, be clear about what your page offers, and create content that is purposeful.

39. Noindex Tags.

The noindex tag is one of the strongest ranking factors a website can implement. Google is not obliged to honour the tag, but it usually does. So it’s clear that you can lose a lot of traffic by accidentally adding it to a page.

The most common use for noindex tags is on low-quality pages that you want to keep out of the index. This should be used for the non-essential category, tag, and media pages. The new Search Console design includes feedback on these pages:

Excluded Noindex Tags in Google Search Console

40. HTML Lang & HREFLang.

By default, WordPress is set to English (United States), which creates the HTML language code “en-US”. By changing this setting to the correct language variation, you can help Google see that your page is English (United Kingdom).

Include a self-referencing hreflang tag to further highlight which English variant your content targets. By default, Google will mark a missing hreflang tag with an error in Google Search Console:

Google Search Console International Targeting
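For reference, a minimal sketch of both tags together (the URL is hypothetical):

<html lang="en-GB">
<link rel="alternate" hreflang="en-gb" href="https://example.com/page/" />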

41. Canonical Tags.

Self-referencing canonical tags can help Google to see that a page is the primary version of itself. When you canonicalise another page towards it, you’re telling Google to consolidate that page’s value – including links and anchor text – into the primary version.

It’s important to understand how the canonical tags work before you implement them. Check out the Google Webmasters Blog article on Specifying Canonical Tags.
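In the page’s <head>, a self-referencing canonical is a single line (the URL is hypothetical):

<link rel="canonical" href="https://example.com/primary-page/" />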

Canonical Tags are Hints Quote

 

Site Level Factors


42. Robots.txt.

Whilst the robots.txt offers no ranking boost simply for existing, it can have a huge impact on your site. By including a disallow across the entire website, you can destroy Google’s crawling; by blocking CSS or JavaScript, you can prevent your content from being rendered; and with crawl-delay, you can limit how much some crawlers fetch (Google ignores this directive, though other search engines honour it). To me, these are the most essential parts of ranking, and blocking them has a huge impact.
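To make that concrete, here’s a sketch of the directives that cause each of these problems – you would rarely see all three together:

User-agent: *
Disallow: /              # blocks crawling of the entire site
Disallow: /wp-includes/  # can block the CSS and JavaScript needed to render pages
Crawl-delay: 10          # slows crawlers that honour the directive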

Whilst it’s not a direct ranking factor, if your whole site is performing badly – check the robots.txt. You can also test this in Google Webmaster Tools. To do this, visit Crawl > Robots.txt Tester and check that important URLs and pages are not blocked.

Robots.txt Tester in Google Search Console

43. Site Removal.

Using the URL Removal Tool to do a site-wide removal is the fastest way to lose your rankings. It’s something that I wish I had never seen, but unfortunately, it happens. On three separate occasions, I have seen a client perform a site removal by accident. This completely destabilises rankings.

If you want to check your site removal, you can do one of the following in Google Search Console:

  • Check the Google Index > Index Status tab to see how many pages are indexed.
  • Check the Google Index > Remove URLs tab to see if you have performed a site removal.
  • Perform a site search in Google to see if there are any pages.

Site Removal in Google Search Console

44. Content Provides Value and Unique Insights.

There is a need for each individual page to offer value and unique insights. However, to stand above your competition – your whole website should offer value. That could mean dedicating it to great video reviews, free advice, an online community, or consultation.

But the important thing is to carve your space within the industry. If you’re choosing to be an affiliate, then select a spokesperson to become the face of your website. Network with individuals from other industries, and have them use their insights on your website.

Try to offer content that experts would agree with, that is authoritative, and that is delivered in a way users will trust.

45. Contact Us Page.

Google is very interested in the authorship of content. There is nothing trustworthy or authoritative about John Doe, who writes all his thoughts down. However, if he has a PhD in Marine Biology, you could probably trust his expertise.

However, it’s not easy to match up anonymous names on the internet, with real people. Therefore, a contact page is a great way to reveal to your user and to Google – hey, this is who I am.

There are good reasons to be anonymous, so this isn’t a direct ranking factor. But if you can provide contact details – it’s a good way of building trust with your users.
Contact Us in Navigation

46. Domain Trust/TrustRank.

The amount that Google trusts your domain is going to play a big part in how well you rank. There are tons of factors that play into this trust and referring domains is just one of those. To get an idea, you may use third-party metrics to help understand how trustworthy your site is.

I personally like to use Domain Rating, but I often combine this with keywords and traffic. If your website is ranking well for lots of terms, it’s probably trustworthy – at least according to Google. Other great metrics to use include TrustFlow and TrustRank.

AHREFS Guide

47. Site Architecture.

The architecture of your website is the way that folders and pages are laid out on your server. This architecture typically then relates to how your website is interlinked. For example, your root folder may include the homepage and other site-wide pages.

Shopify provides a great example because everything is categorised as follows:

  • /pages/ is for all your pages.
  • /products/ is for all your products.
  • /collections/ is for all your collection of products.
  • /blogs/ is for all your collection of blog posts.

However, how you link these in your navigation will also play a part. The number of clicks from the homepage is known as ‘levels’ or ‘crawl depth’, and your main pages should be 1 click away from the homepage. This helps users to quickly navigate towards your best content.

Screaming Frog Site Structure

48. Site Updates.

Site updates often come with a monetary cost, and sometimes a short-term ranking one too. But that doesn’t make them bad for your website. In fact, an update to your website could be all you need to take your campaign to the next stage.

Here’s an example of one client of mine that did a rework. When they went live it caused a whole bunch of issues. This resulted in some lost rankings, which are gradually returning as Google re-crawls the website each day.

However, despite being down in traffic over the past 30 days, their revenue is up by 32%. This is because the update has made each user more likely to convert. For an eCommerce site, this offers a huge return on investment over time.

Organic Revenue Increase

49. Number of Pages.

To become an authority, you need to create a satisfying amount of main content. A single article on your website can have a big impact, but it does not establish you as an industry leader. Therefore, to be recognised as authoritative you will need a lot more content.

However, it’s important to note that the number of pages is not a direct signal. It’s not the case that 1000 pages are better than 100 pages. So, don’t fall into the trap of creating lots of small pieces of content to pad the numbers.

Instead, focus your efforts on creating valuable content in single topic articles. These can be interlinked, helping the user to answer their questions quickly.

Index Status in Google Search Console

50. Presence of Sitemap.

A sitemap is really easy to set up on your website, and a huge benefit for your site.

There are two types of sitemaps that you will need to know about. Both of these are recommended by Google:

  • HTML Sitemap – a user-friendly sitemap, normally placed in your footer. This should include links to all of your most important pages that users will need to find. This shouldn’t include every possible page, just the most important categories, informational pages, and posts.
  • XML Sitemap – a robot-friendly sitemap, normally placed in your root domain. This should include either a library of sitemaps or every single link on your website. If you use YoastSEO this will normally be sitemap_index.xml, page-sitemap.xml, post-sitemap.xml, and various others. Non-essential sitemaps such as post_tag-sitemap.xml can be removed.

The important thing is that a robot is only interested in the naked URL, priority, update rate, and last updated data. This is easily generated by Screaming Frog. The objective is to improve the crawl rate for the website by providing every important URL.

With an HTML Sitemap, you’re trying to provide meaningful URLs for users, these should include human-friendly anchor text. To prevent the list from being too long, you should only include the core pages, and group them into categories for the best user experience.
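As a minimal sketch, an XML sitemap with a single entry looks like this (the URL and date are hypothetical):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2018-06-01</lastmod>
    <changefreq>monthly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>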
Submitting sitemap in Google Search Console

51. Site Uptime.

It’s not likely this is a Google ranking factor by itself. Imagine the reasoning: ‘oh, the site is down very often, so let’s just penalise it.’

That’s unlikely.

However, if every time Google visits your website the server is down – there’s a problem. After a while, their algorithm treats that page as a soft 404. This means that the page may still be there, but is not returning any content.

Since a soft 404 is not good for users, it will be devalued from the SERPs and no longer be able to rank. So whilst it’s not a negative or positive ranking factor, it matters. Having a great server uptime will reduce the chance of being marked as a soft 404.

52. Server Location.

Whilst it’s a very small signal, the location of your server may help Google to understand your target audience. However, as the internet moves closer towards VPS and cloud technology, we can expect this to stop being a signal for geolocation.

 

53. SSL Certificate.

There is really no need to string this one out with a long description. This is a 100% confirmed Google ranking factor. Google announced it themselves in the “HTTPS as a ranking signal” post that they wrote. It would be silly to argue against this one.

However, more notably, how you set up your https:// protocol is going to be a contributing factor towards how well you rank. You will need to take a break and read my guide on setting up HTTPS for your website.

This guide will help you cover all the mistakes that I find from regularly working with websites.

Paragraph from Google HTTPS Article

54. Terms of Service and Privacy Pages.

If you are running any type of data collection, then a privacy policy is a legal requirement. To get this done, you can simply link to a privacy policy in the footer of your pages. This helps users to see how you’re collecting data and using it.

The terms of service page sets out how your users may use your content and products. This protects you from damages if your advice causes them any unintentional harm. It’s also great for establishing trust between yourself and your users – because there’s a clear explanation of your service conditions.

For me, this is an absolute essential for every website on the internet.

55. Breadcrumb Navigation.

The ability to navigate between pages on your website is one of the biggest ranking factors for Google. It’s an essential way to navigate for users, and important for Google to understand site structure. But it’s not often utilised.

If you’re using WordPress, then Yoast includes a breadcrumbs feature. It’s easy to set up, and it will help your users.

I’ve never seen a website lose rankings from having good navigation. So if you’re not currently using breadcrumbs on your website, it’s a safe bet to set them up. You can find more information about this on Yoast’s Guide to Implement Breadcrumbs.
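Beyond the visible links, breadcrumbs can also be marked up with schema.org structured data, which Google can use to show the trail in the SERPs. A minimal sketch with hypothetical URLs:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Home", "item": "https://example.com/" },
    { "@type": "ListItem", "position": 2, "name": "Blog", "item": "https://example.com/blog/" }
  ]
}
</script>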

56. Mobile Optimized.

This one is going to impact sites with lots of mobile traffic more than desktop-heavy ones. However, I rarely come across websites whose organic traffic is 100% desktop.

It’s quite typical for me to see websites that have large desktop audiences, and they still hold around 30 – 40% of their traffic from mobile. Other websites I have worked with include up to 75% traffic from mobile, which is a very different story.

There are lots of different things that you can do to optimise for mobile. Here’s a short list of my favourite ways to optimise:

  • Accelerated Mobile Pages – this is a movement led by Google and many other big businesses that want to deliver a better mobile experience. In a nutshell, it processes your website in a way that renders above-the-fold content faster than the rest of the page. This means that the website appears to load immediately for users. Cool technology endorsed by Google – it’s a winning formula.
  • Buttons – one thing people forget is that people have fat fingers. This is a real issue people have with using your website on mobile. Spreading your buttons out and making them large will help users to click them. It’s also something that Google recommends – so worth doing.
  • Colour Coordination – many websites think it’s okay in 2018 to get away without organising their colour palette. It’s absolutely not okay. If your fonts are a dark blue on a light blue background, I’m glad you like it – but change it to black on white. It’s easier to read.
  • Horizontal Scrolling & Zooming – this fits into responsive designs below but deserves to be stated alone. There’s no need for horizontal scrolling on mobile. Similarly, if you have to constantly zoom out and zoom in to read the content, something is horribly wrong with the user experience.
  • HTML Phone Numbers – in early 2017 I was using a Windows phone. I’m glad those days are over. However, one thing it taught me is that HTML tags for phone numbers are super useful. Instead of having to copy and paste a phone number and open up my phonebook to dial it, I want to just tap it. This is what a simple <a href="tel:"></a> tag can do for you (see the sketch after this list).
  • Large Fonts – it’s up for debate whether there’s a ranking factor for this one alone. If your website is hard to read on mobile, then it sucks. Simple. I find that 16px is a great size for people to read, and strongly advocate this font size for mobile.
  • Responsive Designs – simply put, if your website on mobile is just a small version of your website on desktop – then you need to redesign. It’s 2018, and making mobile-friendly websites is super easy with all the technology available. Make it fit mobile devices, and use a navigation style that mobile users are familiar with.
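As promised above, a minimal sketch of the responsive viewport tag and a clickable phone number – the number is made up:

<meta name="viewport" content="width=device-width, initial-scale=1">
<a href="tel:+441234567890">Call us on 01234 567890</a>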

57. YouTube.

If you’re doing video, and you want it to rank well – then YouTube is the platform you need to be on. There is a clear reason for Google to prioritise YouTube over any other platform. It’s also a great opportunity for you to add a link back to your website.

This is something that can be abused. However, I’m confident that if you do it naturally, and link towards a page that is relevant – it’s fine. Relevant in this context means a page appropriate to the video. Likewise, that video is probably appropriate to that page, so place the video there.

This will not only help funnel referral traffic to your main page, it will also improve the user experience on your target page. A double whammy.

With so many ways to create animated and short videos on a budget, it makes sense to get a few well-done videos for your YouTube channel to help improve conversions.

58. Reviews & Reputation.

This is a ranking factor that applies most heavily towards local searches. Plenty of reviews and testimonials will boost Google’s confidence in your brand. If you’re a five-star restaurant then there’s a very high chance you’ll be at the top of the SERPs.

For international businesses, this may help build trust in your website but won’t be a huge driving force behind the rankings. The latest trends suggest that Google is very interested in establishing Expertise, Authority, and Trust. Reviews help show that your website can be trusted, which is crucial for eCommerce.

 

Backlink Factors


59. Linking Domain Age.

The age of the root domain helps to establish how trustworthy that site is. When you have a highly trustworthy website with lots of authority linking towards you – that’s going to help a lot. To get the most out of this, you just need to network with other site owners. Naturally, as you meet webmasters on the internet you will find some old websites that are happy to share your content.

Checking the age of a linking domain is easy; you can do this through WhoIs checkers such as Nominet and ICANN:

Creation Date on ICANN

60. # of Linking Root Domains.

One of the strongest and most widely abused of the Google ranking factors is referring domains. It has long been accepted that 1 link from each of 100 domains is better than 100 links from 1 domain. This makes sense in theory and works in practice.

However, this is a great opportunity to remove ambiguity around some terms. These are terms I often hear used interchangeably, but they actually have very different meanings.

  • Backlink, is a single anchor from another website to your own.
  • Referring Page, is a page that links to you one or more times.
  • Referring Domain, is a root domain that links to you one or more times.

Therefore, it’s possible to have a single backlink, on a single page, across the entire domain. It is equally possible to have one link on every page, or many links on a single page. The importance of understanding this arises from the disavow file.

61. Alt Tag (for Image Links).

Since an image link has no anchor text, Google uses the alt text instead. This helps Google to understand the relationship between the two pages and the content of the image. This is great for both external and internal link building.
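As a sketch, the alt text below effectively acts as the anchor text for the link (the URLs are hypothetical):

<a href="https://example.com/dog-treats/"><img src="dog-treats.jpg" alt="Homemade dog treats"></a>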

62. Links from .edu or .gov Domains.

This generally works really well. The whole point of a citation-based algorithm is that authoritative government and education websites hold lots of power.

In recent years people have used scholarship programs to connect with universities for backlinks. There’s nothing inherently wrong with a scholarship or getting backlinks towards your website. However, if you grab hundreds of links from education pages – this can flag you up.

The same isn’t necessarily true for government websites. These tend to be a lot harder to manipulate and acquire links from. So if the Houses of Parliament are linking towards you – it’s a great sign.

63. Authority of Linking Page.

Each page has an individual PageRank, and it’s based on the referring domains towards that page. There are other ranking factors that impact the PageRank, but the most important one is the number of referring domains.

Therefore, if you want to improve your link building, you should consider Tier 2 links. This is the act of building links towards pages that already link towards you, and it has a big impact. Furthermore, it’s safe for your money site and is unlikely to hurt you if the referring page is penalised.

Tiered Link Building

64. Authority of Linking Domain.

The more authority a website has, the greater the weight it transfers to your site. This is how Domain Rating and Domain Authority work. Another great metric is TrustFlow, which measures proximity to a manually curated seed list of trustworthy websites.

I love to look at the number of keywords and traffic a site receives. This helps to see whether Google considers the website trustworthy, not just third-party metrics. If you combine this with other metrics then your analysis will be well balanced.

65. Social Shares of Referring Page.

During the past few years, Google has moved towards social proof and how it impacts trust. Websites that have a lot of shares, comments or testimonials typically perform better than those without. This creates an ideal candidate for link opportunities.

Any link from a page with a large number of social shares is likely to be a good quality link. This doubles up with the fact that well-liked content is probably also well linked. The authority of that page will pass through to your website and you’ll gain trust.

66. Links from Bad Neighborhoods.

Linking towards websites with phishing scams or malware is bad for your site. But it’s also bad for those sites to link towards you. Thankfully most of the internet is made up of pages that are not harmful, so this isn’t often a consideration.

If your website is suffering from a link-related devaluation or manual action – check for these sites. If you also have a lot of inbound spam, checking whether the spam domains share a common IP address can help to identify them all. Try to maintain a good backlink profile without pruning too much.

67. Guest Posts.

The idea of guest posting is seen as a taboo by most – but that’s because people often do it wrong. When guest posting is done properly, it’s a great way of bringing trust to both websites.

Here’s an example:

I wrote an article for Matt Diggity about site crawlability. This was a great opportunity for me to collaborate with another industry leader. It’s also great for his audience to get a fresh perspective. Therefore, it brings value to his community and helps me to build trust for my brand.

But this wouldn’t work if we let anyone and everyone guest post on our website. We need to be certain that the other person is credible because this ensures the integrity of our brands. Therefore, I encourage you to accept and seek guest post opportunities.

However, I would recommend you exercise some caution. Be sure to find reliable partners and build a strong network.

Site Crawlability on Diggity Marketing

68. Nofollow Links.

Links that are marked as Nofollow are often considered as passing no authority to another website. However, the anchor text can still be useful for Google to understand what your page is discussing. Therefore, I wouldn’t say that these links are completely irrelevant – even if they don’t bring any power to your rankings.

If you leave comments that link to your website, that’s completely fine. However, it’s important to note that the anchor text will be your name and will not pass relevance signals. For my own website, this is totally fine, because I also want to be visible when you search my name.

NoFollow Links in AHREFS

69. Link Diversity.

The links towards your website can leave a negative footprint if they are coming from the same source. If every link is from a 500-word guest post, then it will be clear you are manipulating the algorithm. To counter this negative footprint you will want to build a variety of links.

My advice on this would be to avoid link packages sold as pillow links or diversity links. These often come from irrelevant websites of a kind that genuinely top-ranking sites never accumulate. Instead, get involved with your community on forums, Reddit, Wikipedia, personal blogs and social media. This will result in links from a variety of sources.

70. “Sponsored Links” Or Other Words Around Link.

There are a lot of web pages that make clear the content is a sponsored link or guest post. This is a red flag to Google. Text that suggests to Google you paid for your link will likely result in a manual penalty if you have large quantities of them.

Here’s an example of the type of link you want to avoid having in large quantities:

Sponsored Guest Posts

71. Contextual Links.

A link that is placed inside the main content is going to provide the most power, but these can be out of context. For example, if you select any random word to pass a link, it will pass relevance signals from that anchor text. So to get the most power from your links you will want a contextual link that makes sense.

This also requires the referring page to have relevant content to your website. If the content is discussing a recipe and you sell headphones then there is little contextual relevance between the two pages. So I would suggest that you find pages that are meaningful and related to you.

72. Excessive 301 Redirects to Page.

Like anything in search engine optimisation, when something good comes around people overuse it and ruin it for everyone. This is true with using 301 redirects towards your website and pages. There is a history of people redirecting lots of websites to power up their pages.

That’s not to say a single site-wide redirect or page-level redirect is going to cause you issues. On the contrary, a good redirect can do a great deal to power up your site and boost your authority. I would simply use redirects sparingly.

73. Backlink Anchor Text.

If external websites pass backlinks towards you with targeted anchor text – you’re going to rank really well. In fact, it’s so effective that Google often polices it.

Anchor text over-optimisation is quite easy to detect, so people need to be really careful with how many targeted anchors they use. Branded anchor text is a safe bet, but doesn’t pack the same punch for your target keywords.

This is another one that Google has confirmed matters, and all my tests show that it works extremely well. I strongly recommend manipulating backlink anchor text if you want to rank effectively.

anchor text percentages

74. Country TLD of Referring Domain.

Links from the same location you are targeting may help Google to connect you with that audience. If your target audience is international, then you do not need to be as concerned about where the links come from. But there are some low-quality top-level domains that you should review:

  • .ru
  • .us
  • .cn
  • .gf
  • .ga

As always, it’s worth reviewing every referring domain regardless of the top-level domain. But in my experience, if there is spam pointed at your website, there will be more of it using that same ccTLD. This is because it’s easy for spammers to create hundreds of similar websites in no time.

CTLD Distribution in AHREFS

75. Link Location In Content.

Links that are located towards the start of the content are often believed to hold more weight than those towards the bottom. Given the value of above-the-fold content and the first paragraph; it’s not crazy to think this may be the case.

However, it’s my view that whilst it’s nice to secure important links early in the content – the difference is small. Simply having a link from another website holds significantly more power than whether the link is towards the introduction or conclusion of your content.

76. Link Location on Page.

Not every link from another website is rated the same. For example, if you’re linked in part of the main content, this is called a contextual link. However, links from the sidebar or footer would be called supplementary links.

Since the main content is the most important part of the page, it’s universally accepted as the strongest link. Between the header, footer, and sidebar – there isn’t as much difference. Therefore, the main goal should be to get contextual links.

77. Domain Relevance.

A link from a domain that is relevant to yours will help to push up your rankings. If you’re a tech company then you will want to get tech product reviews. This makes sense because the domain has established itself within that niche and industry.

These are factors that Google considers trustworthy for your own website. It therefore makes sense for a referring domain to benefit from those same ranking factors. Try to network with other websites in your industry, as this can lead to the best quality links.

78. Page Relevance.

Receiving a link from a domain that is irrelevant does not make it bad. If the content of that page is relevant and your link is contextual, then it will still benefit your website. This is why you need to focus on high-quality link building.

Typically, a link provider will buy mediocre content that is relevant to your website. But if you want to be placed at the top and stay there, you will want to take control of the content yourself. This can be really hard if you are looking to scale link building by purchasing links, so be sure to balance this with other link types.

79. Text Around Link Sentiment.

The sentiment of the content surrounding a link will impact the relevance of the link. This is something that requires good link placement and contextual links. However, when discussing sentiment it’s slightly different.

If the link is surrounded by content that suggests your website is low-quality and contains viruses, then that’s not good. You want links that have a positive sentiment about your website. The content should be singing your praises or discussing how you fulfil those needs.

80. Keyword in Title.

Earning a link from a website is always going to be great for your site – especially a website that is relevant. Another factor that will boost your rankings is keyword placement in the page title. This is a relevance signal for Google and will help them to find a relationship between pages.

This is something that private blog networks can be great for achieving. It’s an easy way to control the domain and page relevance, as well as keyword prominence. If you combine all of these with an inbound link that includes your keyword, you will see a huge result. But be careful: this type of strategy is a clear over-optimisation and needs to be tempered.

81. Link Velocity.

An increasing link velocity shows Google that you are both an active website and a growing brand. This is something that is hard to achieve through natural outreach and manual networking. It is something that you want to achieve, but it should not be forced through low-quality links.

It is much better to focus on developing your product, service and marketing than spamming your website. A steady velocity of around 10 – 12 links per month will still have a big impact, and eventually, with enough visibility, you may grow faster.

82. Linked to as Wikipedia Source.

The general public loves to use Wikipedia for citations, interesting facts, and even time-wasting games. It’s an enormous hub of knowledge and a great part of the internet. To maintain the quality, there are moderators and trustworthy users that can frequently submit edits.

If your website is used as a citation for any given topic, it will help with ranking. This can be great if you are a major organization to also have a Wikipedia page. Even better is to be associated with pages that have lots of great quality citations to trustworthy sources.

83. Backlink Age.

The longer a web page has been alive, the more trustworthy it becomes. This is especially true for evergreen content that does not lose value over time. Any page that has several years of an active community will hold a lot of value for link building.

Earning a link from a well-established piece of content that has had hours of editorial time will bag you a nice amount of trust and authority. To acquire links of this calibre, you first need to create content that is link-worthy. Pages with unique statistics and data can be great for this.

84. Links from Real Sites vs. Splogs.

Any blog designed to spam websites and improve rankings is not a great source of authority or trust. Whilst its anchor text could still help with page relevance, it’s not worth the risk. Being caught with too many low-quality sites could trigger a Penguin penalty.

It is much better to look for websites that are high-quality and open to collaboration. If they have a lot of keywords and traffic, as well as a unique selling point, they’re a great candidate. Reach out to webmasters with these types of websites.

85. Natural Link Profile.

This is often overlooked by hobbyist SEOs and can be detrimental to the site.

If you’re looking at domain metrics such as Domain Authority, Domain Rating, or Trust Flow – you’re probably not looking at the right data. For example, below is a screenshot of a domain with a high DR. The only problem is that the website has a penalty.

It doesn’t matter what the metrics tell you if the website is penalised. So it’s worth being cautious, and considering whether your backlink has a natural link profile.

manual penalty visible from ahrefs

86. Anchor Text from 301.

Whilst a well-placed site-wide redirect can boost your rankings, it also has the power to disrupt them. One of the most commonly overlooked ranking factors is the anchor text, which is how Google determines relevance for your website.

A site-wide redirect brings with it lots of branded and irrelevant anchor texts that can hurt. I’ve seen websites redirect completely unrelated domains and then stop ranking for their own brand terms. This is a strong ranking factor and can have a large impact.

87. Outbound Links.

Each page on your website holds a certain amount of authority and trust. This can be spent at little cost by linking internally and externally. Doing so will help users to navigate your website and find other relevant content that is helpful.

When we look at a backlink profile, any website that links out to a lot of websites will pass less of that authority. This is because the authority is distributed towards all those sites. The more websites that share authority, the smaller the piece of the pie you will earn.

When looking for links that are high-quality, you want the page to share its authority with no more than around five other websites. This is especially important when choosing tier 2 link opportunities. You want your website to benefit the most, not your competition.
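To put a rough number on that, here is a small sketch that counts how many external hostnames a prospective page links out to. The URL is a placeholder, the five-host threshold simply mirrors the rule of thumb above, and it relies on the third-party requests and beautifulsoup4 packages.

```python
# A rough sketch that counts the external hostnames a page links to,
# as a quick filter for tier 2 link prospects. The URL is a placeholder
# and the threshold of five mirrors the rule of thumb above.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urlparse

def external_hosts(url: str) -> set:
    html = requests.get(url, timeout=10).text
    own_host = urlparse(url).netloc
    hosts = set()
    for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
        host = urlparse(a["href"]).netloc
        if host and host != own_host:
            hosts.add(host)
    return hosts

hosts = external_hosts("https://example.com/some-article/")
print(f"{len(hosts)} external hosts: {sorted(hosts)}")
if len(hosts) > 5:
    print("This page already shares its authority widely.")
```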

88. Forum Profile Links.

Links on user profiles are often picked up and accounted for in the algorithm. However, the value of these links is severely diminished. They carry very little weight, but do pass relevance signals towards the chosen page.

Abusing these is rarely a great idea and you should only do this if you’re actively engaged in an online community. A great example would be Reddit. This community is active and there are hourly updates, so being present here is great for linking.

89. Thin Content & Low-Quality Backlinks.

There is unlikely to be a rule that content needs to have a certain length. This is something that Google has publicly stated regarding your own content, so it seems unlikely that a minimum content requirement is in place for backlinks.

If you are looking for high-quality backlinks, you will want to be providing value to the user. This will mean relevant domains, relevant pages, with optimised page titles, headings and content. The more authority that page can attract, the better it is for you.

With all of that in mind, aiming to collaborate and build a great content strategy is going to be helpful. A lot of the top tier websites will have networked and met people within their industry. This provides them with an opportunity to give all their users great content.

90. Quality of Linking Content.

Content that has lots of misspellings and grammatical faults is rarely considered high-quality. If you want to get the best quality backlinks, then you need to make sure the content is checked through a tool such as Grammarly. This will help fix the most basic issues.

Higher quality websites will also expect lots of editorial work. It should take hours of labour to carefully choose your message. Getting links from authoritative sources will require a much higher standard of work than link sellers demand.

91. Site-wide Links.

In the past, earning a lot of links from a single website was a huge boost to rankings. During this period it was common to see people manipulate site-wide links with low-quality results. Tools such as GSA were used to spam websites.

Since then, Google has changed tactics. They figured that it is easier to earn 1,000 links on a single website than 1 link from 1,000 websites. This means that site-wide links lost most of the power they used to have, but they can still impact your rankings.

Many people that have site-wide links also choose to optimise the anchor text. This quickly lands them a penalty and stops it from working. However, sites such as WordPress or Colorlib thrive on their site-wide links.

Take a look at how many referring pages Colorlib has compared to their referring domains:

Backlink Profile for Colorlib

User Interaction


92. CTR for a Single Keyword.

The user metrics are considered very important in the top 10 positions of Google. Specifically, the click-through rate is considered one of the strongest ranking factors. It’s a good sign that users like your page title and meta description. If the users are then happy with the content they find, this will help your rankings for that keyword.

The main thing to remember is that you’re serving the user with content. If your content does not match the expectation that you set – it will backfire. Therefore, I would suggest a page title that is focused on keywords you actually help the user with.

93. CTR for All Keywords.

Whilst Google stopped passing through Analytics data for keywords that users click – it doesn’t mean they stopped tracking it.

If you regularly get clicks for all your target keywords, it’s fairly safe to say that Google will see strong interaction with your site. Some people would suggest that Google can’t track this, but I think they can.

Google uses a JavaScript redirect from the SERPs onto the pages, so there’s no doubt in my mind they can parse the search query and timestamp.

94. Bounce Rate.

Since RankBrain came out, there’s been an increased focus on bounce rate as a ranking factor. If you’re always focused on improving your website then this should be easy to optimise. Aside from offering users a better experience, UX design and conversion rate optimisation can help with revenue.

Here’s an example of one of my clients that recently improved their bounce rate, adding a lot of profit month-on-month:

eCommerce SEO

95. Direct Traffic.

Since Google can collect data from Android, Chrome, and Google Search, it is clear they can track direct traffic. These users offer Google a great insight into whether people are regularly participating in your online community. It’s a strong sign that your brand is well-known and trustworthy.

96. Repeat Traffic.

Users that repeatedly visit your website offer Google an insight into the trust of your content. This is something that big brands benefit from the most, especially companies like Amazon. A site like that will never drop out of the results, because users always visit it directly.

Whilst direct traffic is a great sign that your brand is well-known and trustworthy, a repeat user is more-so. They convey that not only was the content good once, but it’s worth revisiting time and time again. Bookmarks can be particularly great for this long-term.

97. Number of Comments.

Any community that is active will often include comments, reviews, or other feedback mechanisms. This helps Google to see that the page is alive and that people are engaging and interacting with the content. Anything that conveys trust is great for Google.

Not only do comments provide hints of a community, they also give you unique content and insight. Questions and answers that were not originally covered can be explored on your page. These types of conversations can be great for long-tail keywords.

98. Dwell Time.

Dwell Time is defined as the time that a user spends on your website before returning to search results. This can include reading your content, visiting other pages, watching videos – all of it. This differs from Time Spent on Page.

With a metric such as Time Spent on Page, the user could visit anything else and it would be cut short. But Dwell Time rewards you for engaging the users. Getting them to interact with your site will show a great user experience and improve rankings.

 

Special Algorithm Rules


99. Query Deserves Freshness.

If a query is about the latest news, events, tickets or future information – it triggers this special algorithm rule. When a query deserves freshness, Google will use trustworthy websites that have been recently updated and accurately reflect recent events.

If you are a ticket vendor and you create a page for a band in town, this will stop being relevant shortly after that event. Holding onto that page will give you very little value long-term, so it is often worth removing those pages when they stop receiving traffic or searches.

100. Query Deserves Diversity.

Some queries deserve a diverse set of results, and “Guinness” is a perfect example of this. The keyword returns an assortment of different answers in any given week and constantly changes based on recent events. Here are some of the most popular SERP features:

  • Wikipedia box providing information on the Irish stout.
  • Question box to answer questions about whether Guinness is safe to drink.
  • Map Pack to show local businesses that include Guinness in their name.
  • Video box for instructions content on how to pour, as well as old ads.
  • Recipe items for food that includes Guinness in the instructions.
  • News features for when a Guinness world record is broken.

This is a great example of how lots of user queries can all stem from a single broad search query.

Guinness Wikipedia information

101. User Browsing History.

This is not a part of the core algorithm but applies to users struggling to find what they are looking for. There’s really no way to manipulate this to your advantage – but you should be aware that browsing history is part of the algorithm.

102. User Search History.

Similar to the above, the search history of a user can impact what they find. For example, if a user searches ‘cheap flights’, and then five minutes later searches ‘south of France’ – it’s fairly safe to assume they’re looking for a holiday.

This may influence the algorithm slightly to push results that are associated with both the phrase ‘cheap flights’ and ‘south of France’. This essentially improves the relevance of the search results for a single user based on what they have recently searched.

Whilst this is something that Google is definitely doing, its importance is very low. The level of processing required is not justified by the difference it brings in results. I would not expect this to be large scale for 1 – 2 years at least.

103. Localised Content.

The geolocation of your server will slightly impact whom Google considers your target audience. However, there are stronger signals for Google to analyse: for example, the target location in Google Search Console, and your top-level domain.

These signals have a big impact on your position in the search results. As an example, if your top-level domain is .co.uk, then it will rank poorly in most queries from other countries. However, it is possible to rank in those audiences.

Therefore, it’s important to check any location signals you are passing to Google. You can review your target location in Google Search Console. To do this, navigate through the menu and find Search Traffic > International Targeting. For most people, this will default to the United States.

If you would like to change your location, then you can select a different target in the drop down. However, if you would like to specify no target location, then you should select “Unlisted”.

Targeting a location in Google Search Console

104. Safe Search.

This is an easy one that people often forget. If a user selects safe searching, then all the adult-related content that might be relevant is left out. Again, there’s not really any way to manipulate this, except to avoid having your website classified as adult entertainment.

If you’re running a website around pornography or escort services, then safe search will work against you. If you’re selling girl scout cookies, this is probably a ranking factor to ignore.

105. Google+ Circles.

Google likes to use biased results to personalise your searches. If you’re participating in Google+ circles to follow brands, they may choose to favour these sites. This will be different for each user and cannot be manipulated. Creating a Google+ profile for your business is one way to help users follow you.

106. DMCA Complaints.

When content has received numerous DMCA complaints, it will be withheld from the SERPs. This is to make sure that the original authors and sources are being credited for the work. It therefore overrides and supersedes the other ranking factors.

Having great quality content and strong link profiles will not work if your content was copied. It is always best to use original content and only syndicate content that you are legally allowed to share. For example, product descriptions from a manufacturer.

107. Domain Diversity.

Google likes to provide diversity in their search results. To check this, search in Google for any non-branded keywords. The results are mostly diversified with very few queries that include the same domain multiple times.

This helps the user by providing domain diversity. Increasing your relevance or authority will not overcome this special algorithm rule. However, if you want to dominate the search results, it is worth considering multiple brands.

108. Transaction Searches.

When Google detects that a query is focused on making a transaction, it eliminates top-of-the-funnel pages from the search results. Questions about who, what, where, when and why are less relevant at this stage of the user intent.

Instead, Google will show preference towards pages that are transactional. This will typically show category pages to provide users more choice, but it can also provide product pages to help users purchase quickly.

109. Local Searches.

A local search will return very different results from a broad search. In the past, this only applied to keywords that included “near me” or “in London”. Recently, the algorithm has become a lot better at detecting whether a query is local.

When a query is considered localised, the results will have increased emphasis on geolocation and reviews. Other information such as your name, address, and phone number can help. These queries have a very different set of ranking factors.

Map Packs for Local Searches

110. Google News Box.

When something important has happened and is distributed around the news, it can prompt a Google News Box at the top of the SERPs. This is a temporary feature that shows up until public interest has died down.

For example, if there’s a product recall that gets media attention, then queries around that product may also include this featured box. This is to help protect users from making a potentially dangerous purchase or bad decision.

News Pack

111. Big Brand Bias.

Google shows a preference for ranking big brands over smaller ones. They do this by putting less emphasis on relevance and authority signals, and more emphasis on trust signals. The result is that the #1 position is often less optimised.

This is considered a special algorithm rule because it doesn’t apply to all websites, queries, or positions. Attempting to manipulate this is unlikely to yield much reward, so your focus should be on improving your content, product and service.

112. Shopping Results.

The way that Google handles transactional queries is to push their shopping results towards the top and sides of the page. These shopping result boxes can help users to click through the paid link instead of the organic result.

These types of queries are going to provide fewer clicks than informational queries with a similar volume. This is because a lot of the traffic will go through the shopping results. In these instances, a combination of PPC and SEO can get the most profit.

113. Image Results.

Some searches call for Google to return images instead of text. These types of queries will supersede the trust and authority signals of your content and favour images. Showing up in these searches is easy to do.

Upload images to your website with useful filenames. You should include alt text for each image and you may wish to include title text too. Generate an Image Sitemap and submit this to Google Search Console. These images will then have a high chance of being indexed.

Image packs in Google Search
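If you want to generate a basic image sitemap yourself, here is a minimal sketch. The page-to-image mapping is a hypothetical placeholder; in practice you would pull it from your CMS or a crawl of your site.

```python
# A minimal sketch that writes a Google image sitemap. The page/image
# pairs below are hypothetical placeholders -- pull the real ones from
# your CMS or a crawl of your site.
pages = {
    "https://example.com/sample-page/": [
        "https://example.com/images/sample-image.jpg",
    ],
}

lines = [
    '<?xml version="1.0" encoding="UTF-8"?>',
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"',
    '        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">',
]
for page, images in pages.items():
    lines.append(f"  <url>\n    <loc>{page}</loc>")
    for img in images:
        lines.append(
            f"    <image:image>\n      <image:loc>{img}</image:loc>\n    </image:image>"
        )
    lines.append("  </url>")
lines.append("</urlset>")

with open("image-sitemap.xml", "w", encoding="utf-8") as f:
    f.write("\n".join(lines))
```

Once generated, upload the file to your site and submit it through Google Search Console alongside your regular sitemap.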

114. Single Site Results for Brands.

When you search for a brand name, the results will often be dominated by that brand’s name. This typically includes links to their homepage, contact us, about us, or core pages.

These are often followed by map packs and links to social media. Here’s an example:

Nike Brand Single Search Results

 

Brand Signals


115. Branded Anchor Text.

Anchor text that is branded can help Google to see an affinity between your website and a brand name. This is why so many people like to use exact match keywords for their domain name. It helps them in the short term to rank for one specific keyword.

The problem with this method is that users may change their buying behaviour. Then your domain is no longer able to serve their needs and you will have to start completely from scratch. Choosing a brand name allows you to pivot over time and match your user’s behaviour.

116. Branded Searches.

If users are searching for products and include your brand, this signals to Google that you’re relevant to that term. They then boost your rankings as a result of being trustworthy. This works like direct traffic and users that return to your site.

The more people that search for your brand – the better.

117. Site Has Facebook Page and Likes.

There is no excuse for not having a company Facebook page in 2018. Any major business or brand should be running a social media campaign. It helps to establish trust and loyalty towards your brand, and these customers cost less to win back for repeat visits.

118. Site has Twitter Profile with Followers.

Twitter is another great social media platform for a company to engage with users. It serves a very different audience from Facebook, so it’s worth having a Twitter account. The police force in the United Kingdom uses Twitter to give local community updates.

119. Official Linkedin Company Page.

If you have a company that employs people, you should consider setting up a LinkedIn account. It’s more professional than Facebook and Twitter, so it pulls in a very different crowd of people to network with and potentially hire.

120. Employees Listed at Linkedin.

This helps your colleagues to build up their resume and look for jobs later in their career, but also shows to Google that you’re a real company. There’s no need to force colleagues to create a LinkedIn account, but most of them probably have one.

121. Legitimacy of Social Media Accounts.

The legitimacy of your social media account can be determined by Google using their Fake Users in Online Social Networks Patent. This looks at the type of followers you have, those you are following, frequency of posting and content on your wall.

122. Brand Mentions on News Sites.

Being mentioned or linked to from local media can have a great impact on your local search results. If you’re a big brand that frequently shows up in the national tabloids, you will likely also have lots of authority towards your website.

123. Business Directories.

There are lots of business directories that have been abused by search optimisers. However, some of the higher quality directories are too expensive for most people to abuse. Being listed in these directories can help with referral traffic and building your brand.

One great example of a specialised directory that is high-quality is Treatwell. It’s designed to help users find local salons and book appointments.

124. Physical Store Location. 

The Local Search Results are dominated by websites that provide their name, address, and phone number. A physical store can be marked up with Postal Address Schema and set up on Google My Business.
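As a rough illustration, the sketch below prints LocalBusiness markup with a nested PostalAddress. Every business detail is a placeholder; the output belongs in a script tag of type application/ld+json on your contact page.

```python
# A rough sketch of LocalBusiness markup with a PostalAddress.
# All business details are placeholders -- swap in your own, and paste
# the output into a <script type="application/ld+json"> tag.
import json

markup = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Salon",            # placeholder
    "telephone": "+44 20 7946 0000",    # placeholder
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "1 Example Street",
        "addressLocality": "London",
        "postalCode": "EC1A 1AA",
        "addressCountry": "GB",
    },
}

print(json.dumps(markup, indent=2))
```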

125. Tax Registration.

If your business is recognised by the Government and includes a tax registration, this is a big sign of trust to users and robots. Some companies take their tax policy so seriously that they create an entire page dedicated towards it.

 

On-Site WebSpam Factors


126. Panda Penalty.

The Panda algorithm looks at pages that are thin in content, duplicate, or low quality. It does this by analysing how similar multiple pages are.

A pretty common cause for Panda can be found on each of the major CMS platforms. WordPress generates lots of /tag/, /author/, and /category/ pages.

These are often handled through Yoast SEO, the most popular SEO plugin for WordPress. By noindexing these pages, you can quickly reduce the amount of duplicate content on your site.

Shopify creates lots of tag pages and, if you don’t change the Liquid theme files, duplicate product pages. One of the big issues with Shopify is /collections/collection-handle/products/product-handle URLs, which are often referenced throughout the website. Instead, you can change these links to always use /products/product-handle/.

This simple change can essentially halve the size of your website and improve crawl efficiency, reducing keyword cannibalisation at the same time.
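As a rough check of how much duplication you have, the sketch below maps collection-scoped product URLs onto their canonical /products/ form. The URL list is a placeholder for your own crawl or sitemap export.

```python
# A rough sketch of the Shopify duplication check described above:
# map collection-scoped product URLs onto their canonical /products/
# form. The URL list is a placeholder for your own crawl export.
import re

urls = [
    "https://example.com/collections/sale/products/blue-widget",
    "https://example.com/products/blue-widget",
    "https://example.com/products/red-widget",
]

for url in urls:
    canonical = re.sub(r"/collections/[^/]+/products/", "/products/", url)
    if canonical != url:
        print(f"duplicate: {url}\n        -> {canonical}")
```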

127. Links to Bad Neighborhoods.

Linking towards phishing sites, malware sites, or other harmful content can hurt your rankings. These bad neighbourhoods are designed to be harmful and hurt the user. It’s no surprise that Google is very protective of their users.

To keep your website clean, regularly review your external links. Doing so will help you catch client errors and redirects, and make sure the content you link to is still fresh and relevant. It also helps avoid situations where competitors buy expired domains to improve their rankings.

128. Redirects.

When dealing with external websites pointing towards you, redirects are quite normal. It’s not possible for you to keep every page you ever created live at the same URL forever.

Naturally, when a page is gone or moved, then you will add a 301 redirect from the old page to the most relevant new page. This improves the internet because people can find relevant content without hitting 404 pages. So this is positive.

However, when dealing on-site, it is possible to handle your own redirects. For example, if you move your own article, you can change your internal links to point directly at the new page.

Likewise, if you point externally to another website and they change their URL, you should check to make sure that this is a good redirect. You don’t want to be linking to a playground website, only to find out it’s been repurposed. I’ve seen church websites turned into sexual enhancement blog networks – you’ve got to be careful where your redirects take people.
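A periodic automated check makes this painless. Here is a small sketch that follows each outbound link and flags anything that redirects or errors; the link list is a placeholder for your own, and it uses the third-party requests package.

```python
# A small sketch that reviews where your external links end up, so a
# moved or repurposed page gets spotted early. The URL list is a
# placeholder for your own outbound links.
import requests

outbound_links = [
    "https://example.com/old-resource/",
]

for url in outbound_links:
    try:
        r = requests.get(url, timeout=10, allow_redirects=True)
        if r.url != url or r.status_code >= 400:
            print(f"{url} -> {r.url} ({r.status_code})  <-- review")
        else:
            print(f"{url} OK ({r.status_code})")
    except requests.RequestException as exc:
        print(f"{url} failed: {exc}  <-- review")
```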

129. Popups or Distracting Ads.

Nobody loves popups, full-screen interstitials or large advertisements in the middle of the text, but all of these are great ways to monetize your page. Search engines understand this, and they do not hurt you simply for including advertisements.

However, some advertisements can incur a devaluation of your page, and if those ads are placed on every page, it can become a site-wide devaluation. Avoiding this is simple: make sure it is easy to close the advertisements, and avoid distracting from your content.

130. Page Over-Optimization.

It is important to highlight to Google what your page is discussing. This is done through many signals but one of the most important is your keywords in the text. This can be in the page title, heading, anchors and text.

When you include this keyword too many times, the quality of the content deteriorates. You focus on optimising for search engines and not for users. This type of over-optimisation can result in decreased rankings. So it is important to focus purely on user experience.

131. Site Over-Optimization.

To prevent over-optimization at a site-wide level, avoid stuffing keywords into every page. Targeting multiple pages at the same keyword is a fast way to hurt every one of them. This creates internal conflict and competition, and destroys your site.

Another problem is using too many external links to compensate for low-quality content. The typical symptoms are building too many links towards internal pages, with lots of targeted anchors. This type of over-optimisation will quickly cause penalties.

manual penalty visible from ahrefs

132. Ads Above the Fold.

Google commits to offering valuable content in the SERPs. This means that any get-rich-quick scheme is likely not putting users first. To combat this, Google really hates pages with adverts above the fold; it’s looking for content that puts the user first.

However, this does not mean that Google hates advertisements. On the contrary, the rater guidelines are sympathetic to the fact that good content needs funding. There are websites that wouldn’t otherwise exist without making money.

But when this money-making comes at the expense of the user, Google intervenes. If you have a manual webspam issue, you should check the placement of your ads.

133. Cloaking Content.

Any type of cloaking can be bad, but it’s particularly bad when you try cloaking advertisements. Google is happy for users to monetize the page, but it should not distract from the purpose of the page. If you’re hiding content – you should be making a change.

The four most common ways to cloak or hide content include the following (a quick detection sketch follows the list):

  • CSS display:none.
  • HTTP Headers to serve different content to Google user-agents.
  • iFrames to hide content from Google’s crawler.
  • JavaScript to hide content from users.
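A crude first-pass check is to fetch the same URL as a browser and as Googlebot and compare the responses, as in the sketch below. The URL is a placeholder, a real check would also need JavaScript rendering, and a large size difference is a clue rather than proof.

```python
# A crude cloaking check: fetch the same URL with a browser user-agent
# and a Googlebot user-agent, then compare response sizes. A large
# difference is a clue worth manual review, not proof of cloaking.
import requests

URL = "https://example.com/"  # placeholder
BROWSER = {"User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"}
GOOGLEBOT = {
    "User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; "
                  "+http://www.google.com/bot.html)"
}

as_browser = requests.get(URL, headers=BROWSER, timeout=10).text
as_googlebot = requests.get(URL, headers=GOOGLEBOT, timeout=10).text

print(f"browser:   {len(as_browser)} bytes")
print(f"googlebot: {len(as_googlebot)} bytes")
if abs(len(as_browser) - len(as_googlebot)) > 0.2 * max(len(as_browser), 1):
    print("Responses differ significantly -- possible cloaking, review manually.")
```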

134. Affiliate Sites.

Websites that use affiliate links are often focused primarily on making money, and not serving the user with high-quality content. This is in opposition to what Google values, and so these pages are put under extra scrutiny.

If it appears you’re trying too hard to funnel users to a single page, or have too many affiliate links in general – then it won’t end well. Instead, try to focus on industries you’re proud to be associated with and products you enjoy.

Creating unique and original content around those items can benefit the user. Since I have a degree specialising in Music Technology, creating a guide on microphones would benefit users. These types of content are what Google values.

135. Autogenerated Content.

Google has stated they are not fans of Automatically Generated Content, and it is part of their quality guidelines. To make sure that you’re getting the most out of your content, you should avoid using any software that writes for you.

One example of a service to avoid is Articoolo. This software generates content that can be used for link building or on-site. When you read the output, it’s easy to tell that it was written by a machine.

Articoolo Logo

136. Excess PageRank Sculpting.

Sculpting the internal and external links towards your website is a great way to highlight the most important pages. The benefits include increased relevance and authority signals, but doing so can cause issues.

To avoid over-optimised PageRank sculpting, you need to maintain a balance between the home page and internal pages. The most naturally linked page on your website is the home page. When there are significantly more links to internal pages than to the home page, it can cause issues.

This is particularly true for external PageRank sculpting, and not so much for internal. Whilst it’s still good to link towards your homepage the most, it’s not as essential. To benefit from internal links, try to use contextual links.

137. IP Address Flagged as Spam.

It is rare that Google will flag an individual IP address as poor quality, but it can happen. In this case, every website on that IP address is likely to be negatively impacted. The chances this has happened to you are very slim.

But if none of these other ranking factors apply and you’re on cheap, shared hosting, it’s worth upgrading to a premium plan. The move will grant you a fresh IP address, and there may be site speed benefits too.

138. Meta Tag Spamming.

Search engines used to use meta information to understand websites. These are still used to provide semantic information to Google, but are no longer a ranking factor. Google announced that Meta Keywords are Not Used in Web Ranking.

Despite this information being available for decades, it’s still common to find websites that add meta keywords. A lot of people assume it won’t do any harm, but it also won’t do any good. If Google suspects you’re cheating, it may hurt rankings.
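Auditing for leftover meta keywords takes only a few lines. The sketch below flags the tag across a list of pages; the URLs are placeholders and it relies on the third-party requests and beautifulsoup4 packages.

```python
# A short sketch that flags leftover meta keywords tags on a list of
# pages. The URLs are placeholders for your own site's pages.
import requests
from bs4 import BeautifulSoup

for url in ["https://example.com/", "https://example.com/about/"]:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    tag = soup.find("meta", attrs={"name": "keywords"})
    if tag:
        print(f"{url}: meta keywords found -> {tag.get('content', '')!r}")
    else:
        print(f"{url}: clean")
```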

 

Off Page Webspam Factors


139. Unnatural Influx of Links.

A large influx of links can sometimes be considered unnatural or spammy. When those websites are manipulating the algorithm, or they are over-optimised, this can signal foul play to Google. It’s not something that most people need to worry about.

It is quite common for content to go viral and be shared widely in a short-term burst. This type of success is usually covered in newspapers and syndicated around the web to generate a buzz. So if your product or service suddenly becomes very popular, this is something that Google can account for.

Unnatural Influx of Links

140. Penguin Penalty.

This is an algorithmic penalty that seriously harms your ability to rank. It updates regularly and is identified by over-optimised backlinks.

If you have been using press releases, guest post services, private blog networks, link wheels, link trading, or even just spammy outreach campaigns – you’re a potential target for Penguin.

It’s fairly easy to pick up by looking in AHREFS or Webmaster Tools. But here are my tips on how to find Penguin Penalties:

  • Anchor Text – if you check the top 100 anchors in Google Search Console and find lots of targeted anchor text, there’s a chance that you will receive Penguin if you haven’t already. This isn’t saying that targeted anchor text is bad, but when a single keyword shows up in 25% of all your anchors, you may have overcooked it.
  • Internal Linking – it goes without saying that because this is such a strong ranking factor, it’s heavily policed. Having internal links to your website is GREAT for rankings. But just like eating cake, too much is really bad for you. This can be viewed in Search Console by looking at your most linked pages. Naturally, your homepage should be the most linked page.
  • Low-Quality Backlinks – this is one that is covered across the internet in lots of detail. Needless to say, if you have tons of porn, sex, drugs, and rock’n’roll sites pointing towards you – then these are not great citations. Unless of course, your website is about porn, sex, drugs, and rock’n’roll.

141. Linking Domain Relevancy.

This is a phrase that will come up quite a lot, and it’s really hard to quantify. Majestic took a crack at it with their manual classification of the internet. However, something about that classification doesn’t sit right with me.

I think that link domain relevance is important, but how Google calculates this is not clear. Though there are a few things we can expect to be checked. The more of these six points the linking website has, the more relevant the domain is likely to be:

  • Anchors – the anchor is relevant to your page or domain, and useful for users.
  • Content – the content that links towards you is the same topic as the page it links towards.
  • Directory – the directory includes your link with other industry related websites.
  • Domain – the domains that link towards you cover the same topic that your website does.
  • Language – the language of the website that links towards you is the same as yours.
  • Resource – the website that links towards you is using your website as a citation.

Whilst it’s possible to have good domains that do not meet all of these requirements, if you look out for these six points you’re safe.

142. Unnatural Links Warning.

There are two types of unnatural links warning. The first impacts websites linking towards you: Google suggests you disavow or remove them, but hasn’t penalised you site-wide. The more severe version is the site-wide penalty.

To check this warning, you can go to Search Traffic > Manual Actions. From here, you can see if the penalty is site-wide or partial.

Regardless of what level of penalty you received, this warning is bad news and will be hurting your rankings. If you have this notice, you will need to fix the problem and submit a reconsideration request.

Unnatural Links Warning in Google Search Console

143. Links from the Same Server.

If you have large volumes of links from the same server, Google may consider this link manipulation. If you’re reaching out and sharing content with real site owners, this should never be a problem; it’s only a concern for the grey-hat and black-hat community.

Google can trace your website to the same server through various methods. If you read my article on Why C-Block IP Address Doesn’t Matter, you will find out how. A lot of these features are built into the internet, so until IPv6, they’re sticking with us.
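If you want a rough look at your own profile, the sketch below resolves referring domains and groups them by /24 subnet; clusters from one subnet deserve a closer look. The domain list is a placeholder for your own backlink export, and since shared hosting is common, overlap is not automatically suspicious.

```python
# A simple sketch that resolves referring domains and groups them by
# /24 subnet. Clusters of links from one subnet are worth a closer
# look, although shared hosting makes some overlap perfectly normal.
import socket
from collections import defaultdict

referring_domains = ["example.com", "example.org", "example.net"]  # placeholder

by_subnet = defaultdict(list)
for domain in referring_domains:
    try:
        ip = socket.gethostbyname(domain)
    except socket.gaierror:
        continue  # domain no longer resolves
    subnet = ".".join(ip.split(".")[:3]) + ".0/24"
    by_subnet[subnet].append(domain)

for subnet, domains in sorted(by_subnet.items()):
    if len(domains) > 1:
        print(f"{subnet}: {domains}  <-- same subnet")
```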

144. “Poison” Anchor Text.

Some anchor text is considered more toxic towards your website than others. Affiliations with anchor text such as “viagra”, “free rolex watch”, or “nigerian scammer” can harm your results. This is the same for any irrelevant anchor text, but these are particularly poisonous.

To fix this, you’ll need to perform a backlink check and clean your link profile. Removing any links with those anchors can quickly help to improve your rankings.

145. Manual Penalty.

There are a lot of people who think Google does not use manual penalties any more. This is simply not true, and it is misleading.

At the time of writing, there are 11 common ways to get a manual penalty, as outlined by Google’s Webmaster Manual Actions report.

These 11 common reasons for a penalty include, but are not necessarily limited to the following:

  1. Hacked Website
  2. User-Generated Spam
  3. Spammy Freehosts
  4. Spammy Structured Markup
  5. Unnatural Links to Your Site
  6. Thin Content with Little or No Added Value
  7. Cloaking and/or Sneaky Redirects
  8. Unnatural Links from Your Site
  9. Pure Spam
  10. Cloaked Images
  11. Hidden Text and/or Keyword Stuffing

The two that stand out as most interesting are unnatural links to your website and unnatural links from your website. It shows that Google differentiates between the two. It’s possible for your website to be penalised for linking out to poor quality websites.

The most unlikely is the keyword stuffing review, which likely dates back to the days of meta keywords. For this reason, my on-site audits always suggest removing meta keywords from client websites. They have no place in 2018 and are just pure spam.

146. Selling Links.

If your website is caught selling links, such as sponsored posts, it can cause a huge devaluation in your rankings. Recent updates have seen Google treat those links as nofollow, so they pass no authority whilst the website itself remains indexed.

It’s good practice to keep any negotiations about editorial fees to private communication. If the majority of your content is sponsored posts or guest posts, it’s a clear sign that you’re selling links and not focused on creating original content.

147. Link Latency.

There are times when a brand new website appears and suddenly has a large influx of links. These websites could be microsites for commercials or election campaigns. To prevent people from manipulating the SERPs, there’s a ranking latency.

This latency is often referred to as the “Google Sandbox” effect. I’m not a fan of the term, because this affects a very small number of websites. There is, however, a latency between a link being detected and your site being evaluated.

148. Google Dance.

There is a Google patent that people often refer to as the Google Dance. The algorithm changes the position of a document within the index to determine whether it deserves a higher or lower value. This constant movement is where the name comes from.

This is designed for testing pages in the top 10, but it can often be applied when foul play is detected. In this case, the best result is to continue as normal and not make drastic changes. It’s common to see changes in rankings for up to 20 days before stabilisation.

149. Disavow Tool.

This tool is so widely misused that it causes me great pain just talking about it. There are two camps that have formed on the disavow, and I disagree with both of them:

  • Never use the disavow tool, it gives Google too much information.
  • Always use the disavow tool, it gives Google vital information.

My approach is to use the disavow tool sometimes. I will always remove low-quality spam that I know is not positive, to help with anchor text analysis and optimisation. However, most spammy websites have little or no impact on your rankings, so a crusade to clean this daily will just waste time.

For me, the best approach is to remove spam at the start of a campaign, monitor for large influxes of junk, and only disavow when necessary.
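When a disavow is warranted, the file format itself is simple: # comments, bare URLs for single pages, and domain: lines for whole domains. Here is a minimal sketch that writes one; the entries are placeholders for spam you have actually verified.

```python
# A minimal sketch that writes a disavow file in the format Google's
# disavow tool accepts: "#" comments, bare URLs, and "domain:" lines.
# The entries below are placeholders for spam you have verified.
spam_domains = ["spam-network-example.com"]
spam_urls = ["https://another-example.com/spammy-page/"]

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("# Disavow file generated after a manual link review\n")
    for url in spam_urls:
        f.write(url + "\n")
    for domain in spam_domains:
        f.write(f"domain:{domain}\n")
```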

150. Reconsideration Request.

If your website is hit by a penalty, a reconsideration request can often lead to improved rankings. From my observations, once you’ve submitted a reconsideration request, rankings will typically increase slightly. Then the penalty is lifted and huge improvements in rankings frequently happen overnight.

Here’s an example of a manual link penalty that I recently lifted:

Unnatural Links Penalty Removed Graph

151. Temporary Linking.

There are a lot of people that use temporary linking in an attempt to help or hurt rankings. The most common strategy I see is scraping content, getting it indexed, and then returning a 404 error. If the website is big enough, it will take months for Google to remove the duplicate content.

The other tactic, more commonly used for negative SEO, is pornographic content: get it indexed with links back towards the target website, and then change the content back. It will take a considerable amount of time before Google recrawls this content.

The main attraction of the 404s or reverted content is that when you analyse the site, it looks fine. It may even appear to have no links pointing towards your website. This is why you should always check the cached version from Google.

 

Categorizing the Google Ranking Factors


152. Expertise.

Google is interested in identifying authors that have expertise. This includes people with degrees, years of experience, unique content and reliable information. Awards are also a great sign of expertise and can be marked up with the award schema.

153. Authority.

There are two common definitions for authority. The first and most popular definition is that more links are equivalent to more authority. This is why metrics such as Domain Authority exist to help you measure your authority.

The second and less commonly used definition is content authority. Including internal links towards other pages on the same topic will improve your authority. Not only will it help to show you are authoritative in the niche, it also provides relevance signals.

154. Trust.

Trust is something that Google is very interested in, as well as your users. But demonstrating trust to Google and users can be completely different.

To show your user that you are trustworthy, include lots of research, testimonials, reviews, comments, awards and results. Showing off your success will help the user to see that you’re an expert they can trust.

The second way to show trust is to be mentioned by highly trustworthy sources. Local newspapers and national publications can be a great source of trust for your users. Including social media accounts can help them trust your brand too.

155. Relevance.

The most important and most misunderstood of the Google categories is relevance. The SEO industry is still in its infancy and unlike other forms of marketing, the goal is manipulating algorithms instead of pursuing customers.

Being relevant means answering users’ queries and providing a clear purpose on your page. You want to deliver the user value, and make this transaction about improving their life – not making yourself rich. With content marketing at the moment, this is rarely the case.

156. Purpose.

Every page has a purpose, and it’s never the same between two websites. Google recognises your website’s purpose and evaluates you differently based on what you are trying to achieve. Two great examples are Wikipedia and YouTube.

Wikipedia is an open platform for users to contribute their knowledge and expertise into an encyclopedia of information. Most of the pages include thousands of words of content, images, and citations to trustworthy sources.

YouTube is an open platform for users to share ideas and their content creations. Each page has very little written content, and the main purpose is to share videos on topics. There’s no need for Google to treat both websites the same, and it doesn’t.

157. Freshness.

Users love to read fresh content that is current. Having the most up-to-date and helpful information is a big part of the recent Google updates. It’s about making sure that you have a content strategy in place to help with freshness.

If you check out my Google Trends Guide, you will see that users love seasonal trends. Thousands of users search for popular products, wish lists and recent reviews. This means constantly updating your content and providing the most recent answers to old questions.

 

My name is Rowan Collins, and I am an SEO Specialist based in London. I started SEO back in 2016 after moving from an eCommerce company to an agency. Since then, I have enjoyed years of experience on websites from a plethora of niches. I pride myself in my Christian beliefs and focus on helping others to improve at digital marketing.
