Brian Dean at Backlinko recently put together a list of the 200 ranking factors Google uses. He noted that some are proven, others are controversial, and some are SEO nerd speculation.
Dean broke his list into nine categories: Domain Factors, Page-Level Factors, Site-Level Factors, Backlink Factors, User Interaction, Special Google Algorithm Rules, Brand Signals, On-Site Webspam Factors, and Off-Site Webspam Factors.
Google’s Matt Cutts said in a video that domain age is not all that important a concern, as “The difference between a domain that’s six months old versus one year old is really not that big at all.” Having a keyword in the domain name does not provide the SEO boost that it used to, but it still acts as a relevancy signal.
A domain that starts with a target keyword has an edge over a website that either does not have that keyword in its domain or has the keyword in the middle or end of its domain.
Google’s patent “Information Retrieval Based on Historical Data,” filed 03/31/2005 and dissected on the WebmasterWorld forum, fueled speculation that Google does look at domain registration (1) and renewal (2) dates:
(1) The date that a domain with which a document is registered may be used as an indication of the inception date of the document.
(2) Certain signals may be used to distinguish between illegitimate and legitimate domains. … Valuable (legitimate) domains are often paid for several years in advance, while doorway (illegitimate) domains rarely are used for more than a year. Therefore, the date when a domain expires in the future can be used as a factor in predicting the legitimacy of a domain and, thus, the documents associated therewith.
Moz’s expert panel agreed that a keyword appearing in the subdomain can boost rankings. A website with volatile ownership or several drops could tell Google to “reset” the website’s domain history, negating links pointing to the domain. In some cases, a penalized domain may carry the penalty over to the new owner. Exact Match Domains could still give a slight edge, but an EMD for a low-quality site could leave it vulnerable to the EMD update.
Going back to Google’s Matt Cutts: “…When I checked the whois on them, they all had “WhoIs privacy protection service” on them. That’s relatively unusual. …Having WhoIs privacy turned on isn’t automatically bad, but once you get several of these factors all together, you’re often talking about a very different type of webmaster than the fellow who just has a single site or so.”
If Google identifies a particular penalized WhoIs owner as a spammer, Google likely would scrutinize other websites owned by that person. Having a Country Code Top-Level Domain (.cn, .pt, .ca) can help the site rank for that particular country but can limit the website’s ability to rank globally.
Having a keyword in a title tag remains an important on-page SEO signal. According to Moz, title tags that start with a keyword tend to perform better than title tags with the keyword towards the end.
While Google does not use the meta description tag as a direct ranking signal, a description tag can still impact click-through rate, which is a key ranking factor. So a keyword in the description tag can still be important.
It can also be valuable for keywords to appear in H1 tags. The H1 tag is a “second title tag” that Google uses as a secondary relevancy signal.
TF-IDF stands for “Term Frequency — Inverse Document Frequency”. It is a technique for quantifying the importance of a word in a document: each word is assigned a weight that reflects how often it appears in that document and how rare it is across the corpus. TF-IDF is widely used in Information Retrieval and Text Mining, and Google likely uses a sophisticated version of it.
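The classic textbook TF-IDF weighting can be sketched in a few lines of Python. This is the standard formula only, not Google's actual implementation, and the tiny tokenized corpus is purely illustrative:

```python
import math
from collections import Counter

def tf_idf(corpus):
    """Classic TF-IDF: corpus is a list of documents, each a list of
    lowercase word tokens. Returns one {word: weight} dict per document."""
    n_docs = len(corpus)
    # Document frequency: in how many documents does each word appear?
    df = Counter()
    for doc in corpus:
        df.update(set(doc))

    weights = []
    for doc in corpus:
        tf = Counter(doc)
        weights.append({
            # term frequency * inverse document frequency
            word: (count / len(doc)) * math.log(n_docs / df[word])
            for word, count in tf.items()
        })
    return weights

docs = [
    "seo ranking factors guide".split(),
    "seo backlink factors".split(),
    "cooking pasta guide".split(),
]
w = tf_idf(docs)
# "seo" appears in 2 of the 3 documents, so in the first document it
# weighs less than "ranking", which is unique to that document.
```

The intuition matches the prose above: a word common across the whole corpus ("the", "and") carries little signal, while a word concentrated in one document signals what that document is about.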
Content with more words can cover a wider breadth and is likely preferable in the algorithm to shorter, superficial articles. Indeed, a Backlinko study found that content length correlated with SERP position.
Using a linked table of contents can help Google better understand your page’s content. It can also result in sitelinks. Keyword density is not as important as it once was, but going overboard with it can still hurt you.
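Keyword density itself is simple to measure: occurrences of the keyword divided by the total word count. A quick sketch (the function name and the naive whitespace tokenizer are my own illustration; Google publishes no target density):

```python
def keyword_density(text, keyword):
    """Percentage of words in `text` that exactly match `keyword`,
    case-insensitively. Naive whitespace tokenization; punctuation
    attached to a word will prevent a match."""
    words = text.lower().split()
    if not words:
        return 0.0
    return 100.0 * words.count(keyword.lower()) / len(words)

sample = "SEO tips: good SEO starts with content readers actually want"
density = keyword_density(sample, "seo")  # 2 of 10 words -> 20.0
```

A density that high on a real page would be an obvious red flag; the point of the check is to catch yourself stuffing, not to hit a magic number.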
Latent Semantic Indexing (LSI) keywords in content help search engines extract meaning from words that have more than one meaning. LSI keywords in page meta tags probably also help Google discern between words with multiple potential meanings.
There’s a known correlation between depth of topic coverage and Google rankings. Therefore, pages that cover every angle likely have an edge vs. pages that only cover a topic partially.
Both Google and Bing use page speed as a ranking factor. Search engine spiders can estimate your site speed fairly accurately based on your page’s HTML code. Google also uses Chrome user data to get a better handle on a page’s loading time.
While not a direct Google ranking factor, AMP may be a requirement to rank in the mobile version of the Google News Carousel. Does a page’s content match the “entity” that a user is searching for? If so, that page may get a rankings boost for that keyword.
Thanks to Hummingbird, Google can now better understand the topic of a webpage. Identical content on the same website (even slightly modified) can negatively influence a website’s search engine visibility. When used properly, use of the Rel=Canonical tag can prevent Google from penalizing your website for duplicate content.
Images send search engines important relevancy signals through their file name, alt text, title, description and caption, so image optimization is important. Google Caffeine update favors recently published or updated content, especially for time-sensitive searches. Highlighting this factor’s importance, Google shows the date of a page’s last update for certain pages.
The significance of edits and changes also serves as a freshness factor. Adding or removing entire sections is more significant than switching around the order of a few words or fixing a typo. Frequency of page updates also plays a role in freshness.
Having a keyword appear in the first 100 words of a page’s content is correlated to first page Google rankings. Having your keyword appear as a subheading in H2 or H3 format may be another relevancy signal.
Many SEOs think that linking out to authority sites helps send trust signals to Google, and this is backed up by a recent industry study. Google may use the content of the pages you link to as a relevancy signal. So keep the theme of your outbound links in mind to ensure accuracy.
Proper grammar and spelling is a quality signal. Content that is scraped or copied from an already-indexed page will not rank as well or may not get indexed at all.
Websites that mobile users can easily use may have an edge in Google’s “Mobile-first Index.” Mobilegeddon is a name for Google’s search engine algorithm update of April 21, 2015 that gave priority to websites that display well on smartphones and other mobile devices.
Hidden content on mobile devices may not get indexed (or may not be weighed as heavily) compared to fully visible content. According to a now-public Google Rater Guidelines Document, helpful supplementary content is an indicator of a page’s quality (and therefore, Google ranking).
Content hidden behind tabs may not be indexed. Too many dofollow outbound links (OBLs) can “leak” PageRank, which can hurt that page’s rankings. Images, videos and other multimedia elements can also act as a content quality signal.
The number of internal links to a page indicates its importance relative to other pages on the site: more internal links means more importance. Internal links from authoritative pages on the domain have a stronger effect than links from pages with no or low PageRank.
Having too many broken links on a page may be a sign of a neglected or abandoned site. The Google Rater Guidelines Document uses broken links as one way to assess a homepage’s quality. While there is no doubt that Google estimates the reading level of webpages, it is up for debate how much it matters.
Affiliate links themselves probably will not hurt your rankings, but too many could lead to Google’s algorithm paying closer attention to other quality signals to make sure you are not a “thin affiliate site.” Lots of HTML errors/W3C validation issues could be a sign of a poor quality website. A well-coded page could be a quality signal.
A page on an authoritative domain will rank higher than a page on a domain with less authority. And pages with lots of link authority (a page’s PageRank) tend to outrank pages without much.
Excessively long URLs may hurt a page’s search engine visibility. Short URLs tend to have a slight edge in Google’s search results. Additionally, a page closer to the homepage may get a slight authority boost vs. pages buried deep down in a site’s architecture.
Google has filed a patent for a system that allows human editors to influence the SERPs. The category the page appears on is a relevancy signal. A page that’s part of a closely related category may get a relevancy boost compared to a page that’s filed under an unrelated category.
Keyword in URL is another relevancy signal. The categories in the URL string are read by Google and may provide a thematic signal to what a page is about. Citing references and sources, like research papers do, may be a sign of quality.
Bullets and numbered lists help break up your content for readers, making them more user friendly. Google likely agrees and may prefer content with bullets and numbers. The priority a page is given via the sitemap.xml file may influence ranking.
Too many outbound links can be a problem, and UX signals from other keywords pages rank for may give Google an internal sign of quality. Although Google prefers fresh content, an older page that’s regularly updated may outperform a newer page.
User friendly layout, parked domains, and useful content are all also page-level factors.
Google has stated that they’re happy to penalize sites that don’t bring anything new or useful to the table, especially thin affiliate sites, so make sure your website has content that provides value and unique insights. Make sure that your contact information matches your whois info.
Many SEOs believe that “TrustRank” is a massively important ranking factor. And a Google patent titled “Search result ranking based on trust” seems to back this up. A well put-together site architecture (for example, a silo structure) helps Google thematically organize your content. It also helps Googlebot access and index all of your site’s pages.
Many SEOs believe that website updates work as a site-wide freshness factor, although Google has denied using “publishing frequency” in its algorithm. A sitemap helps search engines index your pages more easily and thoroughly, improving visibility. Lots of downtime from site maintenance or server issues may hurt your rankings.
Server location influences where your site ranks in different geographical regions. With SSL Certificates, Google has confirmed that it uses HTTPS as a ranking signal, although HTTPS only acts as a “tiebreaker.” Terms of service and privacy pages help tell Google that a site is a trustworthy member of the internet.
Duplicate meta information across your site may bring down all of your page’s visibility. Breadcrumb Navigation is a style of user-friendly site-architecture that helps users (and search engines) know where they are on a site.
With more than half of all searches done from mobile devices, Google wants to see that your site is optimized for mobile users. In fact, Google now penalizes websites that aren’t mobile friendly. There’s no doubt that YouTube videos are given preferential treatment in the SERPs.
A site that’s difficult to use or to navigate can hurt rankings indirectly by reducing time on site, pages viewed and bounce rate. Some SEOs believe that use of Google Analytics and Google Search Console helps, although Google denies this. A website’s reputation on sites like Yelp.com likely plays an important role in Google’s algorithm.
Backlinks from aged domains may be more powerful than new domains. The number of referring domains is one of the most important ranking factors in Google’s algorithm. Links from separate class-c IP addresses suggest a wider breadth of sites linking to you, which can help with rankings. The total number of linking pages — even from the same domain — has an impact on rankings.
Keyword-rich anchor text still sends a strong relevancy signal in small doses. Alt text acts as anchor text for images.
Many believe there is a special place in the algorithm for links from .gov and .edu top-level domains (TLDs). The authority (PageRank) of the referring page has been an extremely important ranking factor for eons. The referring domain’s authority may play an independent role in a link’s value.
Links from other pages ranking in the same SERP may be more valuable to a page’s ranking for that particular keyword. It is also important to get linked to from a set of “expected” sites in your industry. Links from so-called “bad neighborhoods” may hurt your site.
Although links from guest posts still pass value, they likely aren’t as powerful as true editorial links. According to Google, links from ads should be nofollowed. Links to a referring page’s homepage may play special importance in evaluating a site’s — and therefore a link’s — weight.
Having a certain percentage of nofollow links may indicate a natural vs. unnatural link profile. Having an unnaturally large percentage of your links coming from a single source may be a sign of webspam, so links from diverse sources are a sign of a natural link profile.
Links tagged as “rel=sponsored” or “rel=ugc” are treated differently than normal “followed” or rel=nofollow links. Links embedded inside a page’s content are considered more powerful than links on an empty page or found elsewhere on the page.
Backlinks coming from 301 redirects dilute some PageRank. Internal link anchor text is another relevancy signal. The link title could also be used as a weak relevancy signal.
Getting links from country-specific top level domain extensions (.de, .cn, .co.uk) may help you rank better in that country. Links in the beginning of a piece of content may carry slightly more weight than links placed at the end of the content. Generally, a link embedded in a page’s content is more powerful than a link in the footer or sidebar area.
A link from a site in a similar niche is significantly more powerful than a link from a completely unrelated site. A link from a relevant page also passes more value.
Google gives extra love to links from pages that contain your page’s keyword in the title. A site with positive link velocity usually gets a SERP boost as it shows your site is increasing in popularity. And negative link velocity can significantly reduce rankings as it’s a signal of decreasing popularity.
Links from pages that are considered top resources (or hubs) on a certain topic are given special treatment. A link from a site considered an “authority site” likely passes more juice than a link from a small, relatively unknown site. Many believe there is considerable value in getting a link from Wikipedia.
The words that tend to appear around your backlinks helps tell Google what that page is about. According to a Google patent, older links have more ranking power than newly minted backlinks. Google probably gives more weight to links coming from “real sites” than from fake blogs.
A site with a “natural” link profile is going to rank highly and be more durable to updates than one that has obviously used black hat strategies to build links. Google’s Link Schemes page lists “Excessive link exchanging” as a link scheme to avoid.
Google can identify user generated content (UGC) compared to content published by the actual site owner. Links from 301 redirects may lose a little bit of juice compared to a direct link. Pages that support microformats may rank above pages without it.
The trustworthiness of the site linking to you determines how much “TrustRank” gets passed on to you. A link on a page with hundreds of external links passes less PageRank than a page with a handful of outbound links, so the number of outbound links on the page matters.
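The “vote splitting” described here comes straight from the original PageRank formulation: a page divides its score among its outbound links, scaled by a damping factor. A minimal sketch in Python, where the function name and the 0.85 damping default are illustrative conventions from the original paper, not anything Google publishes about its current algorithm:

```python
def pagerank_passed_per_link(page_rank, outbound_links, damping=0.85):
    """Approximate PageRank each outbound link passes on, per the
    classic formulation: a page's vote is split evenly among its links."""
    if outbound_links == 0:
        return 0.0
    return damping * page_rank / outbound_links

# The same page passes far more per link with 5 outbound links
# than with 500:
few = pagerank_passed_per_link(1.0, 5)     # ~0.17 per link
many = pagerank_passed_per_link(1.0, 500)  # ~0.0017 per link
```

This is why a link from a page with a handful of outbound links is worth pursuing over one buried in a directory-style page with hundreds.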
Because of industrial-level spamming, Google may significantly devalue links from forums. A link from a 1000-word post is usually more valuable than a link inside of a 25-word snippet. Links from poorly written or spun content do not pass as much value as links from well-written content. Matt Cutts confirmed that sitewide links are “compressed” to count as a single link.
RankBrain is Google’s AI algorithm. Google has said pages that get clicked on more (higher CTR) may get a SERP boost for that particular keyword. A site’s organic CTR for all keywords it ranks for may be a human-based, user interaction signal.
A recent study by SEMRush found a correlation between bounce rate and Google rankings.
Websites with lots of direct traffic are likely higher quality sites vs. sites that get very little direct traffic. Sites with repeat visitors also may get a Google ranking boost.
“Pogosticking” is a special type of bounce in which the user clicks on other search results in an attempt to find the answer to their query. While Google discontinued the blocked sites feature in Chrome, Panda used this feature as a quality signal and Google may still use a variation of it. We know that Google collects Chrome browser usage data, so pages that get bookmarked in Chrome might get a boost.
Pages with lots of comments may be a signal of user-interaction and quality. Google pays very close attention to “dwell time”: how long people spend on your page when coming from a Google search. This is also sometimes referred to as “long clicks vs short clicks.”
Google gives newer pages a boost for certain searches. Google may add diversity to a SERP for ambiguous keywords. Websites visited frequently get SERP boosts. Search chains influence search results for later searches.
Google chooses Featured Snippets content based on a combination of content length, formatting, page authority and HTTPs usage. Google gives preference to sites with a local server IP and country-specific domain name extension. Search results with curse words or adult content will not appear for people with Safe Search turned on.
Google has higher content quality standards for “Your Money or Your Life” keywords. Google “downranks” pages with legitimate DMCA complaints. The so-called “Bigfoot Update” supposedly added more domains to each SERP page. Google sometimes displays different results for shopping-related keywords.
For local searches, Google often places local results above the “normal” organic SERPs. Certain keywords trigger a Top Stories box. Google began giving big brands a boost for certain keywords. Google sometimes displays Google Shopping results in organic SERPs.
Google images sometimes appear in the normal, organic search results. Google has a dozen or so Easter Egg results. Domain or brand-oriented keywords bring up several results from the same site. Payday Loans Update is a special algorithm designed to clean up “very spammy queries.”
Branded anchor text is a simple — but strong — brand signal. If people search for your brand in Google, this shows Google that your site is a real brand. If people search for a specific keyword along with your brand, Google may give you a rankings boost when people search for the non-branded version of that keyword in Google.
Brands tend to have Facebook pages with lots of likes. Twitter profiles with a lot of followers signal a popular brand. Most real businesses have company LinkedIn pages. In February 2013, Google Executive Chairman Eric Schmidt famously claimed:
“Within search results, information tied to verified online profiles will be ranked higher than content without such verification, which will result in most users naturally clicking on the top (verified) results.”
Google filed a patent for determining whether or not social media accounts were real or fake. Really big brands get mentioned on Top Stories sites all the time. Google likely looks at non-hyperlinked brand mentions as a brand signal. It’s possible that Google fishes for location-data to determine whether or not a site is a big brand.
Sites with low-quality content (particularly content farms) are less visible in search after getting hit by a Panda penalty. Linking out to “bad neighborhoods” — like spammy pharmacy or payday loan sites — may hurt your search visibility. Sneaky redirects can get a site not just penalized, but de-indexed.
The official Google Rater Guidelines Document says that popups and distracting ads are a sign of a low-quality site. Google may penalize sites that display full page “interstitial” popups to mobile users. Google does penalize people for over-optimizing their site, including keyword stuffing, header tag stuffing, excessive keyword decoration.
A Google Patent outlines how Google can identify “gibberish” content, which is helpful for filtering out spun or auto-generated content from their index. Google does not like websites that use Doorway Pages. The “Page Layout Algorithm” penalizes sites with lots of ads (and not much content) above the fold.
Going too far when trying to hide affiliate links (especially with cloaking) can bring on a penalty. A nickname given to a series of Google updates starting in 2017, Fred “targets low-value content sites that put revenue above helping their users.” It’s no secret that Google isn’t the biggest fan of affiliates, and many think that sites that monetize with affiliate programs are put under extra scrutiny.
Google understandably hates autogenerated content. Going too far with PageRank sculpting (by nofollowing all outbound links) may be a sign of gaming the system. If your server’s IP address is flagged for spam, it may affect all sites on that server. If Google thinks you are adding keywords to your title and description tags in an effort to game the algorithm, they may hit your site with a penalty.
If your site gets hacked it can get dropped from the search results. A sudden (and unnatural) influx of links is a sure-fire sign of phony links. Websites that were hit by Google Penguin are significantly less visible in search. Lots of links from sources commonly used by black hat SEOs (like blog comments and forum profiles) may be a sign of gaming the system.
A high-percentage of backlinks from topically-unrelated websites can increase the odds of a manual penalty. Google has sent out thousands of “Google Search Console notice of detected unnatural links” messages. According to Google, backlinks from low-quality directories can lead to a penalty.
Google frowns on links that are automatically generated when a user embeds a “widget” on their site. Getting an unnatural number of links from sites on the same server IP may help Google determine that your links are coming from a blog network. Having “poison” anchor text pointed at your site may be a sign of spam or a hacked site. A 2013 Google patent describes how Google can identify whether or not an influx of links to a page is legitimate. Article directories and press releases have been abused to the point that Google now considers these two link building strategies a “link scheme” in many cases.
SERP Matrix knows and understands local SEO. We can assist in helping you get more reviews, creating uniform NAP data, and optimizing your online content.
Our team will be able to work with you on search engine submission. We will know how to perform the necessary keyword research and implement the keywords in the strongest possible way so you can rank as highly as possible on all local searches.
We also handle web design and development for younger websites that are still in need of the basic beginning elements. No matter where you are in the local SEO process, SERP Matrix can be with you to make sure that you are as effective as possible online.
If you are seeking local SEO assistance in the greater Houston area of Texas, make sure that you take the time to speak to SERP Matrix. We can refresh your design or completely rebuild your website if necessary.
Our team will work closely with you throughout the entire SEO process so you can expect to be completely informed about every step of reaching your rankings goals. Call (713) 287-1134 or contact us online to have us discuss your case with you in greater detail during a free consultation.