Google Ranking Factors – The Bottom Line
To a layperson, Google ranking factors sound simple, but as soon as algorithms and SEO enter the conversation, the complexity piles up. In truth, understanding Google's ranking factors is crucial when you are promoting your blog, webpage, or any other site for that matter. There is a TON of hearsay revolving around the actual ranking factors, and much of it is downright wrong and dangerous. Some of it can actually hurt your website and get you de-listed.
Studies we conducted have shown that most of this guidance is somewhat inaccurate or entirely wrong. Advice that stems from hearsay or speculative conjecture seems to breed more and more of the same. And the correct, precise "factors" that remain often aren't factors that Google considers at all, just speculative strategies that may or may not work.
In spite of all that, we do know a great deal for certain about the way Google ranks websites.
Real search engine optimization (SEO) understanding doesn't come from a random blogger, forum, or magic "get rich quick" scheme. The most reliable and trustworthy SEO knowledge comes from three sources:
- Patent filings
- Direct statements from Google and/or their team
- Applying The Scientific Method
This source serves as a broad and comprehensive guide, researched and thoroughly vetted, to how Google ranks sites.
We have included factors that are debatable or even outright fallacies, but we categorize them so that less-validated notions can be sorted out. This guide is also limited to the factors that matter when ranking in Google's primary web search for non-local queries. That distinction is important, because local SEO, image search, video search, and every other Google search engine plays by slightly different rules.
On-page SEO describes the factors that you, as the website owner, directly control through the management of your website. Positive factors are those that help you rank better. Several of these factors turn negative when pushed beyond reasonable limits; we will discuss negative ranking factors later.
Broadly speaking, positive on-page ranking factors work together to establish the theme of your content, its accessibility across devices and environments, and a positive user experience.
Positive On-Page Factors
Keyword in URL
Keywords and phrases that appear in the page URL, outside of the domain name, help establish the relevance of a piece of content for a specific search query. As URLs become longer, or as keywords are repeated, diminishing returns appear to set in.
Reference(s): Patent US 8489560 B1, Matt Cutts
Keyword in Title Tag
Title tags define the title of a document or page on your site, and they frequently appear both in the SERP and as the snippet for social sharing. Depending on the characters used, the title tag shouldn't be longer than roughly 60-70 characters (see the Moz tool). As with URLs, keywords closer to the beginning are broadly hypothesized to carry more weight.
Reference (s): US 20070022110 A1
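Since the character limit above is only a rule of thumb, a quick script can flag titles that are likely to be truncated or missing the target keyword. The following is a minimal sketch, not an official tool; it assumes the third-party `requests` and `beautifulsoup4` packages, and `audit_title` plus the example URL are hypothetical names for illustration.

```python
# Minimal sketch: fetch a page and flag likely title-tag issues.
import requests
from bs4 import BeautifulSoup

def audit_title(url, keyword, max_length=60):
    html = requests.get(url, timeout=10).text
    title_tag = BeautifulSoup(html, "html.parser").title
    if title_tag is None or not title_tag.string:
        return ["missing <title> tag"]
    title = title_tag.string.strip()
    issues = []
    if len(title) > max_length:
        issues.append(f"title is {len(title)} characters (over {max_length})")
    position = title.lower().find(keyword.lower())
    if position == -1:
        issues.append("keyword does not appear in the title")
    elif position > len(title) // 2:
        issues.append("keyword appears in the second half of the title")
    return issues

print(audit_title("https://example.com/", "ranking factors"))
```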
Words with Noticeable Formatting
When determining the more important subject matter of a page, words that appear in headings carry the most weight, followed by keywords in bold, italic, underline, or larger fonts. This is supported by Matt Cutts, by SEOs, and by a patent that states: "matches in text that is of larger font or bolded or italicized may be weighted more than matches in normal text."
Reference (s): Matt Cutts, Patent US 8818982 B1
Keyword in ALT Text
The ALT attribute of an image describes that image to search engines, which cannot "see" the image itself. Doing so establishes relevance, particularly for image search, while also improving accessibility.
Reference (s): Matt Cutts
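To see how an audit of this might look in practice, here is a minimal sketch that lists images on a page with missing or empty ALT text. It assumes `requests` and `beautifulsoup4` are installed; the function name and example URL are ours, not part of any standard SEO tool.

```python
# Minimal sketch: report <img> tags with missing or empty ALT text.
import requests
from bs4 import BeautifulSoup

def images_missing_alt(url):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    missing = []
    for img in soup.find_all("img"):
        alt = (img.get("alt") or "").strip()
        if not alt:
            missing.append(img.get("src", "(no src)"))
    return missing

for src in images_missing_alt("https://example.com/"):
    print("Missing ALT text:", src)
```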
Keyword Stemming
Keyword stemming refers to the practice of taking the root or "stem" of a word and recognizing other words that share that stem (e.g. "stem-ming", "stem-med", etc.). Avoiding natural word variations, such as for the sake of a keyword density score, results in poor readability and has a negative impact. Stemming was introduced in 2003 with the Florida update.
Reference (s): Matt Cutts
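As a rough illustration, a Porter stemmer shows how word variations collapse to a common root. This is only an analogy for what Google does internally, and it assumes the `nltk` package is installed.

```python
# Rough illustration of stemming: word variations reduce to a shared root.
from nltk.stem import PorterStemmer

stemmer = PorterStemmer()
for word in ["rank", "ranking", "ranked", "ranks"]:
    print(f"{word} -> {stemmer.stem(word)}")
# All four variations reduce to the same stem, "rank".
```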
Internal Link Anchor Text
The anchor text of a link tells a user where that link leads. It is a significant part of navigation within your site and, when not abused, helps establish how relevant a given piece of content is, far better than vague substitutes such as "click here".
Reference(s): Google’s SEO Starter Guide
Keyword Contained in Domain Name
When a keyword or phrase appears within a domain name, a ranking benefit is attributed. The weight given appears to be smaller than when the domain name is an exact match (Exact Match Domain – EMD) for the actual search query, but larger than when a keyword appears later in the URL.
Reference(s): Patent EP 1661018 A2
Distribution of Page Authority
Like a hierarchy, pages that are linked to site-wide receive a bigger boost, while pages linked only from deeper pages receive less support. A similar effect is commonly seen for pages linked from the homepage, because the homepage is usually the most-linked page on most sites. Designing a site architecture to take advantage of this factor is often known as PageRank sculpting.
Reference(s): Patent US 6285999 B1
Use of HTTPS (SSL)
In 2014, HTTPS (SSL) was officially announced as a positive ranking factor, regardless of whether the site handles user data. Gary Illyes downplayed the significance of SSL in 2015, calling it a tiebreaker. Even so, for an algorithm built on numerically scoring billions of web pages, we've observed that tiebreakers often make most of the difference on competitive search queries.
Reference(s): Google, Gary Illyes
Fresh Content
The full name of this factor is really "fresh content when the query deserves freshness". The term Query Deserves Freshness (often abbreviated QDF) refers to search queries that benefit from more current content. It doesn't apply to every query, but it does apply to a considerable number, particularly those that are informational in nature. These SEO advantages are one more reason that brand publishers tend to be so successful.
Reference(s): Matt Cutts
Old Content
A Google patent states: "For some queries, older documents may be more favorable than newer ones." It goes on to describe a scenario where a search result set may be re-ranked by the average age of documents in the retrieved results before being displayed.
Reference (s): Patent US 8549014 B2
Quality Outbound Links
Although it's possible for outbound links to "leak PageRank", sites shouldn't be dead ends. Google rewards authoritative outbound links to "good sites". To quote the source: "parts of our system encourage links to good sites."
Reference(s): Matt Cutts
Mobile Friendliness
Mobile-friendly sites are given a significant ranking advantage. For now, the ranking implications of this appear to apply only to users searching on mobile devices. This entered mainstream SEO discussion and became more serious with the Mobilegeddon update in 2015, even though experts had been speculating on the topic for nearly a decade prior.
Reference(s): Various Studies
Negative On-Page Factors
Negative ranking factors are things you can do that harm your current rankings. These factors fit into three classes: accessibility, devaluations, and penalties. Accessibility issues are simply stumbling blocks for Googlebot that can keep your site from being crawled or analyzed properly. A devaluation is a marker of a lower-quality site and may keep yours from getting ahead. A penalty is far more serious and can have a devastating effect on your long-term performance in Google. Again, on-page factors are those under your direct control as part of the direct management of your site.
Foreground Matches Background
Another common issue that brings about hidden-text penalties occurs when the foreground color of certain content matches the background color. Google may use its Page Layout algorithm here to actually look at a page visually and avoid false positives. In our experience, this can still happen accidentally in a handful of situations.
Reference(s): Google
Single Pixel Image Links
Once a popular webspam technique for disguising hidden links, there's no question that Google treats "really tiny links" as hidden links. This might be done with a 1px by 1px image or simply incredibly small text. If you're trying to trick Google with such techniques, chances are they're going to catch you eventually.
Reference(s): Google
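If you want to audit your own pages for this, a rough sketch follows. It assumes `requests` and `beautifulsoup4`, uses a placeholder URL, and only catches the simplest case (explicit 1px width/height attributes); CSS-based tricks would require checking the rendered page.

```python
# Minimal sketch: flag links whose content includes a 1px-by-1px image.
import requests
from bs4 import BeautifulSoup

def find_tiny_image_links(url):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    suspects = []
    for a in soup.find_all("a", href=True):
        img = a.find("img")
        if img and img.get("width") == "1" and img.get("height") == "1":
            suspects.append(a["href"])
    return suspects

print(find_tiny_image_links("https://example.com/"))
```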
Empty Link Anchors
Hidden links, although often implemented differently from hidden text, for example through empty anchor text, are also likely to invite hidden-content penalties. This is risky territory and another once-widespread webspam technique, so be sure to double-check your code.
Reference(s): Google
Copyright Violation
Publishing content in a way that violates the Digital Millennium Copyright Act (DMCA) or similar laws outside of the U.S. can lead to a severe penalty. Google attempts to detect unattributed sources and unlicensed content automatically, but users can go so far as to report possible infringement for manual action to be taken.
Reference(s): Google
Doorway Pages
A site that makes use of doorway pages, or gateway pages, is creating masses of pages intended to serve as search engine landing pages while providing no value to the user. An example would be creating one product page for every city name in America, resulting in what's known as spamdexing, or spamming Google's index of pages.
Reference(s): Google
Broken Internal Links
Broken internal links make a site harder for search engines to index and harder for users to navigate. It's a sign of a low-quality site. Make sure your internal links are never broken.
Reference(s): Patent US 20080097977 A1, Google via SEL
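A simple crawler-style check can catch broken links before Google does. The sketch below assumes `requests` and `beautifulsoup4`, uses a placeholder URL, and only checks links found on a single page; a real audit would crawl the whole site.

```python
# Minimal sketch: report internal links on one page that return an error status.
from urllib.parse import urljoin, urlparse
import requests
from bs4 import BeautifulSoup

def broken_internal_links(page_url):
    site = urlparse(page_url).netloc
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    broken = []
    for a in soup.find_all("a", href=True):
        target = urljoin(page_url, a["href"])
        if urlparse(target).netloc != site:
            continue  # only check internal links
        status = requests.head(target, allow_redirects=True, timeout=10).status_code
        if status >= 400:
            broken.append((target, status))
    return broken

for url, status in broken_internal_links("https://example.com/"):
    print(status, url)
```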
Redirected Internal Links
The PageRank algorithm carries with it the usual decay when passing through redirects. This is an easy trap to fall into, especially when considering links to "www" versus "non-www" sections of a site, or URLs with and without a trailing slash.
Reference(s): Patent US 6285999 B1, Matt Cutts via SER
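You can spot redirected internal links with a few lines of Python. This sketch assumes the `requests` package and a placeholder URL; it prints the redirect chain a link passes through before resolving.

```python
# Minimal sketch: show the redirect chain (if any) for a given URL.
import requests

def redirect_chain(url):
    response = requests.get(url, allow_redirects=True, timeout=10)
    # response.history holds each intermediate redirect response, in order.
    return [(r.status_code, r.url) for r in response.history] + \
           [(response.status_code, response.url)]

for status, url in redirect_chain("http://example.com"):
    print(status, url)
# More than one line of output means PageRank is flowing through a redirect.
```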
Text in Images
Google has made some amazing progress at analyzing images, but in general, it's unlikely that text you present inside an image will be searchable in Google. There's no direct devaluation or penalty when you place text in an image; it simply keeps your site from having any chance to rank for those words.
Reference(s): Matt Cutts
Text in Video
Just as with images, the words you use in video can't reliably be accessed by Google. If you publish video, it's to your advantage to always publish a text transcript so that the content of your video is fully searchable. This is true regardless of rich media format, including HTML5, Flash, Silverlight, and others.
Reference(s): Matt Cutts
Text in Rich Media
Google has made considerable progress at analyzing images, video, and other media formats such as Flash, but in general, it's unlikely that text you display in rich media will be searchable in Google. There's no devaluation or penalty here; the words simply have no chance to rank.
Reference (s): Matt Cutts
Frames/Iframes
In the past, search engines were completely unable to crawl content placed in frames. Although they've overcome this weakness to a degree, frames still present a stumbling block for search engine spiders. Google attempts to associate framed content with a single page, but it's far from guaranteed that this will be processed correctly.
Reference (s): Google
Thin Content
Although it has always been better to write more elaborate content that covers a topic thoroughly, the introduction of Navneet Panda's "Panda" algorithm created a situation where content offering essentially nothing of unique value is severely punished in Google. A widely recognized case study on Dani Horowitz's "DaniWeb" forum profile pages serves as a great illustration of Panda's most basic effects.
Reference(s): Google, DaniWeb Study
Domain-Wide Thin Content
For quite a while, Google has tried to understand the quality and unique value presented by your content. With the introduction of the Panda algorithm, this became something scored domain-wide rather than on a page-by-page basis. As a result, it's now usually worthwhile to improve the average quality of the content search engines see, while using "noindex" on pages that are bound to be repetitive and uninteresting, such as blog "tag" pages and forum user profiles.
Reference(s): Google
Too Many Ads
Pages with too many ads, especially above the fold, create a poor user experience and will be treated as such. Google appears to base this on an actual screenshot of the page. This is a feature of the Page Layout algorithm, also briefly known as the Top Heavy update.
Reference(s): Google
Duplicate Content (Third Party)
Duplicate content that appears on another site can bring about a significant devaluation, even when it doesn't violate copyright rules and properly cites a source. This falls in line with a running theme: content that is genuinely more remarkable and unique against the backdrop of the web as a whole will perform better.
Reference(s): Google
Duplicate Content (Internal)
As with content duplicated from another source, any piece of content that is duplicated within a page, or even across the site as a whole, will suffer a decrease in value. This is an extremely common issue and can creep in from anything ranging from too many indexed tag pages, to www versus non-www versions of the site, to parameters appended to URLs.
Reference(s): Google
Linking to Penalized Sites
This was introduced with the "Bad Neighborhood" algorithm. To quote Matt Cutts: "Google trusts sites less when they link to spammy sites or bad neighborhoods." Simple as that. Google has suggested using the rel="nofollow" attribute if you must link to such a site. To quote Matt again: "Using nofollow disassociates you with that area."
Reference(s): MC: Bad Neighbors, MC: Nofollow
Slow Website
Slower sites won't rank as well as faster sites. There are countless tools to help with performance analysis for both server-side and client-side factors, and they ought to be used. This factor is applied with the intended audience in mind, so seriously consider the geography, devices, and connection speeds of your audience.
Reference(s): Google
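As a very rough first check, you can time the server response from Python. This sketch assumes the `requests` package and a placeholder URL; it measures only the HTML response time, not full rendering performance, so treat it as a starting point rather than a substitute for proper tools.

```python
# Minimal sketch: measure server response time for a page (HTML only).
import requests

def response_time_seconds(url):
    response = requests.get(url, timeout=30)
    # `elapsed` covers the time from sending the request until the response
    # headers are parsed; it excludes client-side rendering, images, and scripts.
    return response.elapsed.total_seconds()

print(f"{response_time_seconds('https://example.com/'):.2f} seconds")
```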
Page NoIndex
If a page contains a "robots" meta tag carrying a "noindex" value, Google will never put it in its index. Used on a page that you want to rank, that's a bad thing. It can also be a good thing when removing pages that will never be useful to Google users, raising the average experience of visitors landing from Google.
Reference(s): Logic
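Because a stray noindex can silently remove important pages from Google, it's worth checking pages programmatically. Here is a minimal sketch using `requests` and `beautifulsoup4` with a placeholder URL; note it does not check the `X-Robots-Tag` HTTP header, which can also carry noindex.

```python
# Minimal sketch: detect a robots meta tag carrying "noindex" on a page.
import requests
from bs4 import BeautifulSoup

def is_noindexed(url):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    for meta in soup.find_all("meta", attrs={"name": "robots"}):
        content = (meta.get("content") or "").lower()
        if "noindex" in content:
            return True
    return False

print(is_noindexed("https://example.com/"))
```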
Robots.txt Disallow
If your site has a file named robots.txt in the root directory with a "Disallow: /" rule applied to a user-agent of either "*" or "Googlebot", your site won't be crawled. This won't remove your site from the index, but it will prevent any updating with new content, along with any positive ranking factors surrounding age and freshness.
Reference(s): Google
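Python's standard library can tell you whether a given robots.txt blocks Googlebot from a URL. A minimal sketch, using placeholder URLs and no third-party packages:

```python
# Minimal sketch: check whether robots.txt allows Googlebot to fetch a URL.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
parser.set_url("https://example.com/robots.txt")
parser.read()

url = "https://example.com/some-page/"
if parser.can_fetch("Googlebot", url):
    print("Googlebot is allowed to crawl", url)
else:
    print("Googlebot is blocked from", url)
```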
Poor Domain Reputation
Domain names maintain a reputation with Google over time. Even if a domain changes hands and you are now running a completely different site, it's possible to suffer from webspam penalties inherited from the poor behavior of previous owners.
Reference(s): Matt Cutts
Meta or JavaScript Redirects
A classic SEO penalty that isn't very common anymore: Google recommends against using meta-refresh and/or timed JavaScript redirects. These confuse users, inflate bounce rates, and are problematic for the same reasons as cloaking. Use a 301 (if permanent) or 302 (if temporary) redirect at the server level.
Reference(s): Google
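What a "server level" redirect looks like depends on your stack. As one hedged example, here is how a permanent 301 redirect might be issued from a small Flask app (assuming Flask is your framework and the routes are hypothetical); Apache or Nginx rewrite rules accomplish the same thing.

```python
# Minimal sketch: issue a permanent (301) redirect at the application level
# instead of a meta-refresh or timed JavaScript redirect.
from flask import Flask, redirect

app = Flask(__name__)

@app.route("/old-page/")
def old_page():
    # 301 = permanent move; use code=302 for a temporary redirect.
    return redirect("/new-page/", code=301)

@app.route("/new-page/")
def new_page():
    return "This is the new page."
```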
Content in JavaScript
While Google keeps improving at crawling JavaScript, there's still a reasonable chance that Google will have trouble crawling content that is rendered with JavaScript, and a further concern that Googlebot won't fully understand the context of when it gets rendered and for whom. While rendering content with JavaScript won't bring about a penalty, it's an unnecessary risk and therefore a negative factor.
Reference(s): Matt Cutts
Poor Uptime
Google can't (re)index your site if they can't reach it. Logic would also dictate that an unreliable site leads to a poor experience for Google's users. While a single outage is unlikely to be devastating to your rankings, achieving reasonable uptime is important. A day or two of downtime should be fine; more than that will cause problems.
Reference(s): Matt Cutts
Too Many External Links
As a simple function of the PageRank algorithm, it's possible to "leak PageRank" out of your domain. Note, however, that the negative factor here is an excessive number of external links. Linking out to a reasonable number of external sites is a positive ranking factor, confirmed by Mr. Cutts in the same source article.
Reference(s): Matt Cutts
Search Results Page
As a rule, Google wants users to land on content, not on pages that look like listings of potential content, similar to the Search Engine Results Page (SERP) the user just came from. If a page looks too much like a search results page, functioning as nothing more than a collection of further links, it's likely not to rank as well. This may also explain why blog posts tend to outrank tag/category pages.
Reference(s): Matt Cutts
Automatically Generated Content
Machine-generated content based on user search queries will "absolutely be penalized" by Google and is considered a violation of the Google Webmaster Guidelines. There are various techniques that could qualify, which are detailed in the Guidelines. The one exception to this rule appears to be machine-generated meta tags.
Reference(s): Matt Cutts, Webmaster Guidelines
Phishing Activity
If Google has any reason to mistake your site for a phishing scheme (for example, one that aims to reproduce another site's login page to steal data), prepare for a lot of pain. Generally, Google uses a sweeping description of "illegal activity" and "things that could hurt our users", but in this interview, Matt specifically mentions their anti-phishing filter.
Reference(s): Matt Cutts
“Orphan” Pages
Orphan pages, meaning pages on your site that are difficult or impossible to find using your internal link architecture, can be treated as doorway pages and act as a webspam signal. At a minimum, such pages likely don't benefit from internal PageRank and therefore have far less authority.
Reference(s): Google Webmaster Central
HTTP Status Code 4XX/5XX on Page
If your web server returns essentially anything other than a status code of 200 (OK) or 301/302 (redirect), it's indicating that the proper content was not displayed. Note that this can happen even if you are able to see the intended content yourself in your browser. In cases where content really is missing, Google has clarified that a 404 error is fine and actually expected.
Reference(s): Speculation
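Since what you see in the browser and what the server actually returns can differ, check the status code itself. A minimal sketch with the `requests` package and a placeholder URL:

```python
# Minimal sketch: print the HTTP status code a URL actually returns.
import requests

response = requests.get("https://example.com/some-page/",
                        allow_redirects=False, timeout=10)
print(response.status_code)
# 200 is fine; 301/302 means a redirect; anything 4xx/5xx means the
# content was not served, even if a pretty error page is displayed.
```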
Positive Off-Page Factors
Off-page factors describe events that happen somewhere other than on the site you directly control and are trying to improve in the rankings. This mostly takes the form of backlinks from other sites. Positive off-page factors generally relate to an attempt to understand honest, natural popularity, with a broad emphasis on popularity earned from more-trusted and more-authoritative sources.
Authority Inbound Links to Page
Links from sites that have a large number of links pointing to themselves are worth significantly more than links from sites without. The same is true of their inbound links in determining the value of their link to you, and so on. In this way, links are like currency, with hypothetical values ranging from $0 to $1,000,000. This is a function of the PageRank algorithm.
Reference(s): Larry Page
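To make the "links as currency" idea concrete, here is a toy power-iteration version of PageRank over a tiny, made-up link graph. It's a teaching sketch, not Google's production algorithm; the page names, damping factor, and iteration count are all assumptions for illustration.

```python
# Toy PageRank by power iteration over a tiny, made-up link graph.
links = {
    "home":  ["about", "blog"],
    "about": ["home"],
    "blog":  ["home", "about"],
}

def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    rank = {page: 1.0 / len(pages) for page in pages}
    for _ in range(iterations):
        new_rank = {page: (1 - damping) / len(pages) for page in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue
            share = damping * rank[page] / len(outlinks)  # authority split evenly
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

for page, score in sorted(pagerank(links).items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.3f}")
```

Pages that attract links from already well-linked pages end up with the highest scores, which is the intuition behind authoritative inbound links being worth more.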
More Inbound Links to Page
Did you really need us to tell you that? More links are worth more than fewer links of equal quality. Of course, this factor doesn't exist in a vacuum; quality can trump endless quantity, and many backlinks are genuinely worthless. But as a further function of the PageRank algorithm, your site typically needs many links to be competitive in search.
Reference(s): Larry Page
Authority Inbound Links to Domain
PageRank earned from links from external sites is distributed throughout a domain as internal PageRank. Domain names tend to gain authority as a whole: content published on an authoritative site will immediately rank far higher than content published on a domain with no real authority.
Reference(s): Larry Page
More Inbound Links to Domain
Again, more links of equal quality pointing to a domain will increase the overall authority of that domain name. In Larry Page's original research paper on the concept of PageRank, he describes "hostname-based clustering" as a component of PageRank.
Reference(s): Larry Page
Link Stability
Backlinks appear to gain value as they age. Theoretically, this may be because spam links get moderated and paid link schemes eventually expire. As a result, backlinks that have existed for longer periods are worth more. This is also supported by a patent.
Reference(s): Patent US 8549014 B2
Keyword Anchor Text
The anchor text used in an external link helps establish the relevance of a page for a search term. The target page does not have to contain this term to rank (see: Google bombing).
Reference(s): Patent US 8738643 B1
Links from Relevant Sites
Links from sites that cover subject matter similar to yours are natural. Contrary to popular misconception and various highly dangerous link-building and link-removal schemes, not every link to your site needs to come from a domain devoted solely to your topic; that would look extremely unnatural. Then again, so would never being part of industry-specific discussions. This is a feature of the Hilltop algorithm.
Reference(s): Krishna Bharat
Keyword in ALT Text
Keywords used in the ALT attribute of an image are treated as anchor text. Short, genuinely descriptive ALT labels also improve overall accessibility and have an exceedingly strong effect on images showing up for matching queries in Google Image Search.
Reference(s): Patent US 8738643 B1, Matt Cutts
Context Surrounding Link
For quite some time, it has been established that the text surrounding a link, in addition to the anchor text within it, is considered when evaluating context. Support for this theory is reinforced by a patent and by simple experimentation. As such, links within body text are likely to provide more value than a stand-alone link isolated from any context.
Reference(s): Patent US 8577893, SEO By The Sea
Query Deserves Freshness (QDF)
Google doesn't rank every search query the same way. Certain search queries, particularly those that are news-related, are especially sensitive to the freshness of the content that is surfaced (and may only rank content that is recent). Google's term for this is Query Deserves Freshness (QDF).
Reference(s): Matt Cutts, Amit Singhal
Safe Search
In certain situations where adult content may be involved, whether a site ranks at all may depend entirely on whether SafeSearch is enabled in Google's settings. By default, SafeSearch is turned on.
Reference(s): Google
Negative Off-Page Factors
Negative off-page factors generally relate to unnatural patterns of backlinks to your site, usually due to deliberate link spam. Until the Penguin algorithm was introduced in 2012, the result of these factors was almost always a devaluation rather than a penalty. That is, you could lose all, or nearly all, of the value gained from linking practices that Google felt might be unnatural, but your site would not be harmed otherwise. While that is still mostly true, Penguin introduced off-page penalties in various cases, which has opened the floodgates for malicious behavior from competing sites in a practice known as negative SEO or Google bowling.
Excessive Cross-Site Linking
When you own multiple sites, interlinking them for the purpose of inflating your inbound link authority is discouraged. Risk increases with the number of interlinked domains. Common ownership may be detected by domain registrant, IP address, similarity of content, or similarity of design, and is occasionally identified and penalized as part of a manual action. An exception is made for internationalization, or "when there's a pretty good reason, for users, to do it".
Reference(s): Matt Cutts
Negative SEO (Google Bowling)
Negative SEO, historically named "Google bowling", is the act of malicious link spam conducted on behalf of your site by a third party. This was once very difficult, because we lived in a world of off-page devaluations rather than off-page penalties. If only a devaluation could occur, a competitor could merely exaggerate existing schemes, causing value to be lost sooner or more certainly. If off-page penalties exist, which they do, negative SEO is proven possible by logic alone.
Reference(s): Matt Cutts
Paid Link Schemes
Links can't be purchased directly from a site owner for the sole purpose of passing PageRank. Matt Cutts states that this policy is directly inspired by the FTC's rules on paid endorsements. To put it another way, backlinks are seen as endorsements, and genuine endorsements should happen without direct compensation.
Reference(s): Google, Matt Cutts
Diluted Page Authority
As a function of the PageRank algorithm, every link on a page divides the overall authority that is passed to the pages being linked. For example, one page with one link might pass a hypothetical PageRank value of 1.0, whereas an identical page with 1,000 outbound links would pass 0.001 through each link.
Reference (s): Matt Cutts, Larry Page
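The arithmetic behind this is simple division. A minimal sketch, using the same hypothetical 1.0 starting value as the example above and ignoring damping and other refinements:

```python
# Minimal sketch: PageRank passed per link is the page's value divided by its
# number of outbound links (damping and other refinements ignored).
def value_passed_per_link(page_value, outbound_links):
    return page_value / outbound_links

print(value_passed_per_link(1.0, 1))      # 1.0
print(value_passed_per_link(1.0, 1000))   # 0.001
```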
Diluted Domain Authority
For much the same reason that diluted page authority is possible, an entire domain can dilute its outbound PageRank. Consequently, sites that are more selective about who they link to, relative to who links to them, are valuable, while sites operating as complete free-for-all link farms have a value close to zero.
Reference(s): Matt Cutts, Larry Page
Manual Action
Regardless of every other ranking factor, Google's webspam team will still occasionally take manual action against specific sites, which can take half a year to a full year to recover from after you've cleaned up the issues. Often, these penalties come with a notification in Google Webmaster Tools. As such, it's critical to continually look past what merely works today and ask "what does Google want?" Learn Google's philosophies and market your site in harmony with them.
Reference(s): Matt Cutts
Crawl Rate Modification
Google Webmaster Tools allows you to adjust the rate at which your site is crawled by Google. It's not really possible to speed Googlebot up, but it's certainly possible to slow it down, even to zero.
Reference(s): Google
While the term "PageRank" has somewhat taken on a new meaning since the evolution of the Google algorithm via the Penguin and Panda (and a few more obscure) search engine updates, it's still a valid term in the sense of where your website will rank in keyword searches. It is no longer as straightforward an indicator of an authority website as it once was.
Regardless, these are the fact-based search engine results page (SERP) ranking factors that will determine where your website lands when your primary keywords are searched in Google. Take it as you wish, but we chose to cut out all the hearsay and conjecture. This is a comprehensive, verifiable guide to the Google ranking factors, built from direct statements by Google team members, patent filings, and the scientific method.
Interesting stuff – half of which I’ve never put into my own content or websites LOL. Do you maybe feel that SEO is starting to take people for a walk around the block? I’ve been running websites for over seven years and the same old tried and tested method of keywords seems to give fantastic rankings! Title, first paragraph and last paragraph ( also throw the keyword in the first image alt tag ). I just feel that a lot of SEO pointers are put out there to actually finger the marketers in the first place! They let Google know that you are trying to rank – actually working against you? Just a thought.
LOL! Excellent points Chris — and yes, to a degree, I do! 🙂
Thanks for stopping by!
Pj, this is a masterpiece of an article and I like your references. I agree that there is so much speculation on the internet about what the SEO ranking factors are, when it is actually so simple. In the end we must realize that there are only a few things that we can do to tell Google that our content is good quality.
The other things that we have less control over is the behavior of the readers on our website and how they interact with it. In the end Google does not care what we think about our website and it cares about what the readers are thinking.
Thanks very much Viljoen, I couldn't agree more. We have to play by their rules as they are the largest of the search engines around.
Hi there Pj,
Great write up about ranking on Google. I have a few questions about this topic in relation to old blog posts. I took the time to revisit some old content and found that they are no longer ranking.
As I dug into it, I realized that I started out with very poor keyword selection – too competitive and making no human sense. I want to revive these posts but I am concerned about a few things.
1) Should I do a 301 redirect using better keywords in the same topic? I heard that traffic can be negatively affected by this process.
2) Or can I create new posts, targeting the same topic with the possibility of duplicating some content from the older posts but using better keywords?
3) Should I just move on with creating new content and let the old content fade away?
Thank you for your advice.
Hi Cathy,
If doable, I would create a totally new and unique post containing the keywords you prefer over the keywords of the old post. If that's not doable, I would simply modify and improve the old post, allowing the new version to be indexed and removing the old one from the Google index. See How to Fix Google Sitemap Errors. Great questions!
pj
I have been working on my website for almost 9 months now and I am still trying to learn about google ranking factors.
Your article was a bit over my head and makes me really wonder if I am doing things right. There is so much that I don’t know and still have to learn.
I will re-read your article as I think it is filled with valuable information.
Thanks for sharing!
I totally understand Simone! This is for advanced users for sure and to simply dispel some of the hearsay and myths about the real Google Ranking factors. For beginners, I would suggest the less advanced On-Page SEO Checklist. Thanks for stopping by!!
pj
Hi Pj! That's an amazing article! It's great to have everything we need packed in one post so we cannot lose track of what is important. I have bookmarked it!
I have one question for you… I always hear good things about natural links. I have to take some time to dedicate myself to it…The thing is, my website is relatively new (5 months), and the time spent on natural links, will be time I could spend creating more content. So, what should I do in this case?? More content, or more backlinks? Which is more important? I know both are, but what if I focus only on content for now?
Many thanks
All the best
Stefan
Hello Stefan!
Very good question. When I first began I would stress over this very same dilemma. Then, an online six-figure earner told me about the 80-20 Rule. Basically, it's a guideline. Spend 80% of your time creating high quality, engaging and unique content (media-rich and something others would like to share – content marketing) and the other 20% promoting your website. Website promotion can be natural links, guest posting or social media marketing. This is mainly how I split my time. Some days it's 80-20, but on others it's 70-30. I feel this guideline can fluctuate a bit depending on how much content you already have.
Thanks for the great question!
Pj
Dear PJGermain,
Amazing Blog Post you have written. There are many things I am unaware of in order to get better rankings for my Posts these days.
What are space names?
When you mention text in images, does that mean that the first image on your Blog Post that has a Keyword in the Alt Text needs to have that same exact Keyword appear in the Caption also for better rankings?
In terms of Tags, how many is enough? Can I use 3-5 max targeting low-competition keywords related to my Blog Post topic?
Your website is laid out in a nice orderly fashion, most impressive, easily navigational, and is very eye catching too.
Wishing you all the best with your online success above and beyond the horizon,
~Angel
Hello Angel!
When I referred to “space names” that is really short for Domain Name Space or simply Domain Names. Thanks for pointing that out, it makes that paragraph a little confusing. I’ll edit accordingly.
Yes, with text in images, I am referring to the ALT text we assign to each image using keywords or LSI keywords. And, yes, if you use captions, I would include a variation of them within the caption also.
The number of tags can vary, 3 to 5 is a good range indeed. Well done.
Thanks very much for your comments! Stop by anytime!
Cheers,
Pj