Ultimate On-Page SEO Guide
*Updated Oct 2021*
Over the last 13 years, this SEO guide has gone through many variations. We first built it for internal training purposes, then we decided to share it with clients to help them better understand what we were doing. After years of updating and tweaking it, we’ve decided to just share it with everyone. Enjoy!
When looking to build a website, determining which CMS (content management system) you are going to use is a critical first step. Sure, you could have someone code your site from scratch, but why waste the effort when you don’t have to (not to mention that’s probably a terrible idea, for lots of reasons)?
There are literally hundreds of CMS options, and it can be hard to determine which one is the best. Unfortunately, not all content management systems are created equal, and some can actually do more harm than good (as an SEO rule of thumb, avoid anything .NET/ASP, ColdFusion, and anything made by Adobe).
As an SEO company, one of the problems we see most often is clients who’ve built their websites on a CMS that does not support SEO best practices, thus making it difficult (if not impossible) to rank for their target keywords. Without further ado, here is a list of features you should insist on in an SEO friendly CMS:
11 Must-Have Features
- Static, customizable URLs – You should be able to define both the page name and the directory structure for the entire website, page-by-page (not database driven, unless it can be manually overridden if necessary). Keyword rich and search engine friendly URLs are an SEO must have. If this feature is missing, it should be a deal killer.
- Support for custom URL redirects (301s, 302s, etc.) – At some point you will change a page name, move it, or change its place in your site hierarchy. In order to keep the trust and value of any inbound links to that page, and to avoid creating a poor user experience, you MUST properly redirect the old URL using a 301 permanent redirect. Your CMS should allow you to do this one page at a time if needed, or for blocks of pages. You should also be able to directly edit your .htaccess file, if needed.
- Customizable Title tags and Meta tags on Every Page – The Title tag and Meta description tags are important SEO elements, and should be unique for each page. They should always be carefully written and never duplicated from page to page. Meta keywords tags have no value, and should be left blank. These tags should not be database driven, but written manually and thoughtfully for each page.
- Custom Image File names and Image ALT Tags for Each Image – Search engines look at image file names and ALT tags for keyword usage to help define the topic of a page and the site as a whole. You need to be able to choose the image file name, and you should have the ability to define an ALT tag and if needed an image Title tag.
- Support for Rel=Canonical tags and Meta Robots tags on Each Page – These elements help to prevent duplicate content penalties, especially with eCommerce websites. You should be able to add these tags on a page-by-page basis as needed (see the example after this list).
- The Ability to Directly Edit the HTML on ANY Page – This is important for customizing content, link anchor text, NoFollow tags, heading tags and other HTML elements.
- Automatically Generated and Updated Sitemap.xml File – Search engines use this file to find and index all of the pages on your site. It can be a real pain to maintain manually, so the automatic feature is extremely handy. If you have videos on your site, you should also have a separate video sitemap.
- Support for Blog Integration – Every website needs a blog, since that is the best way to ensure ongoing content growth (which search engines love). Your CMS should support having a blog in a sub-folder.
- Clean, Lean Coding – Your CMS should follow HTML best practices, producing clean, current code that validates with the W3C and loads quickly. Valid, fast loading code is appealing to both visitors and search engines. Avoid code filled with useless or excessive JavaScript, unnecessary non-breaking spaces, and antiquated table layouts in place of clean CSS. If possible, find a CMS that supports HTML5/CSS3. For images, use compression, and for layout elements consider using CSS sprites. Make sure to set page caching expiration well into the future.
- Rich Snippet Support – Support for structured data markup such as Schema.org and RDFa; markup for reviews, products, apps, locations, and services is also a good idea if relevant.
- Mobile Support – This could be responsive design, where the site auto-adjusts to any screen size or device type, or it could be a dedicated mobile version of the site. Whichever option you go with, not having a mobile friendly version of your site is now a major negative factor. Support for AMP (accelerated mobile pages) is also advisable at this point.
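For reference, here is what those canonical and Meta robots tags look like in the <head> of a page (the URL below is just a placeholder):
<link rel="canonical" href="https://www.domain.com/original-page/" />
<meta name="robots" content="noindex, follow" />
The canonical tag goes on the duplicate page and points to the preferred version; the Meta robots tag tells search engines not to index a page while still following its links.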
5 Potentially Nice-to-Have Features
- Social Sharing – If you want people to share what they read on Facebook or Twitter or any other social site, you need the ability to integrate buttons to make that sharing easy. You might also want things like open graph tags, Twitter cards, Facebook insights, etc.
- Social Comments – If you want to expand the reach of your content, using something like Facebook Comments or Disqus to power your commenting system can be super useful. That said, Disqus in particular creates a crap ton of links on the pages it's embedded on, all pointing off your domain. Those are NoFollow links, but that doesn't mean much (NoFollow links still burn off some link juice on the page). Weigh the pros and cons of social commenting carefully.
- Custom 404 Page – Hopefully your site will never have any 404 errors, but if it does it’s nice to have something show up other than “404 error – page not found”. Create a custom 404 page that lists your most popular pages and perhaps a search box. Keep them on the site! Consider making your 404 page funny, like these: http://fab404.com/
- Automatic Link Management – Whenever you change the name of a page, any links to that page break. While implementing a 301 redirect for the changed page will fix this, you would ideally want to go through the site and change all links to reflect the new page name. Doing this manually sucks, so find a CMS that supports automatic link updating if possible.
- User Generated Content – Make it possible for visitors/customers to rate products or services, leave reviews, etc. For some industries this is a must have, not a nice to have.
WordPress (.org) is, by far, the most SEO friendly CMS available (with some minor modifications), and it’s a good fit for the vast majority of website needs. If you need eCommerce capabilities, WordPress offers those too. From Cart66 and WooCommerce to custom Magento installations, there are plenty of good options on this front.
This isn’t a comprehensive list of features by any means, but any CMS with these elements will be about as SEO friendly as it can get. Beyond this list, look for additional features as dictated by your company needs.
A site that loads slowly, that is down often, or that shares IP addresses and/or hosting servers with less than savory sites (shared hosting) can negatively impact the user experience. Beyond that, being on a shared server can pose major security risks for your website, depending on how well your hosting provider has things configured.
To avoid any such problems, it is recommended that you host your website on a dedicated server that has the capacity to handle your site features and traffic (with capacity to spare) to avoid any downtime. Having secondary and tertiary servers as redundancies is also highly recommended (cloud platforms can help with this). Though certainly not required, you may also find some benefit in using a server located in the country that matches your TLD (e.g. USA for .com and .net, UK for .co.uk, etc.).
It is also highly recommended that you have a dedicated C class IP address for your website, and that you don’t share that IP address with ANY other websites. This avoids any potential cross-linking devaluation issues, as well as preventing bad neighborhood issues. A bad neighborhood occurs when a number of sites on a shared IP address are identified by Google as spammy, thus hindering the rank-ability of any other site on the same IP address.
Make sure to schedule regular backups of your website and all databases at non-peak traffic times (2am is usually a great time). Also, make sure to regularly test the page load speed on your site. If you’re running WordPress, the best option available is probably WP Engine.
In 2021, Google launched the Page Experience Update, and officially made Core Web Vitals a ranking factor.
For usability reasons, best practices dictate that a web page should load within 1-2 seconds on a typical connection. However, according to Google, a load time of 1.4 seconds seems to be the threshold between a fast page and a slow page. That means, ideally, every page on your website should load in 1.4 seconds or less (to the Load Event, when the page is technically complete) to receive the maximum SEO and usability benefit for fast loading pages.
Google gathers page load time data from actual user experience, collected via the Google search toolbar, Google Chrome, and Google Analytics, and may combine that with data collected as Google crawls a website, along with other sources. As such, page load speed in the ranking algorithm is likely measured as the total load time for a page, exactly as a user would experience it, not just the user-perceived (visual) load time. Though Google has claimed that only TTFB (time to first byte) is factored in, we don't buy that, because it isn't a UX focused metric.
These days, they look at First Input Delay, Largest Contentful Paint, and Cumulative Layout Shift (the 3 Core Web Vitals).
One of the best resources for tips and tricks on reducing page load times is http://developer.yahoo.com/performance/rules.html, and one of the best tools for testing your site is http://tools.pingdom.com/fpt/ (unless you have Google Analytics configured, in which case that is the best possible source of page load speed data). GTMetrix is also great.
There are a few key things you can do to reduce page load times and reach Google's threshold of 1.4 seconds or less. The three most impactful are properly leveraging browser caching, using CSS sprites for design/layout images where possible, and reducing file sizes as much as possible for images that can't be sprited (different file types, removing unnecessary color channels, etc.). You might also see benefits by using a content delivery network (CDN) for your images.
We would also recommend reducing the total number of CSS and JavaScript files by combining them into fewer files, and minimizing the file sizes by using compression and code minification where feasible. Tools like WPMU Hummingbird can help a ton with this.
W3 Total Cache is another excellent WordPress plug-in that can help with page load speed issues, and a simple CDN can be set-up via Amazon AWS for very little money. You can learn how to do this here.
Mobile usability signals fall roughly into two buckets, UI (User Interface) and UX (User Experience). The first, UI, means having a mobile friendly site (preferably a responsive site), with appropriately sized tap targets, a well thought out mobile navigation, auto-scaling images, and an easily readable font, font size, and color palate. Google has a great tool here to help evaluate this. Varvy is also helpful.
UX means having a fast loading mobile site, not showing full page mobile pop-ups, minimizing ads (especially ads with video or audio), and anything else that could cause a user to feel like they’re getting a bad experience on your site. Google has another great tool here to measure some aspects of this.
This is worth calling out again: DO NOT show shitty pop-ups on mobile devices – Google is demoting sites for this.
Mobile page speed is of particular note, because it’s the single biggest factor in whether someone stays on your site or leaves when visiting from a mobile device. One relatively easy way to boost your mobile speed is to implement AMP pages, which Google is showing a lot of favoritism towards lately. There are however many pros and cons, so don’t just dive headfirst into AMP without doing research.
Because there are so many programming languages, and so many ways to accomplish any one thing in each language, search engines rely on certain rules when they read the content of a website. Having code that adheres to these rules helps to minimize errors when parsing, or separating, the code from the content of any one page.
Search engines such as Google have openly stated that following W3C standards is what they suggest for making code easy for them to understand. We typically only test the home page of the website, because many issues can be easily fixed across the entire website using just its page templates.
At the VERY least, make sure you heavily test your site for cross-browser compatibility. Make sure you support the browsers and versions that reflect ~95% of your target demographic.
Think of your site architecture as a set of pyramids. The tip of each pyramid represents a top level page, such as Home Page, About Us, Products, Services, etc. Each of these top level pages should be optimized for a root keyword with significant search volume. The middle and base of the pyramids represent sub-pages, pages under the top-level pages that are topically related. These should be optimized for variations of the top-level or root keyword.
For a website about, say, Bicycle Repair, you might optimize the pages as follows:
- Home Page – “Bicycle Repair City”
- About Us – “Bicycle Repair Shop in City”
- Services – “City Bicycle Repair Services”
The root keyword, the main topic of the entire site, would be Bicycle Repair [City]. As such, the home page is optimized for Bicycle Repair, and all other main pages for variations of that keyword.
Then, you go one more level down. Since Services is optimized for Bicycle Repair Services, pages under that page should be optimized for things like Cheap Bicycle Repair Services, San Francisco Bicycle Repair Services, Bicycle Repair Services in Utah, etc. The higher up a page is within the site hierarchy, the more search volume the keyword assigned to that page should have. The lower the page is in the hierarchy, the lower the search volume of the keyword phrase.
This is called creating keyword silos, and it is a critical component of SEO and user friendly site design. Not only does this make your site topically clear to search engines, but it makes site navigation super simple for users as well. For optimal search engine and user usability, don’t go more than 3-4 levels deep (i.e. Top Level Page, Sub-Page, Sub-Sub-Page). A user or search engine should be able to get to any page on your site in 4 clicks or less. This is called a flat architecture.
Your site’s navigational elements will follow this exact same structure. Speaking of navigational elements, make sure that all site navigation consists of HTML and CSS only. Don’t ever use JavaScript or Flash for your navigation, as that can cause both search engine indexing problems and user experience problems. The details are complicated, so just don’t.
Never underestimate the value of long-tail traffic. More and more queries each year are long-tail, and by anticipating questions people might type into search engines and creating matching content, you can get far ahead of your competitors. You can also mine data from your analytics organic keyword data, your Google and Bing Webmaster tools search query data, and other sources like Google Trends, Google’s Keyword Tool, SpyFu, Quora, and Yahoo Answers.
It is extremely important that URLs be readable, user friendly, and contain the keyword of the page. A URL should never be longer than 256 characters (ideally under 100 characters), and should contain no query parameters, strange number sequences, spaces, or symbols. If a number sequence is needed for eCommerce reasons (like a product ID), append the number to the end of the search engine friendly URL, NOT the beginning (e.g. https://www.awesomesite.com/killer-keyword-url-1234/).
If relevant, a geo-qualifier (such as Seattle WA) should also be included in the URL.
A proper URL will consist of lowercase words separated by dashes/hyphens only (no underscores, since Google combines words separated by underscores, and no uppercase letters).
The URL structure should generally never go more than 3 sub-directories deep, just as the site navigation should never go more than 3 directories deep. The more important the page, the higher up in the directory structure that page should be. The URL structure should function as a bread crumb, telling visitors exactly where they are within the site.
Your site should, at this point, be HTTPS. Google is throwing a not secure warning in Chrome for HTTP sites, and there are plenty of good reasons at this point to be using HTTPS, so put that in place.
Also, DO NOT use file-type endings (.html, .php, .asp, etc.). These are unnecessary, add length to your URLs, and make your site look dated.
An ideal URL structure is as follows:
https://www.keywordrichdomain.com/keyword-category/keyword-rich-phrase/
This URL structure works perfectly for both users and search engine indexing. It is short (about 70 characters), descriptive, keyword rich, and contains no query parameters. The use of a category can help with usability, but it is not always necessary for SEO purposes. In fact, it can be beneficial to have all pages in the root directory for a smaller site.
For your blog, DO NOT use date-based URL structures (/2017/08/), as they tell both Google and users how old your content is, which can have a negative impact on your rankings and CTR for a variety of reasons.
When linking internally, keep in mind that all internal links should use absolute URLs (i.e. http://www.domain.com/page-name/), not relative URLs (i.e. /page-name/).
In addition to linking from within the text of a page, keyword rich anchor text should be used in the main navigation elements. Where space prevents the use of the keyword for the page being linked to in the navigation, it is important to include the title element in the navigation anchor tag, as follows:
<a href="http://www.domain.com/" title="Home of the awesome keyword">Home</a>
The same goes for links outside of your site. When you get a link on a blog, forum or press release, some of those links should include the keyword of the page being linked to in the anchor text. At present, the ratio of anchor text from inbound links that is considered safe is 15-30% (though this number isn’t exact, and can vary wildly from vertical to vertical, so be careful).
Also, the keyword used in the anchor text going to a page should be topically consistent. One should not use “Keyword Topic A” in the anchor text for one link, and then “Keyword Topic B” in the anchor text of another link to the same page (unless Keyword Topic B is a very close variation of Keyword Topic A). A page should have one core topic, so be consistent.
While Google is more than capable of crawling and executing basic JavaScript, they don't always get things right. To be on the safe side, all links that you want crawled should be in plain old HTML (this includes navigation elements). If you're considering using something like AJAX on your site, be aware of the risks and how to minimize them. pushState tends to be the best option, but pre-rendering is another (though no longer Google's preferred one).
Unfortunately, broken links can also happen due to someone outside of your site linking in incorrectly. While these types of broken links can’t be avoided, they can be easily fixed with a 301 redirect.
To avoid both user and search engine problems, you should routinely check Google Webmaster Tools and Bing Webmaster Tools for crawl errors, and run a tool like XENU Link Sleuth or Screaming Frog on your site to make sure there are no crawlable broken links.
If broken links are found, you need to implement a 301 redirect per the guidelines in the URL Redirect section. You can also use your Google or Bing Webmaster Tools account to check for broken links that have been found on your site.
In addition to broken links (404s), you also need to watch out for a variety of other error types. 403s are fairly common, and mean Google is hitting a page you don't allow them to hit…in some cases this is fine, in others it may not be. Monitor Google Search Console regularly to make sure you know what's going on.
Last but not least, server errors (5XX errors) can be a big deal, because they tell Google your server is overloaded, which Google interprets as “whoops, we need to crawl this site way less”. You don’t want Google reducing your crawl budget, so make sure you aren’t throwing server errors.
Any page that changes URLs or is deleted needs a 301 redirect to tell search engines and users that the page has moved/is gone. There should never be more than one URL path to a page. You can learn more about redirects here: http://moz.com/learn/seo/redirection
On an Apache server, redirects will be configured via the mod_rewrite module and your .htaccess file. Making these sorts of changes is a very technical task, and can break your website if done incorrectly. Unless you’re very technical, it’s best to leave this one to your web developer.
That said, here are a couple of common issues, and the correct code to use in your .htaccess file to fix them:
## Always include this at the start of your htaccess file ##
Options +FollowSymlinks
RewriteEngine On
## Redirect HTTPS URLs to HTTP URLs ##
RewriteCond %{HTTPS} on
RewriteRule (.*) http://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
## Redirect HTTP URLs to HTTPS URLs ##
RewriteCond %{HTTPS} off
RewriteRule (.*) https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
## Redirect Non-WWW to WWW ##
RewriteCond %{HTTP_HOST} !^www\.
RewriteRule ^(.*)$ http://www.%{HTTP_HOST}/$1 [R=301,L]
## Rewrite all MiXed Case URLs to lowercase and 301 redirect ##
RewriteMap lc int:tolower
RewriteCond %{REQUEST_URI} [A-Z]
RewriteRule (.*) ${lc:$1} [R=301,L]
## Redirect index.htm, index.html or index.php to the trailing slash recursively for the entire site ##
RewriteCond %{THE_REQUEST} /index\.html? [NC]
RewriteRule ^(.*/)?index\.html?$ /$1 [R=301,L]
RewriteCond %{THE_REQUEST} /index\.php [NC]
RewriteRule ^(.*/)?index\.php$ /$1 [R=301,L]
## Ensure all URLs have a trailing slash ##
RewriteBase /
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_URI} !(.*)/$
RewriteRule ^(.*)$ http://%{HTTP_HOST}/$1/ [L,R=301]
## Remove Spammy Query Strings ##
<IfModule mod_rewrite.c>
RewriteCond %{QUERY_STRING} enter|query|strings|here [NC]
RewriteRule (.*) http://%{HTTP_HOST}/$1? [R=301,L]
</IfModule>
## End of htaccess file ##
Obviously, if any of these don’t apply to your site, don’t use them. Test these carefully before you roll them live on a site! (I take no responsibility for broken sites, you’ve been warned).
If your site has HTTPS pages for a good reason, you definitely don't want to go redirecting them to HTTP pages! Likewise, if your whole site runs on HTTPS, change the http:// in the www, trailing slash, and query string rules to https://. Also note that the RewriteMap directive used in the lowercase rule can only be defined in your server or virtual host config, not in .htaccess itself. For the spammy query strings (say, someone pointed a ?pill query string at your site), just replace the fake text with the real query strings, separated by pipes if there are multiple query strings.
Every page's Title tag should:
- Be 50-55 characters in length, and never shorter than 15 characters*
- Be unique to and descriptive of that page (never use identical or mostly identical title tags on multiple pages on the same site).
- Use the keyword of that page twice if space permits (once at the start, followed by a separator such as a colon and a space, and then once again in a call to action). If the character limit prevents the use of the keyword twice, use it once in a good call to action, with the keyword as close to the beginning of the title tag as possible.
- If relevant, include a geo-qualifier (such as Seattle WA).
- Typically not be used for branding purposes, unless you have a major brand whose name would increase SERP click-through-rates. If you must include your brand name, use it at the end of the Title tag, not at the beginning.
* – Character length is an approximation; Google is actually using a pixel width limit, not a character limit (though Google claims there is no such limit, or perhaps that it is variable). Title tags appear in 20px Arial font by default, with searched for keywords bolded, and Google has a pixel width limit of 580-600 pixels for Titles (generally, it can vary).
You can see if a Title will truncate by doing the following: simply use Excel, set column width to 580px, set columns to wrap text, and font to Arial 15pt/20px. Type in your Title, and bold the main keyword. If the line breaks, your Title tag will truncate. (You can also use this tool to check: Title Length Tool).
55 characters is now considered the safe upper limit, as this Title character limit will avoid truncation 95% of the time.
Proper title tag structure is as follows:
<title>Keyword Phrase: Call to Action Using Keyword Phrase</title>
A colon and a space are used as the separator because it uses the least amount of characters. You can also use – or | (a dash or a pipe).
The Title tag is the first description of the page that search engine users will read, and it is extremely important to both users and search engines that it contain the keyword they are searching for. This will not only help to improve rankings, but can significantly improve the click-through-rate on search engine results pages (SERPs).
Starting in 2021, Google has gotten much more aggressive at changing Title tags on their own, so keep an eye out!
Every page's Meta description tag should:
- Be unique and relevant to that page, be written as ad text, and contain a call to action.
- Be no more than 155 characters on desktop (~920 pixels wide), or ~120 characters on mobile (680 pixels wide).
- Contain 1-2 complete sentences, with correct grammar and punctuation.
- Use the keyword once or twice (once per sentence, as close to the start of each sentence as possible).
- Include a geo-qualifier (City and/or State), such as “Seattle WA”, only if relevant.
A proper Meta description tag structure would be:
<meta name="description" content="Keyword Phrase used in a question? Keyword Phrase used in a good click-inducing call to action." />
To further encourage the search engines to use the description you provide, add this tag to every page:
<meta name="robots" content="noodp, noydir" />
This tells search engines not to use snippets from DMOZ or the Yahoo Directory. Some people think this tag is no longer needed. We disagree, and have seen some pretty wonky descriptions come from not blocking this.
While Meta keywords tags have no SEO value, misusing them can still have a negative impact on your rank-ability. Keyword stuffed meta keywords tags are still a negative signal, and could negatively impact your ability to rank organically.
To be safe, every page should have an H1 tag, as search engines look to the H1 to help determine the topic of a page. It should be the first thing in the body text of the page, and should appear prominently.
The keyword of a page needs to be used in the H1 tag, and in at least half of the total heading tags on a page. There should never be more than one H1 on a page. H1 tags should never wrap images or logos, only text.
From a usability perspective, paragraphs should never be longer than 5 lines of text, and it is wise to break up a page every 2-3 paragraphs with a sub-heading in the form of an H tag (H2 or H3). Testing has shown that when users are faced with a large block of unbroken text, most either skim over the text or skip it altogether.
We recommend no more than 1 heading tag per 150 words on the page. It is VERY important that the keyword of a page be used in the H1 tag, as close to the beginning of the H1 as possible. Ideally, if a page has at least 300 words of content, there should be at least one additional H tag on each page that contains the keyword, for added SEO value.
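To illustrate those heading rules, a bare-bones page skeleton might look something like this (the keyword and headings are placeholders; a fuller sample appears later in this guide):
<h1>Target Keyword Phrase</h1>
<p>Opening paragraphs that use the target keyword naturally...</p>
<h2>Subheading Using a Keyword Variation</h2>
<p>Two to three more paragraphs of supporting content...</p>
<h2>Another Descriptive Subheading</h2>
<p>Closing paragraphs and a call to action...</p>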
A properly optimized image tag looks like this:
<img src="keyword-rich-image-name.jpg" alt="Describe Image" />
Image file names should consist of words separated by dashes, should be descriptive to both users and search engines, and should accurately describe the image. If relevant, the file name should also use a keyword relevant to the page/domain. Image file names should abide by the same rules for length and structure as any other URL.
ALT tags also matter for accessibility. We've heard of websites being sued for not being ADA compliant, so keep that in mind.
When pages that Google deems relevant link to other pages, some of that trust and authority flows through that link to the site being linked to. A “followed” link is essentially endorsing the page being linked to. Anchor text also passes through links.
Enter the rel="nofollow" tag (and the newly introduced rel="ugc" and rel="sponsored"). Google introduced this tag to help prevent ranking manipulation through comment spam. While the use of this tag used to prevent ANY trust or anchor text from being passed on, Google has recently made some changes.
Now, when rel="nofollow" is used in an anchor tag (link), Google usually passes somewhere between reduced and zero SEO value through the link. Using this tag is like saying, "this page is nice, but we don't want to endorse it." Of course, Google can ignore you and pass whatever they choose…NoFollow is little better than a suggestion these days.
With that in mind, from an SEO perspective, you would probably only choose to use the rel=”nofollow” tag in the following instances:
- On any user created link within blog or forum comments, author profiles, etc. If it’s likely to be spammed, use NoFollow or UGC.
- On any internal link that links to a page with no SEO value (e.g. login pages, RSS feed pages, etc.) – If the page has no searcher value, or if it's a page you don't really want showing up in search engines as an entry point, feel free to use the rel="nofollow" tag on links to that page (or consider not linking to that stuff at all).
- On any affiliate links or links you are otherwise compensated for placing on your site, use NoFollow or Sponsored. Google has penalized many sites for "link selling", and it's a stiff penalty, so don't mess around on this front.
Remember, NoFollow may still pass link juice to the linked to page. It’s a suggestion.
A proper NoFollow tag would be used like this:
<a href="https://www.domain.com" rel="nofollow">Page Name</a>
Keep in mind though, it is perfectly normal to link out to other sites on occasion, and linking out to at least some other sites with followed links is a part of appearing “normal” in Google’s eyes. Don’t link out to external domains from every page, but definitely do so with followed links from at least a few pages, particularly blog posts as relevant. Relevancy is key.
Last but not least, using a NoFollow tag doesn’t pass additional page value through to other links on the page; rather, NoFollow consumes that link’s portion of the page value without passing it anywhere, essentially burning off a portion of a page’s link authority. In general, it’s better to simply not include a link (if you can avoid it) rather than using lots of NoFollow links on a page.
We recommend having roughly 400-600 words of unique text per page at a minimum, but 1000-3000+ word blog posts and epic guides rank amazingly well, so plan accordingly. If you search a keyword phrase, and get a wordcount for the top 3 results, you’ve got a decent ballpark as to the volume of content necessary to serve that query.
The content on a page should contain the exact target keyword at least a few times, but there is no hard number or percentage to aim for. Just write for the user. Having unique, keyword rich, topic focused text on a page can help to improve search engine rankings significantly.
It is also highly beneficial to use variations of the keyword. For example, for “chocolate pudding”, you would want to use “chocolate” and “pudding” somewhere on the same page. It is also highly valuable to use related keywords. In the case of “chocolate pudding”, you might also want to use “creamy”, “Jell-O”, “dessert” and “tasty”. This technique is referred to as LDA (Latent Dirichlet Allocation) or LSI (Latent Semantic Indexing), and helps to establish excellent topical relevancy, and increases your chance of ranking significantly.
Google has gotten much, much better at understanding this stuff with the introduction of BERT, so make sure your writing thoroughly covers the topic with keyword variants.
If it fits, consider using the keyword at least once in a <strong> tag and/or an <em> tag on every page. Maybe it helps, maybe it doesn’t. Can’t hurt though 🙂
That said, at the end of the day make sure you are writing content for your human users, and not just for search engines. If it's not worthy of commenting on or sharing, it's probably not worth writing. You can use tools like Answer the Public and SEMrush to find what sorts of questions and search phrases people are using.
To rank and stay ranking in a competitive space and to appeal to Google’s QDF algorithm (query deserves freshness), regular content growth is very important. We recommend that every site have a blog, and that they write in that blog at least once per week (once per day if possible, but only if you can write one high quality post per day). At the end of the day, it’s better to put out one truly epic post per month, than one shitty post per day.
It is also beneficial to update the content of existing pages from time to time. Google can tell how often a site is updated, and takes that into account in rankings. The more often helpful changes are made to a site, and the more often unique content is added, the greater the value of that site to users in the eyes of the search engines.
In addition to content growth, search engines and users love to see content diversity. Text is great, but images, videos, polls, PDFs, and other interactive resources have both user experience and SEO benefits. Videos and images, in addition to helping with SEO, can also drive additional traffic to your site via image search and video search results.
Duplicate content is viewed as a big negative by every major search engine, Google in particular. It can not only hurt rankings, but can prevent a page from ranking, and sometimes result in de-indexing of an entire domain if malicious in nature. Search engines want to see unique content on a site, and in their search results. A search results page with 10 different websites that contain the same content would be a poor results page indeed.
There are two main types of duplicate content:
- Duplicate content within your own domain (such as is often caused by CMS issues, like WordPress tag and author pages). This is a very common problem with e-commerce websites.
- Cross-domain duplicate content, where your site is hosting content that is identical to content used on other websites. This is also a common problem with e-commerce websites, as well as article sites, news sites, and less than scrupulous scraper sites.
Duplicate content is generally defined as any page that is 30% or more the same as content elsewhere on the web. Duplicate content may be found using N-grams, which look for identical sequences of 10 words or more, excluding common usages and stop words. Too many blocks of identical content on a page = duplicate.
If content must be duplicated within your own site, for whatever reason, you have three options (see the example after this list):
- make use of the rel="canonical" tag on duplicate pages, pointing to the original source
- use the <meta name="robots" content="noindex, follow" /> tag on duplicate pages
- block access to duplicate pages in the Robots.txt file
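The canonical and Meta robots tags are shown elsewhere in this guide; for the robots.txt option, the block is just a couple of lines (the directory name below is a placeholder):
User-agent: *
Disallow: /duplicate-directory/
Keep in mind that a robots.txt block prevents crawling of those URLs, but it won't necessarily remove pages that are already indexed.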
When it comes to cross-domain duplicate content, you can use the above options, but the best practice is to simply not have cross-domain duplicate content anywhere on your site. It is better to be safe than sorry in this regard.
Because Google is putting more emphasis on user experience, page load times are also now a factor in search algorithms. You should ensure that every page on the site will load in 1-2 seconds on a typical high-speed connection. When a user exits the site via the back button, it hurts rankings, and few things elicit the use of the back button more than a slow loading page. With this in mind, make sure the file sizes of any images or videos in your content are as small as possible.
Don’t use Flash 😂
Avoid all Black Hat SEO techniques. When trying to rank for a competitive keyword, you may be tempted to try some less than kosher SEO tactics…DON’T! Black Hat SEO is a very big negative, and if discovered could result in your site being removed completely from Google’s search index.
This means:
- No keyword stuffing. This means NONE, not in any way. Don’t hide keywords in DIVs, strange made up Meta tags, or anywhere else on the page or in the code. It is not worth the risk of being de-indexed.
- No disabling of the back button, and no pop-ups/pop-unders stopping you from exiting a site
- No sneaky redirects
- No pulling large blocks of content or entire sites with iFrames (no SEO value)
- No Meta refreshes (unless the refresh time is 0)
- No hidden text (unless you’re using a user accessible collapsing DIV, which is fine)
- No hidden links
- No text of the same or very similar color as the background
- No displaying different content based on user agent (except for URL names, which is OK)
Though advertising can be a great way to make money, it seriously detracts from both the user experience and the SEO-ability of both a website as a whole, as well as individual pages. Ad calls slow down page load times, often create a poor user experience, and in some instances can trigger ranking penalties tied to the Google Panda updates.
You should carefully weigh the pros and cons of using ads, based on the purpose and goals of your website. For some sites, the tradeoff will be worth it.
Moz has published a great visual example of a perfectly optimized page, which is worth a look.
Optimal Content Sample:
A sample page optimized for the keyword "Seattle SEO", ideally written, is as follows:
<h1>Seattle SEO</h1>
<p>If you are a business in Seattle with an online presence, then your business could likely benefit from professional Seattle SEO services. What is SEO you say? Quite simply, it is the art of optimizing a website or webpage to be more search engine and user friendly, ideally resulting in improved search engine rankings.</p>
<p>The vast majority of internet users make use of search engines to find what they are looking for. Regardless of whether you own an online business or a brick-and-mortar operation, a website is a must-have to maximize your revenue potential. Unfortunately, just any old website isn’t going to cut it. To get the most benefit from your online presence, making effective use of <a href="https://www.vudumarketing.com/">Search Engine Optimization</a> is a necessity.</p>
<p>So, your business is in Seattle, and you want a local company to build you a website and/or do some SEO work…now what? How do you find such a company?</p>
<h2>Finding the Right Seattle SEO Firm</h2>
<p>With dozens of companies offering SEO services in the Seattle area, it can be a real challenge to find the right one for your business. Of course everyone will claim to be the best, but how do you really know if what they are selling is what you should be buying? To that end, we offer the following advice:</p>
<ul><li><strong>Know What Questions to Ask</strong> – It is important to always negotiate from a position of strength. Learn the basics of <a href="https://www.vudumarketing.com/">SEO</a>, such as the value of keyword rich URLs and title tags, before you go searching. Even if you just sound like you know what you are talking about, you are much more likely to effectively weed out imposters.</li>
<li><strong>Ask for Examples of Past SEO Work</strong> – From testimonials to actual optimized sites and rankings, ask to see some work they’ve done. Make a note of the examples, and contact those companies if possible. Ask them if they had a positive experience with the SEO company in question, and if they would recommend them.</li>
<li><strong>Determine How They Measure Success</strong> – If their only measure of success is improved rankings, they are probably not the right company for you. SEO, like any form of marketing, is about making more money. To that end, tracking traffic and conversions in addition to rankings is the ideal scenario. If they don’t measure rankings plus some additional metrics, you should probably look elsewhere for a better company.</li></ul>
<h2>Learn More About Seattle SEO</h2>
<p>While there are many other tips that could be offered, the three tips above are all that you should need to find a high quality SEO company, one that can add real value to your bottom line. <a href="https://www.vudumarketing.com/contact-us/">Learn More About SEO</a> today!</p>
In summary of the above content:
- It’s about 400 words in length, not counting code, putting it in the ideal 400-600 word minimum range.
- The keyword is used in the H1 tag, and in at least 50% of the other H tags, and the page is effectively broken up by sub-headings (H2 or H3 Tags). There is approximately one H tag per 150 words of content.
- The keyword is used 5 times throughout the page, and also appears in a strong tag (bold) and in a list item.
- Variations of the keyword and other related keywords are used throughout the text, helping the page to conform to the rules of LDA (Latent Dirichlet Allocation).
- The text of the page contains cross-links to other related pages using keyword rich anchor text relevant to the pages being linked to. Every page should cross-link to 1 or 2 other relevant pages on your site.
The page closes with a call to action, which is a critical element of content writing. All content should provide real benefit to the reader, and should encourage the reader to take some sort of further action to secure a conversion.
One of the most common rich snippet implementations out there has to do with rel="author" and rel="publisher". This is where the image of the person or business who wrote an article or page shows up next to their search result. These are a great way to increase your traffic, and are very simple to set up. There's a great slide deck here that walks you through this particular implementation.
If your site provides products, services, recipes, apps, music, videos, or any of dozens of other options, the chances are good that you could use Schema to upgrade your search results!
These page-level code markups allow your search results to stand out from the crowd, and can significantly increase your organic click-through-rates. I've seen instances where the #3 or #4 search result actually had a higher CTR than the #1 ranked result, simply because of Schema markup.
While the scope of what can be marked up this way is quite broad, it’s actually fairly easy to implement. Here’s a handy guide from Built Visible to get you started. Once you have your rich snippets code implemented, you can test the page to make sure it was done correctly with this handy Google tool.
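As a rough sketch of what this markup can look like, here is a minimal JSON-LD block for a product with review data (every name and value below is a placeholder; check Google's structured data documentation for the required properties for your content type):
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Product Name",
  "image": "https://www.domain.com/images/example-product.jpg",
  "description": "A short, accurate description of the product.",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.7",
    "reviewCount": "89"
  },
  "offers": {
    "@type": "Offer",
    "price": "49.99",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  }
}
</script>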
- Create and/or Claim Your Local Profiles – You can certainly do this manually on Google, Bing, Yelp and other sites, but the easiest way is to use a tool like Yext or WhiteSpark or Moz Local to quickly and easily do this across all of the most common sites at once. You can also use a service like Loganix to build out citations (off-site mentions of your NAP).
- Claim Your Social Profiles – With Local SEO, citations are key (any place online that has your name, address and phone number exactly as it appears in your local listings). The key with this is to make sure all of your online business listings use the exact same format for your name, address, phone number and website URL (and we mean EXACT…1-800-798-2430 is different from (800) 798-2430…be EXACT). One of the quickest and easiest ways to get a bunch of citations (and to prevent brand squatters) is to claim all of your profiles via a service like KnowEm.
- Create a KML File – A KML file can help to make sure your business shows up accurately on maps.
- List Your Business Info in Schema – Using Schema to identify your business name, location, business hours and other key elements can not only help with local rankings, but can help with getting rich snippets in your search results (see the example below).
- Show a Map and a Route Planner – Embedding a map and a route planner into your site makes a ton of sense from a usability perspective. If you want local visitors, do this.
- Use Your NAP (Name, Address, Phone) and Local Keywords Throughout Your Site – Throughout your site, in the footer, on your contact page, and in Titles, Metas and page content as relevant. It should be crystal clear to Google and visitors that you service a particular area.
If you’re using WordPress, you can do some of this very simply with the Yoast Local SEO plugin.
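If you'd rather hand-code it than use a plugin, a minimal LocalBusiness JSON-LD block looks something like this (all of the business details below are placeholders):
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Bicycle Repair",
  "url": "https://www.domain.com/",
  "telephone": "1-800-798-2430",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Example St",
    "addressLocality": "Seattle",
    "addressRegion": "WA",
    "postalCode": "98101"
  },
  "openingHours": "Mo-Fr 09:00-17:00"
}
</script>
Whatever you put here, make sure the name, address, and phone number match your citations exactly, per the point above.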
First and foremost, make sure you have created social profiles for your company, and tied those back to your website. Every page on your site should prominently include links to your Facebook page and Twitter profile, at the very least. If you have a YouTube channel, a Google+ profile, a Pinterest or Instagram account, or a company LinkedIn profile, definitely link to those as well.
Next, make it easy to share your content, and by easy, we mean so easy your Grandma could do it. There are a number of ways to integrate these functions, but the most popular are:
- ShareThis – A simple social sharing widget.
- Facebook Connect, Facebook Comments, Facemash, Like Buttons and other FB features
- If your site requires a login, you should consider offering these as options: Sign In With Facebook, Sign In With Twitter, Sign In With LinkedIn, or Sign In With Google (and on and on). Which you pick really depends on your audience.
- You should also consider making use of Open Graph tags and Twitter cards, if relevant (definitely configure these for all blog content); an example follows this list.
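As a quick sketch, basic Open Graph and Twitter card tags for a blog post look like this (all of the values below are placeholders):
<meta property="og:type" content="article" />
<meta property="og:title" content="Post Title" />
<meta property="og:description" content="One sentence summary of the post." />
<meta property="og:image" content="https://www.domain.com/images/post-image.jpg" />
<meta property="og:url" content="https://www.domain.com/post-name/" />
<meta name="twitter:card" content="summary_large_image" />
<meta name="twitter:title" content="Post Title" />
<meta name="twitter:description" content="One sentence summary of the post." />
<meta name="twitter:image" content="https://www.domain.com/images/post-image.jpg" />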
At the very least, make sure it is easy for someone to like/share your content on Facebook, Twitter, Google+ and LinkedIn. Having these elements in place will help you to effectively leverage the SEO value of your current and future social media efforts. For traffic and branding purposes, you may also want to make it easy for content to be shared on Digg and Reddit, and easy to be bookmarked on StumbleUpon and Delicious.
Last but not least, make sure you make use of RSS feeds. Any page that updates regularly, like your blog, should have an RSS feed. The easier you make it for people to follow, digest and share your content, the better.
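One easy win here is an RSS autodiscovery link in the <head> of your pages, so browsers and feed readers can find your feed automatically (the feed URL below is a placeholder; WordPress typically adds this for you):
<link rel="alternate" type="application/rss+xml" title="Blog Feed" href="https://www.domain.com/feed/" />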
Of course, keep in mind that this isn’t an exhaustive SEO guide by any means; it’s focused entirely on on-site SEO elements. If you’re looking for more comprehensive SEO training guides, check out our free Intro to SEO course, The Beginners Guide to SEO from Moz, and The Advanced Guide to SEO from QuickSprout.