www vs. non-www domain

To avoid duplicate content penalties you need to fix the www vs. non-www canonical issue for your domain. This also concentrates your backlinks on one version. With this fix, domain.com will automatically issue a permanent redirect to www.domain.com (or vice versa, depending on the version you pick).

.htaccess 301 permanent redirection

Choose one of these redirects – there’s no SEO difference. Copy and paste it into your .htaccess file.

#Force www:
RewriteEngine on
RewriteCond %{HTTP_HOST} ^example\.com [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [L,R=301]

Or:

#Force non-www:
RewriteEngine on
RewriteCond %{HTTP_HOST} ^www\.example\.com [NC]
RewriteRule ^(.*)$ http://example.com/$1 [L,R=301]

Checklist for domain redirection setup

  1. Add the preferred rewrite rule to the .htaccess file located in your website’s root folder.
  2. Add and verify both versions in Google Webmaster Tools.
  3. Select your preferred version: Google Webmaster Tools: upper right -> Site Settings -> select your preferred domain.
  4. Check your WordPress setup: Settings -> General and make sure you are using your preferred domain there too. The redirect already fixes it, but better safe than sorry in case a plugin reads these values.
  5. Confirm your setup by testing both versions with an online redirect checker (or from the command line, see below). One should show “301 Moved Permanently” – the other should resolve directly.
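If you prefer the command line, a quick check with curl (assuming curl is installed and you forced the www version) will look roughly like this:

curl -I http://example.com/
# HTTP/1.1 301 Moved Permanently
# Location: http://www.example.com/

curl -I http://www.example.com/
# HTTP/1.1 200 OK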

 

This is an important SEO ranking factor and helps search engines index your site properly. Without this declaration your site can be seen as a mirror with identical content. Search engines may rank one page with the www. prefix higher while another page ranks without it. That splits your domain authority and dilutes your link strength. People will place your links all over the web – they don’t care whether it is http:// or http://www., so you as the webmaster need to take care of it.

Which version you choose is up to personal preference, but DO choose one. From an SEO point of view there’s no difference.

Long Tail Keyword Strategy

What are long tail keywords? A primary keyword would be “dog training”; a long tail keyword could be “dog training for puppies”. To give you a more technical definition:

Specific, niche search phrases, usually more than 2 words in length, that offer low competition, low search volume and high searcher intent.

Various ecommerce sites see the 80/20 rule in action:

80% of sales and traffic come from long tail keyword terms

20% of sales and traffic come from primary keywords

There are two factors which lead to this result:

  1. Long tail keyword terms have less competition thus your website ranks better and gets more visitors.
  2. A long tail keyword term is much more focused on specific content. That means your audience’s expectations are highly related to your offer.

How to find long tail keyword terms

Use Google’s autosuggest feature by entering a main keyword into the search box. Scroll through the suggested search terms and pick something with 4 or more words. In our dog training example we find “dog training for puppies on stairs”. The fact that Google suggests it means people are actually searching for it. You can check the search volume with Google’s Keyword Planner in AdWords.

The AdWords Keyword Planner also suggests related long tail keywords: enter your main topic and let Google find matching ideas.

Check the search queries in your Webmaster Tools account. You may already rank for various topics without knowing it. Regularly go through your search queries and monitor the traffic coming from those terms.

With Siri and Google Now, voice search queries are getting more popular. It makes sense to use a question in your headline: “How can I train my puppy?”

Conclusion

It’s always a good idea to focus on long tail keywords. Visitors arrive highly targeted to your content and improve your site’s engagement factors (lower bounce rate, longer time on site etc.). It’s also much easier to rank and to improve your rankings for those queries. Better to have 100 long tail keywords sending 10 visitors each than 1 main keyword sending 1,000 (more or less unrelated) visitors to your site.

Anchor Text

The anchor text is the clickable text of a link. The linked element can be plain text, an image or any other HTML element.

<a href="http://www.yoursite.com">Click Here to visit yoursite.com</a>

In that example ‘Click Here to visit yoursite.com’ is the link’s anchor text.

Search engines check your anchor text diversification, i.e. which elements and texts are used to link to your site.

A natural anchor text profile is a mix of these types:

  • your site name: e.g. ‘yoursite.com’, ‘http://www.yoursite.com’ or ‘your site’
  • unrelated text like: ‘click here’, ‘visit this source’, ‘read more’, ‘(link)’
  • variations of your keywords / related terms
  • images with various alt texts, linked to your website (see the markup example below)
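A hypothetical image link – the alt text acts as the anchor text here:

<a href="http://www.yoursite.com"><img src="logo.png" alt="yoursite.com – dog training guides" /></a>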

Search engines keep a close eye on that mix. If you bought a link package (not recommended and a waste of money) you may end up with 500 backlinks all pointing to your homepage with the same anchor text. That makes it easy for Google to detect that you manipulated the backlink building process, and chances are high that your website gets a penalty for black-hat link building.

If your site is good, people will link to it. You have no influence over which anchor texts they use or which page gets linked. From an SEO perspective you need to keep that in mind when asking for editorial links or similar: mix it up as much as possible.

Keyword Density

No! Some people say it must be somewhere around 2-3% – others say it’s not a ranking factor at all. There is no clear statement about exact numbers. To give it a definition: keyword density measures how often your keyword appears in your content. If the same keyword is repeated in every second sentence you easily reach a density of 10-20%, which is a clear sign of over-optimization and keyword stuffing. Don’t spend time on keyword density; focus on creating quality content written for real people. Search engines take care of the rest by checking related keywords and matching patterns (LSI) to determine your article’s topic.
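For reference, the calculation behind it is simple: keyword density = (keyword occurrences / total word count) × 100. A 500-word article that mentions its keyword 5 times therefore has a density of 5 / 500 × 100 = 1%.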

I want to optimize it!

Though it’s overrated to focus on one keyword, it sometimes makes sense to check your competitors’ density to see what works. But again – don’t spend too much time on that. Write for people, not spiders! Build your site around a niche – not a keyword. Google will know what your article is about.

It helps to mention your main keyword in the first paragraph to keep the flow between the title, headlines and content (keyword proximity).

Anyway! Where can I check keyword density?

If you still want to check it, there are plenty of free keyword density checkers available online.

Breadcrumbs

Breadcrumbs help your visitors navigate your site. Built as HTML links, they reflect your hierarchical site depth by showing the assigned categories/tags/parent pages.


Breadcrumb HTML example:

<a href="http://www.example.com/dresses">Dresses</a> › 
<a href="http://www.example.com/dresses/real">Real Dresses</a> › 
<a href="http://www.example.com/dresses/real/green">Real Green Dresses</a>

With a bit of CSS it’s usually styled as a horizontal navigation bar.

To optimize it for search engines and clearly mark it up as a breadcrumb navigation, add microdata. The links above are then transformed into this:

<div itemscope itemtype="http://data-vocabulary.org/Breadcrumb">
  <a href="http://www.example.com/dresses" itemprop="url">
    <span itemprop="title">Dresses</span>
  </a> ›
</div>  
<div itemscope itemtype="http://data-vocabulary.org/Breadcrumb">
  <a href="http://www.example.com/dresses/real" itemprop="url">
    <span itemprop="title">Real Dresses</span>
  </a> ›
</div>  
<div itemscope itemtype="http://data-vocabulary.org/Breadcrumb">
  <a href="http://www.example.com/clothes/dresses/real/green" itemprop="url">
    <span itemprop="title">Real Green Dresses</span>
  </a>
</div>

Follow this setup exactly and test it with Google’s Rich Snippet testing tool.

The breadcrumb will then show up in your search results and enhance your listing.


Should I wrap the breadcrumbs in h1 tags?

This might seem like a great idea since the links are highly related to the page. However, it is not recommended:

  • Headline tags should be used for their purpose: headlines. Breadcrumbs are used for navigation.
  • Your page normally already has an h1 headline.
  • It makes more sense to put the last item (here: Real Green Dresses) in bold tags.
  • It may be seen as over-optimization.
  • If you put the whole breadcrumb into h1 tags, Google would treat ‘Dresses’, ‘Real Dresses’ and ‘Real Green Dresses’ as your main headline – and if your first item is a homepage link, that further dilutes your content focus.

Breadcrumbs Checklist

  • Use breadcrumbs to improve usability.
  • Are the breadcrumbs shown on EVERY page?
  • Does it start with a link to your homepage?
  • Use CSS to give it a custom style
  • They are expected to sit horizontally below your main navigation, on the left, above your main content
  • Are you using the microdata format and does it validate?
  • Does the hierarchical structure make sense and help your visitors?

 

Semantic SEO

Semantic markup is HTML5’s way of defining a logical document structure, which helps search engines identify the correct page content. To keep this article short and to the point we only focus on the most common elements:

<header> <footer> <nav> <section> <article> <aside>

  • <header> – the top of your page layout with the logo etc.
  • <nav> – your primary navigation
  • <section> – divides the content into thematic sections related to the main topic.
  • <article> – your main content is wrapped into that element
  • <footer> – footer links, copyright notices, terms, social etc.
  • <aside> – often used for the sidebar but that’s wrong in most cases. The aside content should be related to the surrounding content.

With WordPress templates you can easily wrap each post excerpt in its own <article> element, complete with its own h1 headline, header and footer elements.

You should also modify your pagination source and add helpful elements:

<a href='/page/3' rel='prev'>Previous Page</a>
<a href='/page/5' rel='next'>Next Page</a>

To save you in-depth research, here is a complete site structure using HTML5 elements:
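A minimal sketch – the comments and headline text are just placeholders:

<body>
  <header>
    <!-- logo, site title -->
    <nav>
      <!-- primary navigation links -->
    </nav>
  </header>

  <section>
    <article>
      <header><h1>Post title</h1></header>
      <!-- the actual post content -->
      <footer><!-- post meta, tags, author --></footer>
    </article>
  </section>

  <aside>
    <!-- content related to the article, e.g. further reading -->
  </aside>

  <footer>
    <!-- copyright notice, terms, social links -->
  </footer>
</body>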

 

Further Reading:
http://www.w3schools.com/html/html5_semantic_elements.asp

http://html5doctor.com/downloads/h5d-sectioning-flowchart.pdf

http://www.smashingmagazine.com/2013/01/18/the-importance-of-sections/

Canonical URLs

The canonical link tag defines a unique URL to avoid duplicate content. It’s easy to understand with an example:

The content from

yourshop.com/shoes/sports

could possibly also be found through:

yourshop.com/shoes/?search=sports

Search engines see these as two different URLs showing the same content and may penalize one of them for duplicate content. You can solve this with the canonical link relation tag. In some cases this can be a tricky technical problem, but it should be taken care of to concentrate your link juice.

<link rel="canonical" href="http://www.yourshop.com/shoes/sports"/>

It must be placed within the <head> tag of the page and point to the version that should rank. The WordPress SEO plugin from Yoast takes care of this by adding the canonical URL to the head. However, problematic queries like the search parameter from the example above may require manual work.
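For the shop example above, the head of the parameterized URL would point to the clean version (the title text is just a placeholder):

<!-- head of http://www.yourshop.com/shoes/?search=sports -->
<head>
  <title>Sports Shoes</title>
  <link rel="canonical" href="http://www.yourshop.com/shoes/sports"/>
</head>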

Sitemap 101

According to Wikipedia’s definition, “a sitemap is a list of pages of a website accessible to crawlers or users (…) organized in hierarchical fashion” (source).

Why your website should have a sitemap

Adding a sitemap does not directly affect search rankings. However, with a sitemap properly set up for your site, search engine spiders will find all of your content and index it.

The sitemap tells the spiders how often a page is updated and helps them crawl your site more efficiently. The XML sitemap is not intended for humans to read, so to improve your site’s usability you can also create a custom HTML sitemap template and link it in the footer or show it on 404 error pages. List the latest posts and pages there in hierarchical order.

How to make a sitemap

Yoast’s SEO plugin automatically generates a sitemap – you can easily include and exclude post types and specific posts. If you don’t use WordPress you can use an online sitemap generator instead.

Ninja Trick: Submit your RSS Feed

Your RSS feed is also in XML format and can be submitted as a sitemap. Bing and Google even encourage you to add your feed, so their spiders can grab the latest content quickly. The XML sitemap is a complete snapshot of your site, while the feed contains the most recent updates – more data for search engines to keep their index up to date.
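One simple way to point crawlers at both files – assuming the standard WordPress feed URL – is to list them in your robots.txt; multiple Sitemap lines are allowed:

Sitemap: http://www.yoursite.com/sitemap.xml
Sitemap: http://www.yoursite.com/feed/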

Sitemap.xml example

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
<url>
	<loc>http://www.yourwebsite.com</loc>
	<lastmod>2013-01-01</lastmod>
	<changefreq>weekly</changefreq>
	<priority>0.9</priority>
</url>
<url>
	<loc>http://www.yourwebsite.com/articles/100</loc>
	<changefreq>weekly</changefreq>
</url>
<url>
	<loc>http://www.yourwebsite.com/articles/101</loc>
	<lastmod>2013-01-02</lastmod>
	<changefreq>weekly</changefreq>
</url>
<url>
	<loc>http://www.yourwebsite.com/articles/102</loc>
	<lastmod>2013-01-02T13:00:12+00:00</lastmod>
	<priority>0.5</priority>
</url>
</urlset>

Sitemap Checklist

  • Is your sitemap valid? Test it with an online sitemap validator
  • Avoid duplicate URLs
  • Keep sitemaps and feeds to a minimum (best: 1 sitemap + 1 feed)
  • Is the full URL added to robots.txt?
  • Have you submitted it to Google and Bing webmaster tools?
  • Only list your homepage, pages and posts – there’s no need to add tags and categories. Those pages don’t add new URLs, as their content is already listed via posts/pages
  • Do not list more than 50,000 URLs in one sitemap
  • Only include URLs that can be fetched by search engines and are not blocked by robots.txt or meta robots
  • Only include canonical URLs

 

Robots.txt file

The robots.txt file is uploaded to your website’s root folder. It guides search engine spiders by allowing or disallowing the crawling of specific files and folders. It’s a URL blocking method and should be handled with care.

Example:

User-agent: Googlebot 
Disallow: /folder1/ 
Allow: /folder1/myfile.html
Sitemap: http://www.yoursite.com/sitemap.xml

The user-agent can be a wildcard (*) so that all spiders/bots are affected: User-agent: *

In the example above we disallow crawling of ‘folder1’ except for one file in that folder: ‘myfile.html’.

A good robots.txt for a site running on WordPress would be this:

User-agent: *
Disallow: /wp-admin
Disallow: /wp-includes
Disallow: /category/*/*
Disallow: */trackback

The WordPress core folders are protected from crawling, and category and trackback pages won’t be listed either. A lot more can be added to the robots.txt file, but this covers the most important points. Note: do not block your /feed/ URL, as it can be used as a sitemap.

You can also use an online robots.txt file generator.

Robots.txt Checklist

  • Add your sitemap url to the robots.txt file: Sitemap: http://www.yoursite.com/sitemap.xml
  • If you’re using WordPress disallow the core folders
  • Is it named properly (case sensitive!) and placed in your root folder?
  • Disallow 301/302 redirect URLs and cloaked URLs (e.g. yoursite.com/outgoing/affiliate-offer) >> Disallow: /outgoing/*
  • If you are using subdomains each subdomain needs its own robots file
  • One rule per line

The file is ready – what’s next?

  • Once you have uploaded the file to your website’s root folder you can test it with Google’s robots testing tool

Robots.txt vs. Meta Robots

It’s recommended to exclude specific pages via <meta name="robots" content="noindex"> instead of blocking them with robots.txt. If the URL in question gets backlinks from other pages, the link juice is lost when robots.txt blocks the spiders. With the meta tag the links are still followed and your page still gets the credit.

If you want to exclude complete folders, e.g. /tmp/, /private/ or similar, it makes sense to add them to robots.txt.

Page Title Tag

Mastering the page title is a crucial skill for copywriters, marketers and SEOs. A well-optimized page title can noticeably improve your rankings, sometimes within days, so this is an important step towards a good placement in search engines.

<head> 
<title>Page title</title>
</head>


Where is the page title tag used?

  • The top browser bar shows the page title
  • Search engine result pages (SERPs) use the title
  • When your website is shared, social networks grab the title for the post headline

How to write a great SEO title

  • Keep it below 70 characters
  • Make each title unique to a specific page – do not repeat it.
  • Put your primary keywords at the beginning
  • It’s your best sales pitch – keep things like curiosity, urgency etc. in mind
  • Do not repeat your keyword in random permutations
  • Make it a real headline that tells the visitor what your content is about
  • Focus on the actual content – do not mention everything else people may also find on your site
  • Optionally add the matching category or use breadcrumbs in a meaningful way

Should I put my company brand into it?

If it fits into the 70-character limit you can put it at the end. Clearly separate it with a pipe or dash. Search engines will recognize it as a brand name when it’s shown on every page in the same format.
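A hypothetical example – keywords first, brand at the end, separated by a pipe:

<title>Dog Training for Puppies – Step by Step Guide | YourBrand</title>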

Conclusion

Make it relevant and readable, with your main keywords first. Combine it with a call to action and write a unique title for every page, laser-focused on its content.

 

Further Reading:

http://www.onlinejobreviews.net/notes/ninja-tricks-to-write-killer-headlines/

http://mashable.com/2012/05/08/google-seo-headlines/

Meta Name Keyword

The keywords meta tag is old school and was used on every page many years ago. It has been massively abused by spammers who stuffed all kinds of keywords into it, and today it’s outdated. Google announced back in 2009 that it completely ignores it as a ranking factor. Yahoo still indexes the meta keywords but treats it as the weakest of ranking signals.

Bing says: “Getting it right is a nice perk for us, but won’t rock your world. Abusing meta keywords can hurt you”.

<meta name="keywords" content="seo, search engine optimization">

Should I use meta keywords?

Better to save your time and not use it. Even if you carefully put relevant keywords into it, your site may be seen as over-optimized and could get a penalty. The three big search engines give it no real value – the chances of a negative impact are higher than those of an improvement.

Your competitors could also easily spy on the exact keywords you are targeting.

Meta Name Robots

The robots meta tag tells search engines what to add to their index and which links to follow. It’s different from the robots.txt file but sets similar rules. You can tell search engines to drop a page from their index entirely, so this meta tag has significant influence and should be handled with care.

The meta name ‘robots’ is respected by Google, Yahoo and Bing.

<meta name="robots" content="noindex, follow">
<meta name="robots" content="index, nofollow">
<meta name="robots" content="noindex, nofollow">

  • noindex, follow – The spider drops that page from its index but still follows the links on it. This may be an option for a TOS page or a page you don’t want visitors to land on from search engine result pages.
  • index, nofollow – The page is listed but the links on it are not followed.
  • noindex, nofollow – Don’t do anything with this page: don’t list it and don’t follow any links.
  • noimageindex – Disallows search engines from indexing images on that page. This does not guarantee that Google won’t find a link to the image somewhere else.
  • noarchive – Search engines won’t show a cached version of this URL.
  • NONE – equivalent to “noindex, nofollow”.

There are a few other attributes, but the index and follow combinations are the most useful. If the meta tag is not set, Google assumes “index, follow” by default.

 

Meta Name Description

The meta description is placed in your page’s <head> section. Google shows that text in its search results. Always write a custom description for each page that clearly describes the content. If no description tag is found, Google will generate one, which may lead to an unrelated description.

<meta name="description" content=" ... "/>


How to write good Meta Descriptions

  • Keep it below 155 characters
  • Clearly describe the content
  • Tag and separate structured content: Price: $2.99, Author: JohnDoe
  • Add a call-to-action to make people click on the link
  • Provide a solution or benefit for your audience’s problem
  • Add the WordPress SEO plugin from Yoast to easily write custom descriptions for each post
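Putting those rules together, a hypothetical description for the dog training example could look like this:

<meta name="description" content="Learn how to train your puppy in 10 minutes a day – a free step-by-step guide with videos. Start today!"/>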

The meta description is NOT a ranking factor, though it is considered for advanced search queries and it affects the click-through rate, which may indirectly influence your rankings. So why bother with the extra work?

Why you should use it

  • With good call-to-action copywriting your click-through rate will improve (“Click Here for FREE Samples”)
  • Social networks like Google+, Facebook, Digg and LinkedIn grab the meta description when your URL is shared
  • Give searchers another reason to visit your site – it’s your sales pitch and may win against your competitor’s

What you should avoid in your description

  • Spamming your keywords – people actually read the description, and search engines will penalize you
  • Writing misleading summaries
  • Using the same description for different pages – avoid duplication
  • Not providing any related info about your page’s content
  • Leaving it empty – Google will pick something, which in the worst case may be your copyright footer

 

Finally, check your site’s meta descriptions – a quick look at the page source or your SEO plugin’s snippet preview will do.

Website Speed Test

An often overlooked factor of a successful site is its performance. It’s not only Google that loves fast-loading sites. Building a lightning-fast website should be one of your top goals when you are serious about SEO. Google, Bing, Yahoo and – most importantly – your visitors care about site speed, so make sure you’re not missing this factor. It can take hours to fix all the issues and improve the site’s loading time, but it’s worth it.

A fast site will also improve your conversion rates:

  • Amazon found every 100ms of latency cost them 1% in sales.
  • Google found an extra 0.5 seconds in search page generation time dropped traffic by 20%

People don’t have time so build a fast site! The internet will appreciate it.


 

Website Speed Check Tools

We can’t just guess the site speed, so let’s run a website speed test with these online tools – enter your URL and let them do the work.

  • Google PageSpeed Insights – try to get at least 82/100 for the desktop rating. Mobile should also be above 70/100 – the higher the better. Follow the suggestions from the tool and fix your site where possible. Talk to your server admin about response time and setting up a content delivery network.
  • WebPageTest / Pingdom / LoadImpact – these sites check many performance factors and how fast your site is delivered. There is a lot of technical info, but the “Performance Review / Page Analysis” tab should show you where to start. In general they check for CDN availability, server speed, file compression and static content.

There are many more sites that run website speed tests, but Google PageSpeed Insights covers the on-page speed optimization factors and the other tools help to uncover server issues. Focus on Google’s insights – if you reach 90/100 you’re good to go.

 

Quick Site Performance Checklist

  1. Add a content delivery network (CDN) to your site – files are served from multiple locations in parallel and thus load faster.
  2. Use static content and cache your files as much as possible – check out W3 Total Cache plugin for WordPress
  3. Compress all images via tinypng.com (or tinyjpg.com)
  4. Combine several CSS files into one file (same with JavaScript)
  5. Talk to your server admin – optimize .htaccess, Apache configs, browser caching, gzip compression, server-side caching etc. (a sketch follows below)
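A minimal .htaccess sketch for gzip compression and browser caching, assuming the Apache modules mod_deflate and mod_expires are enabled on your server – your hosting setup may differ:

<IfModule mod_deflate.c>
  # Compress text-based assets before sending them to the browser
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>

<IfModule mod_expires.c>
  # Tell browsers to cache static files for a while
  ExpiresActive On
  ExpiresByType image/png "access plus 1 month"
  ExpiresByType text/css "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>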

 

Further Reading:

http://www.carbon60.com/milliseconds-are-money-how-much-performance-matters-in-the-cloud/

http://unbounce.com/conversion-rate-optimization/a-fast-web-site-increases-conversions/

 https://developers.google.com/speed/