Archive for the 'Search Engine Optimisation' Category

Solving Website Structural Problems With The Canonical Tag

Long time no blog! I hope you all had a good festive season. I thought I would kick off the new year with a technical post, as Google announced cross-domain support for the canonical tag last month (the announcement is worth reading for its explanations of when you might want to use it and how to implement it).

You may remember from my earlier post on the canonical tag that it is a way of telling the search engines the “master” address of a page when multiple addresses for the same content might exist. Why would you have multiple addresses (URLs) for a page, you might wonder? Well, how about a product list on an e-commerce website with options for ordering the products alphabetically, by price or by manufacturer? It’s likely that the URL will be different in some way for each version of the list, even though its contents are actually the same. That means that a search engine will index all three versions (or possibly six if you have reverse-order options too).

Why is this a problem? Well, you probably want visitors to see that list in a certain order the first time they visit – let’s say ordered by price, cheapest first. If Google has all six versions of that page in its database, what’s to say it won’t link to your price-descending (i.e. most expensive first) list from its search results? That might make you look expensive and put off potential buyers.

The other issue is link juice – with multiple addresses for the same page, you might have some links pointing to one URL and some to another, all essentially to the same page, but to Google they are different pages. That means the link juice is being split between those different versions of the page. Using the rel=canonical tag, you can tell Google which is the master version of the page, so that all link juice is applied to that version and it’s the one that appears in search results.

This is what it looks like (with a hypothetical URL in the href attribute):

<link rel="canonical" href="http://www.example.com/products/">

It goes in the <head> section of each version of the page, so in the product list example, your page would contain the above code regardless of what version is being displayed at the time. This would probably be done automatically by your content management system, so that when a different category of products is being displayed, the canonical tag references the correct category/product list, because it’s likely the same page template is used for all categories.
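As a sketch of what such a content management system might do (the parameter names here are hypothetical – adapt them to whatever your system actually emits), the canonical URL can be derived by stripping the presentation-only query parameters from the requested address:

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Query parameters that only change presentation, not content.
# (Hypothetical names -- use whatever your CMS actually emits.)
PRESENTATION_PARAMS = {"sort", "order"}

def canonical_tag(url):
    """Build a rel=canonical tag for the master version of `url`:
    the same address with presentation-only parameters removed."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k not in PRESENTATION_PARAMS]
    canonical = urlunparse(parts._replace(query=urlencode(kept)))
    return '<link rel="canonical" href="%s">' % canonical

# All sort variations of the product list map to one master URL:
print(canonical_tag("http://www.example.com/widgets?cat=3&sort=price&order=desc"))
# -> <link rel="canonical" href="http://www.example.com/widgets?cat=3">
```

With this approach, every sorted variation of the list emits the same tag, so the search engines always see one master address.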

In effect, the canonical tag works like a 301 redirect, but without you having to mess around with server settings. What changed in December is that you can now use cross-domain (i.e. cross-website) canonical tags, where before you could only use the tag within one domain. So even those of you with problematic servers (for example, on shared Windows hosting without access to IIS Admin) can now create “301”-style redirects, avoiding duplicate content issues.
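For instance, when a site moves and a server-side redirect isn’t available, each page on the old domain could carry a tag in its <head> pointing at its equivalent on the new domain (hypothetical domains shown here):

```html
<!-- In the <head> of http://www.old-example.com/widgets/ -->
<link rel="canonical" href="http://www.new-example.com/widgets/">
```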

As noted by Rand, there is no harm in including the canonical tag in the “master” page itself (a self-referencing canonical).

Changes to Google’s First Click Free Policy

You may recall that I wrote last year about Google’s First Click Free policy, which lets online publishers protect their content while still letting Googlebot in to index it, so that all their lovely keyword-rich content isn’t hidden behind a “pay-wall” (i.e. password protected for paying users).

With a lot of hoo-hah about blocking Google from news sites, led by Rupert Murdoch (draw your own conclusions about that one…), Google have announced a change to their First Click Free policy, so that publishers can now limit visitors arriving from Google to five free clicks per day.

For those with paid-for content, this is probably good news, but you can’t help feeling that the site owners need Google more than Google needs them…

New Google Webmaster Tools Labs Features

Google launched a new Labs section of Webmaster Tools today, containing two features. The first is called Fetch as Googlebot, which shows you the page Google receives when you enter a URL from your website. It’s quite handy to see what Googlebot sees, particularly the HTTP headers. Here’s a screenshot of the tool showing the 301 permanent redirect from the old holding page to my new homepage on the Keyword Examiner site:
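As a rough local approximation of what the tool reports, you can request a page with Googlebot’s user-agent string and dump the status line and headers. This is only a sketch – the real crawler differs in many ways, and note that urllib follows redirects by default, so unlike Fetch as Googlebot you would see the destination page’s headers rather than the 301 itself:

```python
import urllib.request

# Googlebot's published user-agent string (see google.com/bot.html).
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def build_request(url):
    """Build a request identifying itself as Googlebot."""
    return urllib.request.Request(url, headers={"User-Agent": GOOGLEBOT_UA})

def fetch_as_googlebot(url):
    """Print the HTTP status and response headers for `url`."""
    with urllib.request.urlopen(build_request(url)) as resp:
        print(resp.status, resp.reason)
        for name, value in resp.getheaders():
            print("%s: %s" % (name, value))
```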


The other tool reports any malware found on your site, but I’m happy to report I can’t give you a screenshot from any of my sites for that! ;)

Keyword Examiner keyword research tool launched at last!

At long last, I’m pleased to announce the launch of my Keyword Examiner tool, which I’ve been trying to get finished for the best part of a year!

The software itself has been working since January, but I haven’t had the time to put the marketing and support elements together – until now. You can see for yourself at the new website here:

In a nutshell, the tool is a huge timesaver when conducting keyword research for organic SEO. It lets you search Google AdWords keyword data, just as you would with their external keyword tool, so you can identify search phrases that people are actually using. It then runs up to three searches per keyword to gauge how competitive each one is likely to be if you optimise a page for it, using exact match (“in quotes” searches), intitle (the exact phrase in the title tag) and allinanchor (the exact phrase in link text pointing to a page).

In this way, you can quickly tell whether a phrase is likely to be easy or difficult to optimise for, identifying the “low hanging fruit” as you go. To do this manually takes hours of cutting, pasting and searching, but Keyword Examiner automates the whole process once you’ve selected the keywords you’re interested in. You can even import WordTracker data if you want extra information (requires a WordTracker subscription).
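The three competitiveness checks described above can be expressed as ordinary Google query strings. This is a hypothetical sketch of the idea, not Keyword Examiner’s actual code:

```python
# Build the three competitiveness-check queries for a phrase; in practice
# the result counts of these searches would then be compared.
def competition_queries(phrase):
    return {
        "exact":       '"%s"' % phrase,             # exact-match ("in quotes") search
        "intitle":     'intitle:"%s"' % phrase,     # exact phrase in the title tag
        "allinanchor": "allinanchor: %s" % phrase,  # phrase in inbound link text
    }

for name, query in competition_queries("keyword research tool").items():
    print("%-12s %s" % (name, query))
```

Running each query manually and noting the result counts is exactly the cut-and-paste slog the tool automates.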

I won’t explain further – you can read about it in detail on the website. If you think you can send some subscribers my way, there’s also a great affiliate programme that pays 40% lifetime commission.

I’d love to hear whether you like the product and from those who subscribe, what you get up to using it. :)

Link Building Seminar in the East Midlands

As I’m sure many of you know, I do a lot of presentations and workshops for the eBusiness Programme here in the East Midlands. Well, in November I’ll be delivering a short presentation on link building advice at a number of venues. The marketing bumpf goes like this:

Top Link Building Advice for Search Engine Optimisation

Breakfast Briefing

Without doubt, the hardest part of search engine optimisation is getting links to your site, but without them, appearing high in the list for competitive keywords is virtually impossible. Obtaining links is a time-consuming, ongoing process, but it can’t be ignored if you want your SEO to succeed.

This session provides a wealth of tips on generating powerful links to your website, with the latest thinking on what works and time-saving methods to help you make the most of your link building activity. From Web 2.0 sites to social media and good old fashioned directories, you will learn where on the web the valuable links can be found.

Topics include:

  • Directories that still provide value
  • How to use user-generated content sites (Web 2.0) for valuable links
  • Using social media to generate links
  • Free tools you can use to find websites to get links from

08:30 Registration, breakfast and networking
09:00 Presentation Starts
10:00 Q & A
10:30 Networking and close

Who Should Attend?

This session will be useful to any business looking to improve their search engine optimisation and builds on the information in our Successful and Advanced Search Engine Optimisation workshops.

Event dates:

  • Lincolnshire 04 November 2009 Boston West Golf Club, Boston
  • Derbyshire 05 November 2009 Ringwood Hall, Chesterfield
  • Leicestershire 11 November 2009 National Space Centre, Leicester
  • Nottinghamshire 17 November 2009 The Village Hotel, Nottingham
  • Northamptonshire 24 November 2009 Freemasons Hall & Conference Centre, Northampton

Please note that places are limited and must be pre-booked. You can book online at the eBusiness Programme website here.

For more information or to register your place, please email or contact the eBusiness Programme team on 0845 603 8370.

New Google Features

Google have been busy adding new features – you’ve probably noticed the larger font in the search box, for instance.

One thing you should definitely be aware of if you have a Google Maps listing (Local Business Centre) is the new Place Pages, as detailed here. It’s worth having a read to understand how this could affect what people see about your business when they find it through a Google Maps search (which often appears at the top of the search results if you search for a business name and/or place). There isn’t much that’s different from the old Local Business Centre listing, except for the addition of adverts – so you could see your competitors listed alongside your own information! This is what’s happening with some of the ads in the page below:


If you happen to search for a “hot topic” (Google’s list of which is here), you might find a graph of exactly how hot the topic is appearing in the main search results. Details are here, but sadly it only covers the US and Japan at present.

Finally, you may notice some additional links appearing in search result pages under some website entries. These are designed to take you directly to the section of the page that’s relevant to your search, using in-page (named) anchor tags that the page’s author has included. Why Google are doing this and what it looks like is explained here, whilst what you can do to utilise this as a webmaster is explained here.

Google Ignores the Meta Keywords Tag (In Case You Didn’t Know!)

Just in case anyone had any lingering doubts about whether Google uses the meta keywords tag in web pages, they categorically state here that they don’t.

You might also be interested in this explanation of how Google handles duplicate content and why it’s not so much a penalty, as a simple outcome of the per-search algorithm, courtesy of Google’s Greg Grothaus.

Google Internet Stats

I noted on SearchEngineLand that Google has introduced an Internet Statistics landing page, particularly focused on the UK:

There are stats on various topics, almost all internet/media/technology related, although some macro-economic stats are also available.

A variety of sources have been used and you can even submit your own stats. Interesting stuff! :)

SEOMoz 2009 Ranking Factors Report

Just a quick link: the SEOMoz 2009 Ranking Factors report has been published and it’s no surprise to see that external link anchor text, title tags and link popularity are seen as the most important factors by a reasonable consensus of SEO experts.

Go and have a read here.

Dodgy SEO Company Marketing Tactics

A client of mine forwarded an email (let’s be honest, spam) he received last week from an SEO company in the north-west. I thought I’d write about it here, but I’m not going to name names, because frankly I can’t be bothered with the hassle of dealing with a company that uses these tactics.

The email is quite long, but I shall give you an overview of my main areas of concern (which of course made me quite angry at the time, as it effectively questions my services to the client):

  1. “Your website is probably underperforming in the major search engines… I struggled to find you in the first couple of pages of Google…” The email doesn’t state what search terms the sales chap was using. So, um, exactly how does he define “underperforming”? The site is in the top 10 (mostly the top 2) for all the target phrases I agreed with the client, based on solid keyword research and the client’s target market. Very misleading, but to the uninitiated, it sounds very serious.
  2. “I ran a back-link check on your site… Your website has 2 back-links, meaning it’s not very popular.”
    Riiiiight, exactly what did the sales idiot use to get this information? I’m guessing Google’s link: command, which has been broken for longer than I can remember. The only way to know what links to your site Google knows about is to use Webmaster Tools, which this guy can’t have had access to. After that, it’s Yahoo’s Site Explorer, which reports 204 links to the site in question. Way to go, salesboy… You’re scaring my client and wasting my time as I explain the real situation to them. Thanks.
  3. “Right now, your site has only 6 pages indexed by Google, which is quite low. This can be down to many reasons which our service can help resolve.”
    Um, yeah, the main reason is there are only six pages on the website, genius! Now, I agree that more content will generally help with SEO, but for this particular client it isn’t necessary (see above point about being in the top 10 for all target phrases).

As you can tell, I’m not very impressed with this “research”; these guys are frightening people and causing trouble for reputable SEOs as a result. The email states that they’re prepared to enter into a 12-month contract at £175 per month to promote five phrases – so basically they want £2,100 a year to SEO five phrases on your site. I’ll leave you to decide whether that’s good value, considering they don’t appear to be able to do even basic SEO research properly.