Tuesday 29 December 2009

SEO Basic Steps - Part 3

Now Christmas is out of the way I can crack on and get this series of articles finished. In this section I am going to write about generating the required files for the search engines, and about trackers and how to install them.

Required Files

There are two main files I generate for each site: a robots.txt file and a sitemap file. If you are using a blogging platform, these files are often created for you, which saves time. If you are optimizing your own content, it is worth creating your own files. There are simple ways to do this for each file.
  • robots.txt - This file tells the spiders where they can and can't go. If you have a directory full of confirmation pages which you do not want the spiders to index, you can block it using this file. It is a simple text file made up of Allow: and Disallow: lines. A good generator is listed at the bottom of this page.
  • Sitemap - This is a list of pages on your site in an XML format. It allows easy indexing and finding of your pages by the spiders. Remember to update this if you add pages to your site.

Both of these files sit in the root of your website; on most sites you can view the sitemap by adding /sitemap.xml to the URL. A minimal example of each file is shown below.
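
As a quick illustration (example.com and the /confirmation/ directory below are just placeholders for your own site), a minimal robots.txt might look like this:

User-agent: *
Disallow: /confirmation/
Allow: /
Sitemap: http://www.example.com/sitemap.xml

And a matching sitemap.xml listing a single page:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2009-12-29</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>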

Trackers

Since you are going to want to know what traffic your site is getting, a tracker is needed. Google's Analytics service is an excellent resource for finding out who is visiting your site and from where.

The site asks you to place a tracking code just above the closing body tag of your website. This calls some JavaScript that posts visitor information to Google so they can generate the reports. It will not increase your rankings, but it will let you see whether the changes you are making are performing as you expect.
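
For reference, the standard Analytics snippet looks roughly like the following (UA-XXXXXXX-X is a placeholder account ID; always copy the exact code that Analytics generates for your account):

<script type="text/javascript">
// Load ga.js over http or https depending on how the page was served
var gaJsHost = (("https:" == document.location.protocol) ? "https://ssl." : "http://www.");
document.write(unescape("%3Cscript src='" + gaJsHost + "google-analytics.com/ga.js' type='text/javascript'%3E%3C/script%3E"));
</script>
<script type="text/javascript">
try {
  // Create a tracker for your account and record the page view
  var pageTracker = _gat._getTracker("UA-XXXXXXX-X");
  pageTracker._trackPageview();
} catch(err) {}
</script>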

That's it for another article. I will add part 4 soon, which should be the end of the whole process.

Links

Friday 11 December 2009

SEO Basic Steps - Part 2

Continuing from Part 1 on search engine optimization (SEO) strategies, we are going to expand on some of the other points. Last time we discussed keyword selection and content creation. This time we are going to look at meta data and checking the files for the important tags.

Add Meta Data

Now, this step used to be the only way to get your page ranked but it was open to abuse! If you put an entire dictionary in the meta data, your site used to show up no matter what. This is partly the reason that most search engines ignore meta data as much as they can.

Having said that, some don't. So you still have to add the data in order to allow for these older engines.

I have a template I use to produce my meta data. It consists of the keywords section, the description, robot instructions (although the Google spiders ignore them) and, as a recent addition, geo-tag information. If you use this template, just fill in the blanks and it will help your rankings:

<meta name="author" content="" />
<meta name="description" content="" />
<meta name="keywords" content="" />
<meta name="robots" content="index,follow" />
<meta name="Googlebot" content="index,follow" />
<meta name="revisit-after" content="1 day" />
<meta name="distribution" content="Global" />
<meta name="DC.Title" content="" />
<meta name="DC.Description" content="" />
<meta name="DC.Date" content="2008-10-20" />
<meta name="DC.Type" content="Interactive Resource" />
<meta name="DC.Format" content="HTML" />
<meta name="DC.Identifier" content="" />
<meta name="DC.Language" content="en-gb" />
<meta name="DC.Coverage" content="Global" />
<meta name="DC.Rights" content="" />
<meta name="DC.Creator" content="" />
<meta http-equiv="content-language" content="en-gb" />
<meta http-equiv="pragma" content="no-cache" />


Adding this code and filling in the blanks will cover the basics of the meta data required. Adding geo-tag information alongside it rounds things out; an example is shown below.
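
As an example, the geo-tag section for a site based in North Wales might look something like this (the region code, place name and coordinates are placeholders; swap in your own location):

<meta name="geo.region" content="GB-WLS" />
<meta name="geo.placename" content="Wrexham" />
<meta name="geo.position" content="53.05;-3.00" />
<meta name="ICBM" content="53.05, -3.00" />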

Basic HTML Check

There are a few basic tags which are used by most spiders to find items on your site. It is important to keep these on your pages or they will not rank well. The things to look for are:

  • H1 - Heading tags, particularly the H1, are used to say what your page is about. The H1, being the main heading, should be located at the top of the page so it is easy to find.
  • Alt Tags - A handy way to get more keywords in, these are used to describe your pictures. Make sure they actually do describe the picture or they will be of no use.
  • Title - In the head section of the page, there should be a title. I generally put the site name then a comma and then some of the keywords for the site, but keep them relevant to the content!
  • Meta Tags - As covered above; again, keep them relevant to the site.

These are the main areas. It is also worth noting that div tags are generally the preferred method of laying out the page; tables do work, but they are slower to load and you do not have as much control as you do with divs. A stripped-down page showing these elements is sketched below.
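
For illustration (the site name and keyword text are made up for the example, and the meta data template from earlier in this article goes in the head):

<html>
<head>
  <title>Example Widgets, cheap blue widgets, widget repairs</title>
  <!-- meta data template goes here -->
</head>
<body>
  <div id="header">
    <h1>Cheap Blue Widgets from Example Widgets</h1>
  </div>
  <div id="content">
    <p>Opening paragraph working in the main keyword phrases...</p>
    <img src="blue-widget.jpg" alt="A cheap blue widget ready for dispatch" />
  </div>
</body>
</html>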

That's it for part 2; part 3 will be coming soon.

Wednesday 9 December 2009

SEO Basic Steps - Part 1

Whether I am looking at a website in order to update its SEO or starting a site from scratch, I have spent ages looking around for a "do a, b, c and you will get traffic" model, and it simply doesn't exist. So I had to come up with my own formula for optimizing websites to make sure I do not miss anything.

This formula (or list of steps) breaks down to 10 stages which I follow in order to check the content. These are:

  1. Select Keywords
  2. Write Content
  3. Add Meta Data
  4. Check & Alter Page HTML Tags
  5. Generate Required Files
  6. Install Trackers
  7. Submit to Search Engines
  8. Submit To Directories
  9. Purchase Additional Domains
  10. Wait For Results

Within the next few articles I will explain how each step will affect your page ranking and what would happen if it was missed.

Select Keywords

There is no magic way of doing this, but if you know your product then it is simple. I often start by writing down a list of 5 keywords that people would type into a search engine and expect your site to show up in the results. Doing this gives you a basic set of words for your articles, but remember to avoid stop words in this list (such as "and", "the", "it", "I", etc.) because they do not help in the next step.

Because this is such a low number we need to expand the list, and this is where keyword analysis tools such as Google's keyword tool come in handy.

Using this tool, enter each of your keywords individually and look at what is returned. The aim is to select some phrases which will have a higher search rate than the current ones you have selected.

At this point it is important to note that there is no point in just selecting single words; the best bet is to select some phrases with 3 words, some with 2 and some with a single word in order to generate a decent list of keywords. Save this list somewhere safe, we will need it in a later section.

Write Content

Unfortunately, there is no substitute for well written and meaningful content, so you can't skip this step or get a machine to do it for you. The best bet is to take your original list of keywords and base your article on it. Getting at least 2 keyword phrases into the first paragraph will help the rankings for those keywords. This is because when you begin to read a document, you look at the first few paragraphs to see if it's worth continuing, and the spiders appear to do the same thing: they look at the first sections to see if the content is worth looking at or ignoring before continuing to the rest of the document. If you miss this step, you will end up with the first few paragraphs full of complete rubbish and the site won't index well.

I will expand on the next steps in the following articles and link them to here, but that's it for now.

Tuesday 8 December 2009

Google's "Real Time Search"

Today Google has started to launch its real time search results. These are scrolling results displaying the latest information, using some sort of AJAX postback so they are automatically updated. The results are appearing in the middle of the SERPs and they're quite interesting to watch.

I am not sure how Google selects which results to show, and I am sure these details will come out in time, but to actually see real time information is an amazing achievement.

If you can't see the real time news then it might be worth checking out another of Google's latest tools, Google Trends. I blogged about this a few weeks back; it allows you to compare items to what was happening at a point in time, and it has now gone live and shows what people are currently looking for on the internet.

Select a keyword from the Trends site (see links) and type it into Google. You should be able to view the latest results in a panel about half way down the rankings. Apparently, adding &esrch=RTSearch to the end of the results URL will make them appear if they don't show up, though I've not tried this myself.

I hope there is some sort of relevance checking for posts on this system, otherwise it is open to abuse from the black hat SEO community pushing sites which have nothing to do with the content just to get a link.

Anyway, it's worth checking this out; it's quite a landmark in search engine terms (well, for me it is anyway).

Links

www.google.co.uk/trends

Wednesday 2 December 2009

Do Web Standards Help SEO?

I myself have often searched around for a set of "do x, y, z and it will improve your rankings" rules, but with no concrete results. I've come to the conclusion that good quality SEO begins as soon as you write that first HTML tag. By this I mean that if you follow the standards, your site will always rank well. But what are my reasons for this sweeping statement?


Well, when I write code to import or manipulate data (be it in .Net or some other language), it has to be in a certain format or else it will not get split how I want it. One character in a position it was not expected to be in can throw off the whole algorithm and the remaining data, or in an extreme case cause an error. So it makes sense that, when it comes to indexing a page, the easier it is for the spider to crawl the site, the better.


Since the spiders are effectively taking the content and splitting it down into sections, so they can pull out which items they need to read and which they don't, the more rigid the document the better. Taking the CSS and XHTML standards seriously gives the spiders a guide to what to look for and where to expect it, so in theory a standards-compliant site should index well.


Or would it? Since the spiders appear to only pull out the elements they require, such as the hyperlink (anchor) tags, and then scan for heading tags and paragraph tags, why would the document need to be correctly structured throughout? It might be that they don't even bother to look at standards, and these web standards are mainly there to help browser developers display pages consistently.

This is only my opinion, but I would say that sticking to the rules for generating a website (i.e. adhering to web standards) is a must and can only increase your ranking. Perhaps not because the site is well structured, but more because it is easier for a person to read and recommend. So do the standards help? I would have to say yes.

Tuesday 1 December 2009

Directory Submission Services - Are they worth it?

We recently undertook a project to provide SEO for a company's website. The owner was getting around 50 hits a week and he wasn't happy. The site is a telecommunications site which serves the North Wales/North West area of the UK, and it's quite a competitive market, so we thought it would be reasonable to add the site to as many directories as possible, for 2 reasons.

1) It provides back links to the site, which, as we know, increases the page ranking because Google sees the site as a better resource.

2) Business directories are always well used; it's free advertising, and the more directories you are on, the more chance there is of someone looking at one of them and finding your company.

Now, submitting to thousands of these sites would take my colleagues and me ages, so long in fact that it would cost the client an absolute fortune. Because of this, we decided to enlist the services of a third party company called Directory Maximizer who, for a minimal fee per listing, will submit your site to these directories. They even handle the final submission email for you, so you don't need to worry about that.

In our experience this saved us a lot of time, but the pages were slow to be indexed and to link back to the customer. So far we have submitted to over 1000 directories but have only got around 52 links back from them, so around 5%. It's not a bad return and the links will increase over time, so that's not an issue. The customer's happy as well, because we can give them a list of sites they have been submitted to and they can go and see the results for themselves.

From an SEO point of view, would I use this service again? Possibly. It saved me a lot of time and meant we got the information onto those directories for a minimal cost, so it's a cost-effective way to get listed. The 5% return doesn't sound like a vast figure, but when you consider it took 10 minutes to sign up to the service it is a fair return.

So, on my next project, if it's a business user and they want a massive insertion into directories then yes, I will use the service; if it's a small project it might be better to target only the directories required, to maintain quality.