Tuesday 29 December 2009

SEO Basic Steps - Part 3

Now Christmas is out of the way I can crack on and get this series of articles finished. In this section I am going to cover generating the required files for the search engines, plus the use of trackers and how to install them.

Required Files

There are two main files I generate for each site: a robots.txt file and a sitemap file. If you are using a blogging platform, these files are often created for you, which saves time. If you are optimizing your own content, it is worth creating them yourself, and there are simple ways to do this for each file.
  • robots.txt - This file tells the spiders where they can and can't go. If you have a directory full of confirmation pages which you do not want the spiders to index, you can block it using this file. It is a simple text file made up of Allow: and Disallow: directives. A good generator is listed at the bottom of this page.
  • Sitemap - This is a list of the pages on your site in an XML format. It makes it easy for the spiders to find and index your pages. Remember to update it if you add pages to your site.

Both of these files sit in the root of your website; you can view the sitemap of most sites by adding /sitemap.xml to the domain. Minimal examples of both files are sketched below.
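
As a rough illustration, a minimal robots.txt might look like this. The /confirmation/ directory and the example.com address are placeholders, not from a real site:

# example only - block a directory of confirmation pages
User-agent: *
Disallow: /confirmation/
Sitemap: http://www.example.com/sitemap.xml

And a matching sitemap.xml for a two page site would be along these lines (again, the URLs and date are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<!-- the example.com URLs below are placeholders -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2009-12-29</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/about.html</loc>
  </url>
</urlset>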

Trackers

Since you are going to want to know what traffic is coming to your site, a tracker is needed. Google's Analytics service is an excellent resource for finding out who is visiting your site and from where.

The site asks you to place a tracking code just above the closing body tag of your website, which calls some JavaScript to post your visitor information to Google so they can generate reports. Now, this will not increase your rankings, but it will let you see whether the changes you make are performing how you expect.
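
For reference, the standard tracking snippet Analytics hands you looks something like this at the time of writing; the UA-XXXXXXX-X value is a placeholder for your own account ID:

<script type="text/javascript">
var gaJsHost = (("https:" == document.location.protocol) ? "https://ssl." : "http://www.");
document.write(unescape("%3Cscript src='" + gaJsHost + "google-analytics.com/ga.js' type='text/javascript'%3E%3C/script%3E"));
</script>
<script type="text/javascript">
try {
// replace UA-XXXXXXX-X with your own account ID from Analytics
var pageTracker = _gat._getTracker("UA-XXXXXXX-X");
pageTracker._trackPageview();
} catch(err) {}
</script>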

That's it for another article. I will add part 4 soon, which should be the end of the whole process.

Links

Friday 11 December 2009

SEO Basic Steps - Part 2

Continuing from Part 1's look at search engine optimization (SEO) strategies, we are going to expand on some of the other points. Last time we discussed keyword selection and content creation. This time we are going to look at meta data and checking your pages for the important HTML tags.

Add Meta Data

Now, this step used to be the only way to get your page ranked, but it was open to abuse! If you put an entire dictionary in the meta data, your site would show up no matter what was searched for. This is partly the reason that most search engines now ignore meta data as much as they can.

Having said that, some don't. So you should still add the data to cater for these older engines.

I have a template I use to produce my meta data. It consists of the keywords section, description, robot instructions (although the Google spiders ignore them) and, as a recent addition, geo-tag information. If you use this template, just fill in the blanks and it will help your rankings:

<meta name="author" content="" />
<meta name="description" content="" />
<meta name="keywords" content="" />
<meta name="robots" content="index,follow" />
<meta name="Googlebot" content="index,follow" />
<meta name="revisit-after" content="1 day" />
<meta name="distribution" content="Global" />
<meta name="DC.Title" content="" />
<meta name="DC.Description" content="" />
<meta name="DC.Date" content="2008-10-20" />
<meta name="DC.Type" content="Interactive Resource" />
<meta name="DC.Format" content="HTML" />
<meta name="DC.Identifier" content="" />
<meta name="DC.Language" content="en-gb" />
<meta name="DC.Coverage" content="Global" />
<meta name="DC.Rights" content="" />
<meta name="DC.Creator" content="" />
<meta http-equiv="content-language" content="en-gb" />
<meta http-equiv="pragma" content="no-cache" />


Adding this code and filling in the blanks will cover the basics of any meta data required. Adding geo-tag information on top of it will help further (see my earlier Geotagging post for the tags to use).

Basic HTML Check

There are a few basic tags which most spiders use to find items on your site. It is important to keep these on your pages or they will not rank well. The things to look for are (a bare-bones example follows the list):

  • H1 - Heading tags tell the spiders what your page is about. The H1, being the main heading, should be located at the top of the page so it is easy to find.
  • Alt text - A handy way to get more keywords in, these attributes describe your pictures. Make sure they actually do describe the picture or they will be of no use.
  • Title - In the head section of the page, there should be a title. I generally put the site name then a comma and then some of the keywords for the site, but keep them relevant to the content!
  • Meta Tags - As covered above; keep them relevant to the site though.
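
Pulling those together, a bare-bones page with the important tags in place might look like this. The site name, keywords and image are made-up placeholders:

<html>
<head>
<!-- title, description and image below are placeholders -->
<title>Example Site Name, main keyword, second keyword phrase</title>
<meta name="description" content="A short description built around the keywords" />
</head>
<body>
<h1>Main Keyword As The Page Heading</h1>
<p>An opening paragraph that works the keyword phrases in early.</p>
<img src="office.jpg" alt="Our computer support office in North Wales" />
</body>
</html>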

These are the main areas. It is also worth noting that div tags are generally the preferred method of laying out a page. Tables do work, but they are slower to load and you do not have as much control as you do with divs.
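
As a quick illustration, a simple two column layout built with divs rather than a table can be as little as this (a minimal sketch; in practice the styles would live in a CSS file):

<!-- inline styles for brevity only; normally these go in a stylesheet -->
<div style="float: left; width: 75%;">Main content here</div>
<div style="float: right; width: 25%;">Sidebar here</div>
<div style="clear: both;"></div>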

That's it for part 2; part 3 will be coming soon.

Wednesday 9 December 2009

SEO Basic Steps - Part 1

Whether I am looking at a website in order to update its SEO or starting a site from scratch, I have spent ages looking around for a "do a, b, c and you will get traffic" model, and it simply doesn't exist. So I had to come up with my own formula for optimizing websites, to make sure I do not miss anything.

This formula (or list of steps) breaks down into 10 stages which I follow in order to check the content. These are:

  1. Select Keywords
  2. Write Content
  3. Add Meta Data
  4. Check & Alter Page HTML Tags
  5. Generate Required Files
  6. Install Trackers
  7. Submit to Search Engines
  8. Submit To Directories
  9. Purchase Additional Domains
  10. Wait For Results

Over the next few articles I will explain how each step affects your page ranking and what would happen if it were missed.

Select Keywords

There is no magic tool for this, but if you know your product it is straightforward. I often start by writing down a list of 5 keywords that people would type into a search engine expecting your site to show in the results. Doing this gives you a basic set of words for your articles. Remember to avoid stop words in this list (such as "and", "the", "it", "I", etc.) because they do not help in the next step.

Because this is such a low number, we need to expand the list, and this is where keyword analysis tools such as Google's keyword tool come in handy.

Using this tool, enter each of your keywords individually and look at what is returned. The aim is to select some phrases which will have a higher search rate than the current ones you have selected.

At this point it is important to note that there is no point in selecting only single words; the best bet is to select some three-word phrases, some two-word phrases and some single words in order to generate a decent list of keywords. Save this list somewhere safe; we will need it in a later section.
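
To make this concrete, a finished list for a fictional computer support company might look like the following (entirely made up for illustration):

computer support
it support north wales
emergency pc repair
business computer maintenance
small business it support chester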

Write Content

Unfortunately, there is no substitute for well written and meaningful content, so you can't skip this step or get a machine to do it for you. The best bet is to take your original list of keywords and base your article on it. Getting at least two keyword phrases into the first paragraph will help the rankings for those keywords. This is because when you begin to read a document, you look at the first few paragraphs to see if it's worth continuing, and the spiders appear to do the same thing: they look at the first sections to judge whether the content is worth reading before continuing to the rest of the document. If you skip this step and the first few paragraphs are full of rubbish, the site won't index well.

I will expand on the next steps in further articles and link them from here, but that's it for now.

Tuesday 8 December 2009

Google's "Real Time Search"

Today Google has started to launch its real time search results. These are scrolling results displaying the latest information, using some sort of AJAX postback so they update automatically. The results are appearing in the middle of the SERPs and they're quite interesting to watch.

I am not sure how Google selects which results to show, and no doubt the details will come out in time, but to actually see real time information is an amazing achievement.

If you can't see the real time news then it might be worth checking out another of Google's latest tools, Google Trends. I blogged about this a few weeks back; it allows you to compare search terms against what was happening at a point in time, and it has now gone live and shows what people are currently looking for on the internet.

Select a keyword from the Trends site (see links) and type it into Google. You should be able to view the latest results in a panel about half way down the rankings. Apparently, if they don't show, adding &esrch=RTSearch to the end of the results URL will make them appear, though I've not tried this myself.
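
If you want to try it, the full results URL would presumably look something like this (untested, and the keyword is just an example):

http://www.google.com/search?q=your+keyword&esrch=RTSearch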

I hope there is some sort of relevance checking on this system, or else it is open to abuse from the black hat SEO community pushing sites which have nothing to do with the content just to get a link.

Anyway, it's worth checking this out; it's quite a landmark in search engine terms (well, for me it is anyway).

Links

www.google.co.uk/trends

Wednesday 2 December 2009

Do Web Standards Help SEO?

I have often searched around for a set of "do x, y, z and it will improve your rankings" rules, but with no concrete results. I've come to the conclusion that good quality SEO begins as soon as you write that first HTML tag. By this I mean that if you follow the standards, your site will always rank well. But what are my reasons for this sweeping statement?

Well, when I write code to import or manipulate data (be it in .Net or some other language), it has to be in a certain fashion or else it will not be split how I want it. One character in a position the code was not expecting can throw off the whole algorithm and the remaining data, or in an extreme case cause an error. So it makes sense that the easier a page is for a spider to crawl, the better it will be indexed.

Since the spiders are effectively taking the content and splitting it down into sections, so they can pull out the items they need to read and skip what they don't, the more rigid a document is the better. Taking the CSS and XHTML standards seriously gives the spiders a guide to what to look for and where to expect it, so in theory a standards-compliant site should index well.

Or would it? Since the spiders appear to pull out only the elements they require, such as the hyperlink (<a>) tags, and then scan for heading and paragraph tags, why would the document need to be in the correct form? It might be that they don't bother with standards at all, and that web standards exist mainly to help browser developers display pages consistently.

This is only my opinion, but I would say that sticking to the rules for generating a website (i.e. adhering to web standards) is a must and can only increase your ranking. Perhaps not because the site is well structured, though; more because it is easier for a person to read and recommend. So do the standards help? I would have to say yes.

Tuesday 1 December 2009

Directory Submission Services - Are they worth it?

We recently undertook a project to provide SEO for a company's website. The owner was getting around 50 hits a week and he wasn't happy. The site is a telecommunications site which services the North Wales/North West area of the UK, and it's quite a competitive market, so we thought it would be reasonable to add the site to as many directories as possible, for two reasons.

1) It provides back links to the site, which, as we know, increases the page ranking because Google sees the site as a better resource.

2) Business directories are always well used. It's free advertising, and the more directories you are on, the more chance there is of someone finding your company through one of them.

Now, submitting to thousands of these sites would take my colleagues and me ages; so long, in fact, that it would cost the client an absolute fortune. Because of this, we decided to enlist the services of a third party company called Directory Maximizer who, for a minimal fee per listing, will submit your site to these directories. They will even handle the confirmation email for each submission, so you don't need to worry about that either.

In our experience, this saved us a lot of time, but the directories were slow to index the pages and link back to the customer. So far, we have submitted to over 1,000 directories but have only got around 52 links back from them, so around 5%. It's not a bad return, and the links will increase over time, so that's not an issue. The customer's happy as well, because we can give them a list of the sites they have been submitted to and they can go and see the results for themselves.

From an SEO point of view, would I use this service again? Possibly. It saved me lots of time and meant we got the information onto those directories for minimal cost, so it's a cost-effective way to get listed. The 5% return doesn't sound like a vast figure, but when you consider it took 10 minutes to sign up to the service, it is a fair return.

So, on my next project, if it's a business user and they want a mass insertion into directories, then yes, I will use the service; if it's a small project, it might be better to target only the directories required, to maintain quality.

Monday 2 November 2009

Privacy Policy

This website/blog uses third-party advertising companies to serve ads when you visit this site. These third parties may collect and use information (but not your name, address, email address, or telephone number) about your visits to this and other websites in order to provide advertisements about goods and services of interest to you. If you would like more information about this practice and to know your choices about not having this information used by these companies, you can visit Google's Advertising and Privacy page.

If you wish to opt out of advertising companies tracking and tailoring advertisements to your surfing patterns, you may do so at the Network Advertising Initiative. Google uses the DoubleClick DART cookie to serve ads across its AdSense network; you can get further information regarding the DART cookie at DoubleClick, as well as opt-out options at Google's Privacy Center.

Privacy

I respect your privacy and I am committed to safeguarding it while you are online at this site, makemoneyforbeginners.blogspot.com. The following discloses how I gather and disseminate information for this Blog.

Log Files and Stats

Like most blogging platforms, I use log files, in this case Statcounter. These store information such as internet protocol (IP) addresses, browser type, internet service provider (ISP), referring/exit/visited pages, platform used and date/time stamps, to track users' movement around the site as a whole and gather broad demographic information for aggregate use. IP addresses etc. are not linked to personally identifiable information.

Cookies

A cookie is a piece of data stored on the user’s computer tied to information about the user. This blog doesn't use cookies. However, some of my business partners use cookies on this site (for example - advertisers). I can't access or control these cookies once the advertisers have set them.

Links

This Blog contains links to other sites. Please be aware that I am not responsible for the privacy practices of these other sites. I suggest that my users be aware of this when they leave this blog, and that they read the privacy statements of each and every site that collects personally identifiable information. This privacy statement applies solely to information collected by this Blog.

Friday 11 September 2009

Google PageRank, An SEO Experiment

I've just been looking around for information on how to raise a Google PageRank, to make sure I am doing the right things, and came across a site called 10PageRank.com.

The idea behind it is that people link to the site and it links back to their sites in order to increase the PageRank. Once the ranking hits 10, they are donating the domain to whoever has the most referrals.

It's a good idea for all concerned, I think. As an SEO experiment, it will be quite interesting to see a page reach a rank of 10, and how long it takes for that to happen. From a marketing point of view, you are getting a back link from a highly ranked domain, which will give added weight to your pages.

It's well worth a look and a read, and it's worth adding your domain to get the back links.

Increase your page ranking

Thursday 10 September 2009

Directory Submissions for Back Links And SEO

Recently I have undertaken a basic SEO project for a customer. It's nothing too fancy compared to the thousands of pounds that some companies charge, but it's a fair amount of work, so I have directed a large proportion of my time to it.

My basic plan was to look over the site and make sure that all the wording was meaningful and related to the company. I also added some geotag information, checked over the keywords in the meta header (although people say it doesn't help, older engines may still use this method) and then began the process of getting back links to the site.

This customer already lists many related companies on their site, so we asked those companies to get in contact so we could provide relevant link text for their websites and generate back links. I also wanted to add the company to directories, such as the local business ones, to provide simple free back links.

Having done this before, I knew it took an age to add a site to the directories, and I figured there must be a simpler way of doing it. I began looking for a free tool to do the work, but as always these are hard to come by, and the ones at the top of the list cost a fair amount with no guarantee that they did exactly what I wanted.

My next thought was that the larger SEO companies can't be employing staff at £6 plus an hour to enter sites into directories; surely there must be a company that specialises in just this? Looking in Google, I came across a company called Directory Maximizer. The idea is that they take your submission details and put them into over 1,200 directories in order to generate the back links.

Being in the UK, I did think these directories would be quite USA specific, but there are a few UK based directories among them, which helps a lot. I also figured that having them add my client's pages to 1,200 directories is a lot faster than doing the same thing manually, and since they do it day in, day out, I would say they know what they are doing.

Reading further into their site: for a cost of $0.14 per directory they will submit your site, and for an additional $0.02 per directory they will handle the confirmation email you get back. Since it is a paid service I thought a lot of the directories might be useless, but the list is freely available on their site, so you can check it, or even do the submissions yourself, if you like.

I signed the website up and paid for the full listing. It took the company about a week to complete all my submissions, and I got a refund on the ones that failed, which is good of them. It will probably take about 90 days for all of them to go through, but I will keep you updated on how they do. So far, it has increased the number of back links to the site, and in time (when Google stop messing with the UK rankings) it will help my customer's site.

Anyway, for anyone who wants to do a mass directory submission, it's well worth a look to see if it is right for your website.

Directory Maximizer, Automated Directory Submissions
Ebook About Directory Maximizer

Thursday 3 September 2009

Google Insights For Search

A while ago someone sent me a link to a new service from Google called Insights for Search. It's in beta at the moment; I dismissed it as an alternative to their keyword analysis tool and filed it in my bookmarks for future use. Whilst looking around today I found the bookmark and thought I would take a look and see what it can actually do.

First Impressions

On loading the screen it doesn't look all that interesting, so I began by running some keywords through the pages. I started with "computer support", since that relates to a current site I am trying to market to a wider audience, changed the filter to the UK and pressed search. Great: it gives me a graph of search hits, a map of which areas search most, and a forecast, but at first glance it's of no more use than the keyword analysis tool.

On closer inspection, though, it is quite different. It's more of a tool for marketing folk to compare different search terms. So for my IT company I stuck with "computer support" and added another search term, "IT support" (not the most exciting terms, but a good test). The results show the trends of the search patterns, and show that if I had to choose between the two terms, "IT support" would be a better option than "computer support" for marketing.

Further Information

Suppose I am starting a new campaign: we are a small company and there is no reason to market to the whole country. Say no one in my region (I'm using England for this section) searches for "computer support"; they could be looking for "IT support" or "PC help". You can filter the results down and take a closer look by clicking England and then the Town/City option. This changes the graph and the results to show the regional breakdown. Looking at the data returned, it would be more worthwhile to market "IT support" than the other search terms.

Link to News Stories

Leaving the IT support company for a moment, let's look at something which has been in the news quite a lot recently: mobile phones. Nokia always used to be the phone to have when I was a lad, and now we have the iPhone, so I wondered what the trends are like on these. I've added in HTC just to see what that one's like.

The graph shows a steady decline in searches for Nokia, while iPhone searches have increased to overtake it; HTC is meandering along the bottom of the graph. Looking over the results, there are large peaks where the iPhone traffic has increased. On the screen there is an option for news headlines; on checking this, you can see the traffic increase whenever the iPhone (and the other items, for that matter) has a news article, so it shows that if something is in the news, searches for it increase.

A better explanation of the service can be found on Google's pages for it. I don't know if it's of any use to me yet in SEO terms, but I will keep you updated.

Links

Google Explanation Of Insights Search Service
Google Insights For Search Service

Geotagging

I've been looking around at different methods of search engine optimization (SEO) to try and find a standard set of rules to follow to increase a site's ranking.

Now, there aren't any simple "do a, then b, then c and you will get on page 1" rules to SEO; it's more a case of "do these few things and one of them might help your ranking". But I have come across a method called geotagging which has interested me.

The idea is that you place information about the location of your company, yourself or your images in the meta data of your pages to aid location-specific searching. The theory is that one day you might be at home, looking for a PC repair company, and the geotag information will bring back local results which should be relevant to you.

After looking around at various places to find what to put in the geotags, I came across the ICBM method (via Wikipedia) for tagging locations. The basic look of the tags I use is:

<meta name="geo.position" content="<lat>,<long>" />
<meta name="geo.region" content="uk" />
<meta name="geo.placename" content="<address>" />
<meta name="ICBM" content="<lat>,<long>"/>

Next up, I needed to find the GPS co-ordinates for the business location. I found a site called satsig.net. It uses Google Maps to let you zoom in to your location, and it displays the latitude and longitude below the map (it can also tell you what angle to point your satellite dish at to get the best signal). So after zooming in, I got the GPS position and added it to the geotag meta data, which has now been added to the page.
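
As a worked example, if the business were in central London, the filled-in tags would read as follows (the co-ordinates and place name are illustrative, not a real client's):

<!-- illustrative central London co-ordinates, not a real client -->
<meta name="geo.position" content="51.5074,-0.1278" />
<meta name="geo.region" content="uk" />
<meta name="geo.placename" content="London, UK" />
<meta name="ICBM" content="51.5074,-0.1278" />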

There are different methods of adding geotag information to your pages and to items on your pages. With the increasing use of GPS on phones it is worth looking into, and the smarter the search engines get, the more this location information could be used.

Links

Find your GPS position
Information About Geotags