Tuesday, 29 December 2009

SEO Basic Steps - Part 3

Now that Christmas is out of the way, I can crack on and get this series of articles finished. In this section I am going to cover generating the required files for the search engines, and the use of trackers and how to install them.

Required Files

There are two main files I generate for each site: a robots.txt file and a sitemap file. If you are using a blogging platform, these files are often created for you, which saves time. If you are optimizing your own content, it is worth creating the files yourself. There are simple ways to do this for each file.
  • robots.txt - This file tells the spiders where they can and can't go. If you have a directory full of confirmation pages which you do not want the spiders to index, you can block it using this file. It is a simple text file made up of User-agent, Allow: and Disallow: lines. A good generator is listed at the bottom of this page.
  • Sitemap - This is a list of the pages on your site in XML format. It makes it easy for the spiders to find and index your pages. Remember to update it whenever you add pages to your site.
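
As an example, a minimal robots.txt might look like the one below. The /confirmation/ directory and example.com domain are placeholders for your own site:

```
# Applies to all spiders
User-agent: *
# Keep the confirmation pages out of the index
Disallow: /confirmation/
# Tell spiders where the sitemap lives
Sitemap: http://www.example.com/sitemap.xml
```

Each User-agent block applies to the spiders it names; the * wildcard covers them all.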

Both of these files sit in the root of your website. On most sites you can view the sitemap by adding /sitemap.xml to the end of the domain.
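
A sketch of a small sitemap file, again using placeholder example.com URLs, looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page on the site -->
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2009-12-29</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/about.html</loc>
    <lastmod>2009-12-01</lastmod>
  </url>
</urlset>
```

Only the loc tag is required for each entry; lastmod, changefreq and priority are optional hints for the spiders.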

Trackers

Since you will want to know what traffic your site is receiving, you need a tracker. Google's Analytics service is an excellent resource for finding out who is visiting your site and where they come from.

The site asks you to place a tracking code just above the closing body tag of your website. This calls some JavaScript that posts visitor information back to Google so the reports can be generated. It will not increase your rankings, but it will let you see whether the changes you make are performing as you expect.
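
Analytics gives you the exact snippet to copy when you create a profile; at the time of writing the classic ga.js version looks roughly like the sketch below, where UA-XXXXXXX-1 is a placeholder for your own account ID:

```html
<!-- Paste just above </body>. UA-XXXXXXX-1 is a placeholder account ID. -->
<script type="text/javascript">
  // Load ga.js over http or https to match the current page
  var gaJsHost = (("https:" == document.location.protocol) ? "https://ssl." : "http://www.");
  document.write(unescape("%3Cscript src='" + gaJsHost + "google-analytics.com/ga.js' type='text/javascript'%3E%3C/script%3E"));
</script>
<script type="text/javascript">
  try {
    // Record a pageview against your Analytics account
    var pageTracker = _gat._getTracker("UA-XXXXXXX-1");
    pageTracker._trackPageview();
  } catch (err) {}
</script>
```

Always copy the snippet Analytics generates for you rather than typing it by hand, as Google updates the code from time to time.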

That's it for another article. I will add part 4 soon, which should round off the whole process.

