Wednesday, January 16, 2008

Hassle Free Ranking



Intelligent ways of improving page rankings with keywords, Internet marketing, MLM leads, mortgage leads, work from home, working from home, leads, Leadsomatic and Veretekk. Used by Shaun McClelland in building his Internet business.

Page Ranking with Massive Linking Sites

Using Robots.txt to your advantage

Working outside of Veretekk and Leadsomatic, you would have to go through all the hassles of the info below. Shaun McClelland's advice is: join Veretekk as a Gold member, pump out your website daily using Leadsomatic to generate leads, and pump up your rankings with Google safely and securely.

However, because search engine spiders crawl through a site indexing every page they can find, a spider might come across your search-engine-specific optimised pages, and because they are very similar, it may think you are spamming it and will do one of two things: ban your site altogether or punish you severely in the form of lower rankings.

The solution in this case is to stop specific search engine spiders from indexing some of your web pages. This is done using a robots.txt file, which resides on your web space. A robots.txt file is a vital part of any webmaster's battle against getting banned or punished by the search engines if he or she designs different pages for different search engines.

The robots.txt file is just a simple text file, as the file extension suggests. It's created using a simple text editor like Notepad or WordPad; complicated word processors such as Microsoft Word will only corrupt the file.

The User-Agent is the name of the search engine's spider, and Disallow is the name of the file that you don't want that spider to index. You have to start a new batch of code for each engine, but if you want to list multiple Disallow files you can list them one under another. For example:

User-Agent: Slurp  # Slurp is Inktomi's spider
Disallow: /xyz-gg.html
Disallow: /xyz-al.html
Disallow: /xxyyzz-gg.html
Disallow: /xxyyzz-al.html

The above code disallows Inktomi from spidering two pages optimised for Google (gg) and two pages optimised for AltaVista (al). If Inktomi were allowed to spider these pages as well as the pages specifically made for Inktomi, you would run the risk of being banned or penalised. Hence, it's always a good idea to use a robots.txt file. The robots.txt file resides on your web space, but where on your web space? The root directory! If you upload the file to a sub-directory it will not work. If you want to disallow all engines from indexing a file, you simply use the * character where the engine's name would usually be. However, beware that the * character won't work on the Disallow line.
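As a minimal sketch of that wildcard rule (the file name is just a placeholder), a robots.txt in your root directory that keeps every spider away from one page would look like this:

User-Agent: *
Disallow: /xyz-gg.html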

Here are the names of a few of the big engines:
Excite - ArchitextSpider
AltaVista - Scooter
Lycos - Lycos_Spider_(T-Rex)
Google - Googlebot
Alltheweb - FAST-WebCrawler
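Because you start a new batch of code for each engine, a robots.txt that covers several spiders simply stacks the batches, one blank line apart. A simplified sketch using the same placeholder file names as above: Googlebot is kept away from the AltaVista pages, and Scooter (AltaVista's spider) is kept away from the Google pages.

User-Agent: Googlebot
Disallow: /xyz-al.html
Disallow: /xxyyzz-al.html

User-Agent: Scooter
Disallow: /xyz-gg.html
Disallow: /xxyyzz-gg.html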

Be sure to check over the file before uploading it. A simple mistake could mean your pages are indexed by engines you don't want indexing them, or even worse, none of your pages might be indexed at all.
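One mistake worth double-checking is the difference between an empty Disallow line and a single forward slash. These are two separate example files, not one. This record allows every engine to index every page:

User-Agent: *
Disallow:

while this one blocks every engine from the entire site:

User-Agent: *
Disallow: /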

Shaun McClelland, SEO and LEADS Specialist. The next post looks at more ways to use robots.txt.
