
Robots and SEO | Robots File | Robots Tips and Tricks | SEO: Google, Yahoo, MSN

Many Web sites don't offer a robots.txt file, although this file is very important for
SEO if you want your site to rank well on search engines.



Most SEO service providers offer robots.txt creation as one of their services, but after you read this article, you will know how to create a robots.txt file yourself!


What is robots.txt?

When a search engine comes to crawl your site, it looks for a special file called robots.txt
on your site. This file tells the search engine spider which Web pages of your site should be indexed and which should be ignored.
The robots.txt file is a simple text file with no HTML code and must be placed in your root directory, e.g.
http://www.yourwebsite.com/robots.txt
So next you need to know how to create a robots.txt file.
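
To see the file the way a crawler does, you can fetch and query it with Python's standard urllib.robotparser module. This is only a minimal sketch; www.yourwebsite.com is a placeholder domain and the paths checked are just examples.

from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("http://www.yourwebsite.com/robots.txt")  # the file must sit in the root directory
rp.read()                                            # download and parse robots.txt

# Ask whether a given spider may crawl a given path
print(rp.can_fetch("googlebot", "/cgi-bin/script.pl"))  # False if /cgi-bin/ is disallowed
print(rp.can_fetch("googlebot", "/index.html"))         # True if nothing blocks it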

How to create a robots.txt file?

The robots.txt file is a simple text file. Open a text editor to create it. The content consists of so-called “records”.

Each record contains the information for one specific search engine. It consists of two fields: the User-agent line and one or more Disallow lines. Here's an example:

User-agent: googlebot
Disallow: /cgi-bin/

This robots.txt file would allow "googlebot", the search engine spider of Google, to retrieve every page from your site except the files in the "cgi-bin" directory, which googlebot will ignore.

The Disallow command matches the beginning of the path, so it works like a wildcard. If you enter

User-agent: googlebot
Disallow: /text

both "/text-desk/index.html" and "/text/index.html", as well as all other files whose paths start with "/text", would be ignored by search engines.
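
You can confirm this prefix behaviour without touching a server by feeding the rules to urllib.robotparser in memory. A rough sketch, assuming the record from the example above:

from urllib.robotparser import RobotFileParser

rules = [
    "User-agent: googlebot",
    "Disallow: /text",
]

rp = RobotFileParser()
rp.parse(rules)  # parse the in-memory rule set

# Both paths start with "/text", so the same rule blocks them.
print(rp.can_fetch("googlebot", "/text/index.html"))       # False
print(rp.can_fetch("googlebot", "/text-desk/index.html"))  # False
print(rp.can_fetch("googlebot", "/index.html"))            # True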

If you leave the Disallow line blank, you're telling the search engine that all files on your website may be indexed. In any case, you must enter a Disallow line for every User-agent record.

If you want to give all search engine spiders (Google, Yahoo, and so on) the same rights, use
robots.txt content as follows:

User-agent: *
Disallow: /cgi-bin/

Where to find user agent names?

You can find user agent names in your log files by checking for requests to robots.txt. Generally, all search engine spiders should be given the same rights, so use "User-agent: *" as mentioned above.
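
If you want to pull those names out automatically, a few lines of Python are enough. This is a rough sketch that assumes a typical Apache/Nginx "combined" log format, where the user agent is the last quoted field; the file name access.log is only an example.

user_agents = set()

with open("access.log", encoding="utf-8", errors="replace") as log:
    for line in log:
        if "robots.txt" in line:            # keep only requests for the robots file
            parts = line.rsplit('"', 2)     # ... "user agent string" <end of line>
            if len(parts) == 3:
                user_agents.add(parts[1])   # the quoted user-agent field

for agent in sorted(user_agents):
    print(agent)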

Things that should be avoided

If your robots.txt file isn't formatted properly, some or all of your files might not get indexed by search engines. To avoid this, watch out for the following mistakes (the small checking script after the list illustrates them):

1. You should not use comments in the robots.txt file.
Although comments are allowed in a robots.txt file, they might confuse some search engine spiders:
"Disallow: support # Don't index the support directory" might be misinterpreted as "Disallow: support#Don't index the support directory".

2. White space is not allowed at the beginning of a line. For example, don't write

   User-agent: *
   Disallow: /text

but write it like this:

User-agent: *
Disallow: /text

3. Don't mix up the order of the commands. A robots.txt file like this will not work:
Disallow: /text
User-agent: *

but give it the right order:

User-agent: *
Disallow: /text

4. Don’t use more than one directory in one Disallow line:

User-agent: *
Disallow: /text /cgi-bin/ /images/

Search engine spiders cannot understand this format. The correct syntax should be as follows:
User-agent: *
Disallow: /text
Disallow: /cgi-bin/
Disallow: /images/

5. The file names on your server are case sensitive, so make sure they are in the right case. If the name of your directory is "text" on the server, don't write "Text" in the robots.txt file.

6. If you don't want search engine spiders to index any of the files in a specific directory, you don't have to list every file like this:

User-agent: *
Disallow: /text/orders.html
Disallow: /text/technical.html
Disallow: /text/textdesk.html
Disallow: /text/index.htm

You should replace this with

User-agent: *
Disallow: /text

7. Don't use an "Allow" command.
There's no "Allow" command in the original robots.txt standard. List only the files you don't want to be
indexed; all others will be indexed automatically if they are linked on your site.
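
If you want to catch several of these mistakes before a spider does, a small script can scan the file for you. This is only a minimal sketch of such a checker, assuming the file is called robots.txt and sits in the current directory; it flags leading white space, comments, a Disallow line before any User-agent line, and more than one path in a Disallow line, nothing else.

def check_robots(path="robots.txt"):
    problems = []
    seen_user_agent = False
    with open(path, encoding="utf-8") as f:
        for number, raw in enumerate(f, start=1):
            line = raw.rstrip("\n")
            if not line.strip():
                continue                               # skip blank lines
            if line != line.lstrip():
                problems.append(f"line {number}: leading white space")
            stripped = line.strip()
            if "#" in stripped:
                problems.append(f"line {number}: comment may confuse some spiders")
            if stripped.lower().startswith("user-agent:"):
                seen_user_agent = True
            elif stripped.lower().startswith("disallow:"):
                if not seen_user_agent:
                    problems.append(f"line {number}: Disallow before any User-agent line")
                value = stripped.split(":", 1)[1].strip()
                if " " in value:
                    problems.append(f"line {number}: more than one path in a Disallow line")
    return problems

for problem in check_robots():
    print(problem)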

Tips and tricks:

1. How to allow all search engine spiders to index all your files:

Use the following settings:

User-agent: *
Disallow:

The search engine spiders will then index all your files.

2. How to keep spiders from indexing any of your files:

User-agent: *
Disallow: /

Then the search engines will not index any of your web site files.

3. Where to find more complex examples.

View the robots.txt files of big Web sites to see more complex examples:

http://www.pal-stu.com/robots.txt
http://www.cnn.com/robots.txt
http://www.nytimes.com/robots.txt
http://www.spiegel.com/robots.txt
http://www.ebay.com/robots.txt
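
If you just want to study one of these files, you don't even need a browser; a couple of lines of standard-library Python will download and print it. A small sketch, using one of the sites above as an example:

from urllib.request import urlopen

# Download and print a live robots.txt file so you can study its records
with urlopen("http://www.cnn.com/robots.txt") as response:
    print(response.read().decode("utf-8", errors="replace"))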




If you want to have good rankings on search engines, your site should have a proper robots.txt file. Once search engines know what to do with your pages, they can give your site a good ranking.

SEO Tips and Tricks | SEO Secrets | SEO Best Practice

Top 10 SEO Tips and Tricks

Tip 1 of my Top 10 Tips and Tricks for building a search engine optimized (SEO) website from the ground up is thorough Keyword Research.
Keyword Research is the process of using keyword research tools to determine the value of a particular keyword or keyword phrase.
Keyword research is critical to the SEO process. Without keyword research all your efforts to optimize your site to rank high in the major search engines may be directed to the wrong terms and phrases. This will result in rankings for words that no one searches on.
The process of keyword research involves several steps.
The first thing you are going to want to do for keyword research is brainstorm the word or words potential customers might use to find the site you are planning to build. This list will include both individual words and 2 to 4 word phrases. You will need enough keywords and keyword phrases to cover all the services or product areas you are going to promote on the site.
If you are having trouble coming up with enough keywords or keyword phrases, ask past customers, your friends, or colleagues, or even look at your competitors. You can check the "meta keywords" used by your competitors. Most browsers will let you do this by right clicking on the page and choosing "view page source".
Meta Keywords
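
The same check can be done in code. Below is a rough sketch using only Python's standard library; the URL is a placeholder for whichever competitor's page you want to inspect.

from html.parser import HTMLParser
from urllib.request import urlopen

class MetaKeywordsParser(HTMLParser):
    """Collects the content of the page's meta keywords tag, if any."""
    def __init__(self):
        super().__init__()
        self.keywords = None

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if (attrs.get("name") or "").lower() == "keywords":
                self.keywords = attrs.get("content") or ""

with urlopen("http://www.example.com/") as response:  # placeholder URL
    html = response.read().decode("utf-8", errors="replace")

parser = MetaKeywordsParser()
parser.feed(html)
print(parser.keywords)   # None means the page has no meta keywords tag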
Now the real work begins: qualifying your keyword list.
By checking your proposed keywords with a keyword research tool, you can easily learn how many users are conducting searches for that particular term every day, how many of those searches actually converted, and other important analytical information. You may also discover that searchers are using words different from your keywords, or synonyms you weren't even aware of.
There are plenty of good tools out there to help you determine how valuable your keywords are to your site. Here are a few of my favorites.
SEO Keyword Research Tool: A free research tool on SEO Portland, powered by Aaron Wall. This is a good tool to begin with.
Wordtracker: Wordtracker is a subscription tool that lets you look up popular keyword phrases to determine their activity and popularity among competitors. Their top 1000 report lists the most frequently searched for terms, while their Competition Search option provides valuable information to determine the competitiveness of each phrase. They also have a free trial version, but be prepared to spend some time using it.
Overture Keyword Selector Tool: Overture's (now Yahoo!'s) Keyword Selector Tool shows you how many searches have been conducted over the last month for a particular phrase and lists alternative search terms you may have forgotten about. The only problem with Yahoo!/Overture is that singular and plural word forms are considered one phrase. For example, "widgets" and "widget" would appear as the same word, "widget".
Google AdWords Keyword Tool: Google's Pay-Per-Click keyword tool doesn't provide actual search numbers for keywords. Instead, it displays a graph-like colored bar, giving users an approximation.
Keyword Discovery tool: This is a fee-based tool that shows how many users search for a given word or phrase daily, identifies common spellings and misspellings, and determines the market share value for that search term.
Google Suggest: Google Suggest is a great way to find synonyms and related word suggestions that may be helpful to expand your original list.
Thesaurus.com: A good way to discover synonyms you may not have remembered.
NicheBot is the best keyword tool I have been able to find; it includes results from Wordtracker, Google, Keyword Discovery, and Yahoo!/Overture. This tool saves me a lot of time and effort. Their tagline is, "Search The Major Keyword Services All in One Place to Easily Locate the Highest Traffic/Lowest Competition Keywords Like a Laser-Guided Missile!". They even have a 2 week test drive for a buck. NicheBOT Tour Page for $1 Trial Offer.
Remember that you’re not only checking to see if enough people are searching for a particular word or phrase, you’re also trying to determine how many of your competitors are using that keyword or phrase on their sites.
The next edition will be Tip 2, Keyword Selection. I will go into detail about what to do with the list you have compiled and researched.
Later,
gary pool