The robots.txt file is a plain text file, created by the site owner, that tells spiders and web crawlers which areas of the site they may visit and which areas they may not.
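As a minimal sketch of what such a file looks like (the directory names here are hypothetical, chosen only for illustration), a robots.txt might contain:

```
User-agent: *
Disallow: /private/
Disallow: /tmp/
```

The `User-agent: *` line applies the rules to all crawlers, and each `Disallow` line names a path the crawler is asked not to visit.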
A popular page with many details about the robots.txt file is here: http://www.robotstxt.org
You can also use a Robots META tag within your HTML pages to instruct some spiders or web crawlers whether to index the page.
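For example, placing this tag in the `<head>` of a page asks crawlers that honor it not to index the page or follow its links:

```html
<meta name="robots" content="noindex, nofollow">
```

To allow both, you would use `content="index, follow"`, which is also the default behavior when no tag is present.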
A handy utility we found on the web is a site that will generate the robots.txt file for you: http://www.1-hit.com/all-in-one/tool-robots.txt-generator.htm
It provides a step-by-step guide that walks you through creating a robots.txt file for your site. Once you have created the file, simply upload it to the /public_html folder (the root of your site) and search engines will find it.
Note that not all search engines and crawlers honor the robots.txt file.