Robots.txt Generator



The generator exposes the following fields:

  • Default - All Robots: allow or refuse all crawlers by default

  • Crawl-Delay: optional delay between successive crawler requests

  • Sitemap: the URL of your XML sitemap (leave blank if you don't have one)

  • Search Robots: Google, Google Image, Google Mobile, MSN Search, Yahoo, Yahoo MM, Yahoo Blogs, Ask/Teoma, GigaBlast, DMOZ Checker, Nutch, Alexa/Wayback, Baidu, Naver, MSN PicSearch

  • Restricted Directories: each path is relative to the root and must contain a trailing slash "/"



Now create a file named 'robots.txt' in your site's root directory, then copy the generated text above and paste it into that file.
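Once the file is in place, you can sanity-check its rules before deploying them. The snippet below is a minimal sketch using Python's standard-library urllib.robotparser; the /admin/ path is a hypothetical restricted directory, not something the tool produces by default.

```python
from urllib.robotparser import RobotFileParser

# A sample robots.txt body such as the generator might produce;
# the /admin/ path is a hypothetical restricted directory.
rules = """\
User-agent: *
Disallow: /admin/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Pages outside the restricted directory are crawlable...
print(parser.can_fetch("*", "https://example.com/index.html"))   # True
# ...while anything under /admin/ is blocked.
print(parser.can_fetch("*", "https://example.com/admin/login"))  # False
```

Running this before upload catches rules that block more (or less) than you intended.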


About Robots.txt Generator

Another popular tool for bloggers and website owners is the Robots.txt Generator. The robots.txt file is one of the most important files for any website that wants to direct search engines to its content. It tells search engine bots which pages to crawl and index and which to skip, which prevents search engines from indexing duplicate, sensitive, or otherwise unwanted content.
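As an illustration, a generated file might look like the following; the sitemap URL and disallowed directories are placeholder examples, not values the tool supplies:

```
User-agent: *
Crawl-delay: 10
Disallow: /cgi-bin/
Disallow: /private/

Sitemap: https://example.com/sitemap.xml
```

Each Disallow line blocks one directory for the named user agent, and the Sitemap line points crawlers at your XML sitemap.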

Our Robots.txt Generator tool gives you a range of options for generating the exact robots.txt file that you require. It has a list of the most popular search engine bots used around the world.

How does the Robots.txt Generator work?

To generate a robots.txt file with the Robots.txt Generator, follow these steps:

  • Choose whether to allow or refuse all robots by default (Default - All Robots).

  • Set the Crawl-Delay option if you want one.

  • Enter your Sitemap URL (if you don't have one, leave it blank).

  • Select the search robots you want to give specific rules to.

  • List the Restricted Directories you want to block.

  • Click Create Robots.txt, or Create and Save as Robots.txt, to generate the file.
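The steps above amount to assembling a few directives into one text file. A minimal sketch of that assembly in Python follows; the function name and field values are illustrative assumptions, not the tool's actual code:

```python
def build_robots_txt(default_allow, crawl_delay=None, sitemap=None,
                     restricted_dirs=()):
    """Assemble a robots.txt body from the generator's options.

    Illustrative sketch mirroring the form fields above,
    not the tool's actual implementation.
    """
    lines = ["User-agent: *"]
    if not default_allow:
        lines.append("Disallow: /")          # refuse all robots by default
    if crawl_delay is not None:
        lines.append(f"Crawl-delay: {crawl_delay}")
    for path in restricted_dirs:             # each path must end with "/"
        lines.append(f"Disallow: {path}")
    if sitemap:
        lines.append(f"Sitemap: {sitemap}")
    return "\n".join(lines) + "\n"

# Example: allow all robots, 10-second delay, one restricted directory.
print(build_robots_txt(True, crawl_delay=10,
                       sitemap="https://example.com/sitemap.xml",
                       restricted_dirs=["/private/"]))
```

Saving the returned string as robots.txt in your site's root reproduces what the tool generates for the same choices.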