Another popular tool for bloggers and website owners is the Robots.txt Generator. The robots.txt file is one of the most important files for any website that wants to direct search engines to its content. It tells search engine bots which parts of your site to crawl and index and which to skip, preventing them from indexing duplicate, sensitive, or otherwise unwanted content.
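To illustrate, a minimal robots.txt might block a couple of directories while leaving the rest of the site open to all crawlers (the directory names below are placeholders, not paths your site necessarily has):

```
# Applies to every crawler
User-agent: *
# Block these example directories from being crawled
Disallow: /admin/
Disallow: /tmp/
# Everything else remains crawlable
Allow: /
```

The file lives at the root of your domain (e.g. yoursite.com/robots.txt), which is where compliant bots look for it before crawling.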
Our Robots.txt Generator tool gives you a range of options for generating the exact robots.txt file that you require. It has a list of the most popular search engine bots used around the world.
Choose whether to allow or disallow all robots by default (the "Default - All Robots" setting).
Set a Crawl-Delay, if needed, to limit how quickly bots request pages.
Enter your Sitemap URL (if you don't have one, leave it blank).
Select the specific search robots you want to set rules for.
List the restricted directories you want to keep out of search results.
Click Create Robots.txt, then save the generated file as robots.txt to get started.
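Depending on the options you pick in the steps above, the generated file might look something like the sketch below (the domain, directories, and bot names are illustrative placeholders):

```
# Default rule for all robots, with a crawl delay
User-agent: *
Crawl-delay: 10
# Restricted directories
Disallow: /cgi-bin/
Disallow: /private/

# A specific bot given full access
User-agent: Googlebot
Allow: /

# Sitemap location, if you provided one
Sitemap: https://www.example.com/sitemap.xml
```

Once saved, upload the file to your site's root directory so crawlers can find it at /robots.txt.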