Robots.txt Generator


[Generator form: choose the default rule for all robots (allowed by default); set an optional Crawl-Delay; paste your sitemap URL (leave blank if you don't have one); allow or refuse each crawler individually — Google, Google Image, Google Mobile, MSN Search, Yahoo, Yahoo MM, Yahoo Blogs, Ask/Teoma, GigaBlast, DMOZ Checker, Nutch, Alexa/Wayback, Baidu, Naver, MSN PicSearch — and list any restricted directories, each path relative to the root and ending with a trailing slash "/".]

Once generated, create a robots.txt file in your site's root directory and paste the generated text into it.


About Robots.txt Generator

Welcome to the SEOtoolsolution Robots.txt Generator Tool, a tool that is very useful for webmasters who want to make their websites Googlebot-friendly.

The SEOtoolsolution Robots.txt Generator Tool makes webmasters' lives easier by reducing a hard, complicated task to a few clicks: it generates a Googlebot-friendly robots.txt file for your website.
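At its simplest, the generated file is only two lines. A minimal sketch of the default output — all robots allowed, no crawl-delay, no sitemap — would look like this (the exact text depends on the options you select in the tool):

    User-agent: *
    Disallow:

An empty Disallow line means nothing is blocked, so every crawler may fetch every path on the site.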

The Robots.txt Generator has a user-friendly interface, and you can choose which rules to include in the robots.txt file and which to leave out. Here's how to use this SEO tool:

  • Choose whether all robots or only certain robots should have access to your site's files. All robots are allowed by default.
  • Choose how long robots should wait between crawls. You can select a delay from 5 seconds to 120 seconds; by default, it is set to 'no delay'.
  • If your site has a sitemap, paste its URL here. Otherwise, leave the field blank and proceed.
  • Select which bots may crawl your site, and deselect any bot that you don't want crawling your site's files.
  • The last step is to restrict directories. Each path is relative to the root and must end with a trailing slash "/" (a sample of the resulting file follows this list).
  • After you have generated a Googlebot-friendly robots.txt file with our Robots.txt Generator Tool, upload it to the root directory of your site.
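As an illustration, suppose you picked a 10-second crawl-delay, restricted the /cgi-bin/ and /tmp/ directories, and pasted a sitemap URL; the sitemap address and directory names here are hypothetical examples, not defaults of the tool. The generated file would look something like:

    User-agent: *
    Disallow: /cgi-bin/
    Disallow: /tmp/
    Crawl-delay: 10

    Sitemap: https://www.example.com/sitemap.xml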

Using the SEOtoolsolution Robots.txt Generator, webmasters can instruct robots which files in the website's root directory should be crawled. You can even pick which particular robots should have access to your site's directories and block the others from the same content, or grant one robot access to certain files while restricting another robot to different ones.
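For instance, to let Googlebot crawl the whole site while keeping Baidu's crawler out of an images directory, the file could contain rules like the following; Googlebot and Baiduspider are the standard user-agent tokens for these crawlers, and the /images/ path is a made-up example:

    User-agent: Googlebot
    Disallow:

    User-agent: Baiduspider
    Disallow: /images/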