Robots.txt Generator

Use the form below to generate a robots.txt file for your website. The form lets you allow spiders to index your pages or disallow (refuse) them from indexing specific content.

Always remember that Allow: / should be the last directive in the file, if you add it at all.
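For example, a minimal file that disallows one directory and ends with a catch-all allow might look like this (the directory name is a placeholder, not tool output):

```
User-agent: *
Disallow: /cgi-bin/
Allow: /
```

Note that the Allow: / line comes last, after every Disallow rule.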

Default - All Robots are:

Crawl-Delay:

Sitemap: (leave blank if you don't have one)
Search Robots:
  Google
  Google Image
  Google Mobile
  MSN Search
  Yahoo
  Yahoo MM
  Yahoo Blogs
  Ask/Teoma
  GigaBlast
  DMOZ Checker
  Nutch
  Alexa/Wayback
  Baidu
  Naver
  MSN PicSearch
   
Restricted Directories: The path is relative to the root and must end with a trailing slash "/".
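Each restricted directory becomes one Disallow line. Assuming two hypothetical restricted folders named /admin/ and /tmp/, the generated block would read:

```
User-agent: *
Disallow: /admin/
Disallow: /tmp/
```

The trailing slash matters: "Disallow: /admin/" blocks the directory's contents, while "Disallow: /admin" would also match any path that merely starts with "/admin".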



Now, create a 'robots.txt' file in your site's root directory. Copy the text generated above and paste it into that file.


Robots.txt Content Generator Next Steps:

This is where you add your sitemap.xml file and block pages that you do not want indexed by search engines. Our tool does all the work for you; it's pretty straightforward. A robots.txt file also helps search engines find your sitemap. Keep in mind that blocking pages is only a suggestion: a search engine may still crawl the data to some extent and possibly serve it in some search results. If you really want content blocked, password-protect your directories.
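Once you have generated the file, you can check which URLs it actually blocks with Python's standard-library robots.txt parser. The rules and URLs below are hypothetical examples, not output of this generator:

```python
# Sketch: verify a robots.txt file's rules with Python's stdlib parser.
from urllib.robotparser import RobotFileParser

# Hypothetical generated robots.txt content.
rules = """\
User-agent: *
Disallow: /private/
Sitemap: https://www.example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# The disallowed directory is reported as not fetchable.
print(parser.can_fetch("*", "https://www.example.com/private/page.html"))
# Everything else remains crawlable.
print(parser.can_fetch("*", "https://www.example.com/index.html"))
```

Remember that this only tells you what well-behaved crawlers will honor; it is not access control.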



