Use the form below to generate a robots.txt file for your website. The form lets you either Allow spiders to index the content of your pages or Disallow (refuse) them.
Always remember that Allow: / should be the last rule in the file if you add it at all; some crawlers apply the first rule that matches, so a blanket Allow: / placed earlier can override the Disallow rules below it.
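For example, a file that blocks one directory but allows everything else would look like this, with Allow: / placed last (the /admin/ path is just an illustration):

    User-agent: *
    Disallow: /admin/
    Allow: /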
This is also where you reference your sitemap.xml file and block pages that you do not want search engines to index. Our tool will do all the work for you; it's pretty straightforward. A robots.txt file helps search engines find your sitemap. Keep in mind that blocking a page is only a suggestion: the search engine may still crawl the data to some extent and possibly serve it in some low-ranking search results. If you really want content blocked, password-protect your directories.
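A minimal sketch of the kind of file the generator produces, assuming your sitemap sits at the root of your domain and /private/ is a hypothetical directory you want kept out of the index:

    User-agent: *
    Disallow: /private/
    Allow: /

    Sitemap: https://www.example.com/sitemap.xml

The Sitemap line can appear anywhere in the file and takes a full URL, which is how search engines discover your sitemap through robots.txt.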