Robots.txt Creator


Default - All Robots are: (Allowed / Refused)

Crawl-Delay:

Sitemap: (leave blank if you don't have one)

Search Robots: Google
  Google Image
  Google Mobile
  MSN Search
  Yahoo
  Yahoo MM
  Yahoo Blogs
  Ask/Teoma
  GigaBlast
  DMOZ Checker
  Nutch
  Alexa/Wayback
  Baidu
  Naver
  MSN PicSearch
Restricted Directories: The path is relative to the root and must contain a trailing slash "/"

Now, create a 'robots.txt' file in your site's root directory, then copy the generated text above and paste it into that file.


About Robots.txt Creator

What is Robots.txt?

robots.txt is a plain text file that contains a special set of rules, read by every search engine spider that visits your website. These rules state which directories a spider is allowed to scan and index and which it is not, and the same applies to individual files and web pages that you do not want to appear publicly in search engines. The file therefore plays a key role in protecting your website: for example, you can list the admin panel address and other sensitive directories that you do not want showing up in search results.
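In its simplest form, a robots.txt file names a crawler (or all crawlers) and the paths it should stay away from. A minimal sketch, using hypothetical directory names:

    # The rules below apply to every crawler
    User-agent: *
    # Keep these example directories out of search results
    Disallow: /admin/
    Disallow: /private/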

What is Useotools Robots.txt Creator?

So the next question is how to write the rules in this robots.txt file. It is not that easy for beginners, and writing the file by hand takes time. Useotools therefore presents a free tool, Robots.txt Maker, with which you can create a robots.txt file in no time, in a few clicks. The tool has several options, which are explained below.

  • Default - All Robots are: This option has two choices, "Allowed" and "Refused". If you want every search engine robot to be able to visit and scan your website, set it to "Allowed". The internet is not that friendly, though: there are some bad bots around, so if you want to blacklist certain robots/spiders, set it to "Refused".
  • Crawl-Delay: Crawl-Delay is a very important rule. It asks a spider to pause for a specific amount of time between requests. If you have a big site with a large sitemap, you don't want the server overloaded while a spider crawls many pages in quick succession, so setting a Crawl-Delay is a good idea: spiders then crawl your website slowly and do not add load to the server.
  • Sitemap: The sitemap is also a very important rule. If your website is large, you should keep a sitemap so that search engine spiders know what to crawl; it is just like a city map for a new visitor :). Put your sitemap URL here if you have one.
  • Search Robots: Here you have a list of search engine robots/spiders that you can individually allow or refuse.
  • Restricted Directories: Here you can define the names and paths of restricted directories that you want to keep search engines from scanning and looking inside. The path is relative to the root and must contain a trailing slash "/". A sample of the file these options generate is shown after this list.
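For illustration, with the default set to "Allowed", a crawl delay, a sitemap URL, one refused robot, and one restricted directory, the generated file would look roughly like this (the domain, paths, and delay value here are placeholders, not output copied from the tool):

    # Refuse one specific robot entirely (Baidu's spider in this example)
    User-agent: Baiduspider
    Disallow: /

    # All other robots are allowed, but asked to slow down
    # and to stay out of the restricted directory
    User-agent: *
    Crawl-delay: 10
    Disallow: /admin/

    Sitemap: https://www.example.com/sitemap.xml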

