Robots.txt Generator


The generator offers the following options:

  1. Default policy for all robots (all robots are allowed by default).
  2. Crawl-delay.
  3. Sitemap (leave blank if you don't have one).
  4. Search robots: Google, Google Image, Google Mobile, MSN Search, Yahoo, Yahoo MM, Yahoo Blogs, Ask/Teoma, GigaBlast, DMOZ Checker, Nutch, Alexa/Wayback, Baidu, Naver, MSN PicSearch.
  5. Restricted directories: each path is relative to root and must contain a trailing slash "/".
When you are done, create a 'robots.txt' file in your site's root directory, then copy the generated text and paste it into that file.


About Robots.txt Generator

The Advanced Robots Txt Generator is a free online SEO tool that enables you to quickly and easily create a robots.txt file for your site. The robots.txt file is a simple yet vital text file that tells search engine robots which parts of your website they are allowed to crawl and index. The file is placed in your root directory, e.g. http://yourdomain.com/robots.txt.
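For example, a minimal robots.txt that allows all robots everywhere except two directories might look like this (the paths and sitemap URL below are placeholders, not values the tool produces):

```
User-agent: *
Disallow: /cgi-bin/
Disallow: /login/
Crawl-delay: 10

Sitemap: http://yourdomain.com/sitemap.xml
```

Here `User-agent: *` applies the rules to every robot, each `Disallow` line blocks one path prefix, and the `Sitemap` line points crawlers at your sitemap.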

Why Is a Robots.txt File Important?

Here are some reasons why you would want to generate a robots.txt file for your site:

  1. Stop search engine crawlers from visiting areas that hold no useful information for them. Areas you may wish to block include images, the CGI-bin, login pages, and sections still in development that are not yet ready for public viewing.
  2. You can use the robots.txt file to avoid duplicate content issues which would otherwise put you in ill-favour with some search engines.
  3. A working, validated robots.txt file pairs well with Google Webmaster Tools (now Google Search Console), which can read and test the file. These webmaster tools are valuable since they provide insight into what Google thinks of your website.
  4. Preventing crawlers from visiting areas of your site that are not useful to them also helps minimize your bandwidth consumption.
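You can verify how crawlers will interpret your rules before uploading the file. A minimal sketch using Python's standard `urllib.robotparser` module (the rules and URLs below are illustrative, not output from the tool):

```python
from urllib.robotparser import RobotFileParser

# Parse an in-memory robots.txt and check whether a bot may fetch a URL.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /cgi-bin/",
    "Disallow: /login/",
])

# /cgi-bin/ is disallowed for all robots, so this is blocked.
print(rp.can_fetch("Googlebot", "http://yourdomain.com/cgi-bin/script"))  # False
# /blog/ matches no Disallow rule, so this is allowed.
print(rp.can_fetch("Googlebot", "http://yourdomain.com/blog/post"))       # True
```

In practice you would call `rp.set_url("http://yourdomain.com/robots.txt")` followed by `rp.read()` to test the live file instead of an in-memory copy.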

How to Use the Free Robots.txt File Generator

  1. Select which robots to allow or block. You can choose to allow/block all search robots or only allow/block some specific ones. By default, all search robots are allowed.
  2. Set your preferred crawl delay. This tells bots how long to wait between successive requests to your site and can help prevent an overload of your servers, especially if your content changes constantly.
  3. Add the site directories to be restricted. This is where you set the areas of your site that you do not wish to make public.
  4. Save and upload your robots.txt file. After generating the file, copy and paste the output into a blank text file, name it "robots.txt", and upload it to your site's root directory.
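The steps above amount to assembling a handful of directives into one text file. A minimal sketch of that assembly in Python; the function name and parameters here are illustrative, not the tool's actual interface:

```python
def build_robots_txt(user_agent="*", crawl_delay=None, sitemap=None, disallow=()):
    """Assemble a robots.txt body from the generator's options."""
    lines = [f"User-agent: {user_agent}"]
    if crawl_delay is not None:
        lines.append(f"Crawl-delay: {crawl_delay}")
    for path in disallow:          # each restricted directory becomes a Disallow line
        lines.append(f"Disallow: {path}")
    if sitemap:                    # the Sitemap line is optional
        lines.append(f"Sitemap: {sitemap}")
    return "\n".join(lines) + "\n"

print(build_robots_txt(crawl_delay=10,
                       disallow=["/cgi-bin/", "/login/"],
                       sitemap="http://yourdomain.com/sitemap.xml"))
```

Writing the returned string to a file named "robots.txt" in the site's root directory completes step 4.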

