Robots.txt Generator



The generator offers the following settings:

- Default - All Robots are: the default rule applied to every crawler
- Crawl-Delay: the number of seconds a crawler should wait between requests
- Sitemap: the URL of your XML sitemap (leave blank if you don't have one)
- Search Robots: per-crawler rules for Google, Google Image, Google Mobile, MSN Search, Yahoo, Yahoo MM, Yahoo Blogs, Ask/Teoma, GigaBlast, DMOZ Checker, Nutch, Alexa/Wayback, Baidu, Naver, and MSN PicSearch
- Restricted Directories: the paths to block; each path is relative to root and must contain a trailing slash "/"

When you are done, create a robots.txt file in your site's root directory, then copy the generated text and paste it into that file. A sketch of the kind of output these settings produce follows below.
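As a rough illustration (not the tool's exact output), a file generated with all robots allowed, a 10-second crawl delay, one restricted directory, and a sitemap could look like this; example.com and /cgi-bin/ are placeholders, and note that Bing and Yandex honor Crawl-delay while Google ignores it:

    # All crawlers, with one restricted directory
    User-agent: *
    Disallow: /cgi-bin/
    # Wait 10 seconds between requests (honored by Bing and Yandex, ignored by Google)
    Crawl-delay: 10

    # Full URL of the XML sitemap
    Sitemap: https://www.example.com/sitemap.xml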


About Robots.txt Generator

What is a Robots.txt Generator?

When search engines crawl a site, they first look for a robots.txt file at the root of the domain. If the file is found, they read its list of directives to learn which directories and files, if any, they are blocked from crawling. You can create this file with a robots.txt generator. When you use a robots.txt generator, Google and other search engines can determine which pages on your site should be excluded. In other words, a file created by a robots.txt generator is the opposite of a sitemap, which indicates which pages to include.
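For example, a minimal robots.txt that excludes a single directory for every crawler looks like the sketch below (the /private/ path is a placeholder):

    # Applies to every crawler
    User-agent: *
    # Keep this directory out of the crawl
    Disallow: /private/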

Using the Robots.txt Generator

With a robots.txt generator you can easily create a new robots.txt file or edit an existing one for your site. To upload an existing file and pre-populate the generator tool, type or paste your root domain URL in the text box above and click Upload. Use the tool to create directives that allow or disallow specific content on your website for a given user agent (Allow is the default; click to change it). Click Add Policy to append a new directive to the list. To edit an existing directive, click Delete Policy and then create a new one. An example of the Allow/Disallow syntax these directives produce follows below.
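The directives the tool emits follow standard robots.txt syntax. Here is a sketch with one Disallow rule and one Allow exception for all user agents (the paths are placeholders; Allow is a widely supported extension honored by Google and Bing rather than part of the original 1994 standard):

    User-agent: *
    # Block the admin area...
    Disallow: /admin/
    # ...but leave one page inside it crawlable
    Allow: /admin/help.html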

Create custom user-agent policies

In our robots.txt generator, Google and various other search engines can each be given their own rules. To specify alternative directives for a particular crawler, click the User Agent list box (showing * by default) and select the bot. When you click Add Policy, a custom section for that bot is added to the list, alongside the generic directives. To override a generic Disallow for a custom user agent, create a new Allow directive for that user agent and the same content; the matching Disallow directive for the custom user agent is then removed. An example of such an override follows below.
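For instance, a generic block paired with a Googlebot-specific override might look like the sketch below (the /downloads/ path is a placeholder). Crawlers obey the most specific user-agent group that matches them, so Googlebot follows only its own section here:

    # Every other crawler stays out of /downloads/
    User-agent: *
    Disallow: /downloads/

    # Googlebot matches this group only, so /downloads/ is crawlable for it
    User-agent: Googlebot
    Allow: /downloads/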

For more information on robots.txt directives, see the Ultimate Guide to Blocking Your Content in Search.

You can also add a link to your XML-based sitemap file. Type or paste the full URL of the XML sitemap into the XML Sitemap text box, then click Refresh to add the directive to the robots.txt list. An example of the resulting line follows below.
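The result is a single Sitemap directive containing the absolute URL (example.com is a placeholder). It can appear anywhere in the file and is independent of any user-agent group:

    # Sitemap directives require the full, absolute URL
    Sitemap: https://www.example.com/sitemap.xml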

When done, click Export to save your new robots.txt file. Use FTP to upload the file to your site's domain root. With the file generated by our robots.txt generator in place, Google and other search engines will know which pages or directories on your site should not appear in users' search results.