robots.txt Generator
Generate the robots.txt file for your website
robots.txt, also known as the robots exclusion standard or the robots exclusion protocol, is a convention websites use to communicate with web crawlers and other web robots. The file tells a robot which areas of the website should not be processed or scanned.
robots.txt is advisory, not a security mechanism: well-behaved crawlers honor it, but it cannot be used to "hide" data, and anything it lists remains publicly accessible.
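For example, a minimal robots.txt placed at the root of a site (e.g. https://example.com/robots.txt) might look like the sketch below; the blocked path is illustrative:

    # Apply the rules that follow to every crawler
    User-agent: *
    # Ask crawlers not to scan anything under /private/
    Disallow: /private/

A crawler that honors the protocol fetches this file before crawling the site and skips the listed paths.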
- In the top-left section, select each crawler you want to address in robots.txt, or select "All" to target every crawler.
- Select your crawl delay. Keep in mind that the longer the delay, the fewer pages crawlers can fetch in the time allotted to your site, so less of your data may be found and indexed.
- In the Paths/Directories section, type each directory you want to restrict access to. Each path must start and end with a slash, as in '/cgi-bin/'. A sketch of the generated output follows this list.
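As a sketch of what the generator produces, selecting a single crawler, a crawl delay, and two restricted directories yields a file like the following (the user-agent name, delay value, and paths are illustrative):

    # Rules for Bing's crawler only
    User-agent: Bingbot
    # Ask the crawler to wait 10 seconds between requests
    Crawl-delay: 10
    # Restricted directories; each path starts and ends with a slash
    Disallow: /cgi-bin/
    Disallow: /tmp/

Note that Crawl-delay is a non-standard extension: some crawlers (such as Bingbot) honor it, while others ignore it entirely.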
If you find this tool useful, please consider a donation via the link below to keep development of free tools going.