Robots.txt Generator
Generate robots.txt files to control search engine crawling
Crawl Delay: time in seconds between bot requests. Leave empty for no delay.
Rule Sets
Generated robots.txt
User-agent: *
Disallow: /admin/
Disallow: /private/
Allow: /
Common User Agents
* - All bots
Googlebot - Google
Bingbot - Microsoft Bing
Slurp - Yahoo
DuckDuckBot - DuckDuckGo
facebookexternalhit - Facebook
Twitterbot - Twitter
Common Paths to Disallow
/admin/
/private/
/api/
/wp-admin/
/login/
/cart/
/checkout/
/*.pdf$
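Combined into a single rule set, these paths might look like the sketch below. Note that `/*.pdf$` relies on the `*` and `$` wildcard characters (`$` anchors the end of the URL), which are supported by major crawlers such as Googlebot and are standardized in RFC 9309:

```
User-agent: *
Disallow: /admin/
Disallow: /private/
Disallow: /api/
Disallow: /*.pdf$
```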
How to Use
- Add rule sets for different user agents (bots)
- Specify paths to allow or disallow crawling
- Add your sitemap URL for better indexing
- Set crawl delay if needed to reduce server load
- Download or copy the generated robots.txt file
- Upload the file to your website's root directory (crawlers only look for it at `/robots.txt`)
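The steps above can be sketched as a small script that assembles rule sets into a robots.txt string. The function name and the `(user_agent, disallows, allows)` tuple layout are illustrative assumptions, not this tool's actual code:

```python
# Hypothetical sketch: build robots.txt content from rule sets.
def build_robots_txt(rule_sets, sitemap_url=None, crawl_delay=None):
    """rule_sets: list of (user_agent, disallow_paths, allow_paths) tuples."""
    lines = []
    for agent, disallows, allows in rule_sets:
        lines.append(f"User-agent: {agent}")
        for path in disallows:
            lines.append(f"Disallow: {path}")
        for path in allows:
            lines.append(f"Allow: {path}")
        if crawl_delay is not None:
            lines.append(f"Crawl-delay: {crawl_delay}")
        lines.append("")  # blank line separates rule sets
    if sitemap_url:
        lines.append(f"Sitemap: {sitemap_url}")
    return "\n".join(lines)

# Example: one rule set for all bots, plus a sitemap.
content = build_robots_txt(
    [("*", ["/admin/", "/private/"], ["/"])],
    sitemap_url="https://example.com/sitemap.xml",
)
print(content)
```

Writing the result to a file named `robots.txt` and placing it at the site root completes the workflow.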