Robots.txt Generator
Runs in browser
Generate robots.txt files visually with a rule builder. Control which search engines can crawl your site, block AI scrapers, set crawl delays, and add sitemap URLs.
robots.txt preview
User-agent: *
Allow: /
Sitemap: https://example.com/sitemap.xml
How to use
Choose a preset
Start with Allow all, Block all, or Block AI scrapers for common configurations.
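The exact rule set behind the Block AI scrapers preset is not shown here, but a typical file targeting well-known AI crawler tokens looks like the sketch below. GPTBot (OpenAI), CCBot (Common Crawl), and Google-Extended (Google AI training) are real crawler tokens; this particular selection is an assumption, not the preset's definitive list.

```text
# Block common AI training crawlers (illustrative selection)
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Google-Extended
Disallow: /

# Everything else may crawl normally
User-agent: *
Allow: /
```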
Add rules
Add Allow and Disallow path rules for each User-agent group. Use * for all bots.
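As a sketch of how rule groups combine: each User-agent line starts a group, and the Allow/Disallow lines beneath it apply only to that group. ExampleBot below is a hypothetical crawler name used for illustration.

```text
# Applies to every crawler not matched by a more specific group
User-agent: *
Disallow: /tmp/
Allow: /

# Applies only to this (hypothetical) crawler
User-agent: ExampleBot
Disallow: /
```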
Download
Copy or download robots.txt, then upload it to your website root.
Examples
Block admin paths
Allow crawling but block /admin/ and /private/.
Output:
User-agent: *
Disallow: /admin/
Disallow: /private/
Sitemap: https://example.com/sitemap.xml
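One way to sanity-check a generated file before uploading it is Python's standard-library robots.txt parser. A minimal sketch, assuming the example output above is pasted in as a string:

```python
from urllib.robotparser import RobotFileParser

# The generated robots.txt, pasted in as a string.
ROBOTS_TXT = """\
User-agent: *
Disallow: /admin/
Disallow: /private/
Sitemap: https://example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Disallowed prefixes are refused; unlisted paths are allowed by default.
print(parser.can_fetch("*", "/admin/settings"))    # → False
print(parser.can_fetch("*", "/blog/hello-world"))  # → True
```

The same check works against a live site by calling `set_url(...)` and `read()` instead of `parse(...)`, which fetches the file over HTTP.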
Frequently asked questions
- Where do I put robots.txt?
- robots.txt must be at the root of your domain: https://yourdomain.com/robots.txt
- Does blocking AI scrapers in robots.txt actually work?
- Reputable crawlers like GPTBot honour robots.txt. However, low-quality scrapers may ignore it. robots.txt is an advisory protocol, not a technical barrier.