Robots.txt Generator
Create and customize your robots.txt file to control search engine crawlers
Default Settings
Allow all robots to crawl all content
Disallow all robots from crawling all content
Custom rules (add below)
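The two preset options above map to short robots.txt bodies. As a sketch, an empty Disallow value permits everything, while `Disallow: /` blocks everything:

```txt
# Allow all robots to crawl all content
User-agent: *
Disallow:

# Disallow all robots from crawling all content
User-agent: *
Disallow: /
```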
Sitemap
Add Sitemap
Crawl-delay
seconds
Optional: set the delay between successive crawler requests (1-30 seconds). Note that some major crawlers, including Googlebot, ignore the Crawl-delay directive.
Custom Rules
Add Rule
Generate Robots.txt
Generated Robots.txt
Copy to Clipboard
Download robots.txt
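A generated file combining the options above might look like the following. The paths and sitemap URL are hypothetical placeholders:

```txt
User-agent: *
Disallow: /private/
Disallow: /tmp/
Crawl-delay: 10

Sitemap: https://example.com/sitemap.xml
```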
Robots.txt Tips:
Place the robots.txt file in your website's root directory
Use specific paths in Disallow rules to keep crawlers away from private content (robots.txt is advisory only and does not secure that content)
Include your XML sitemap URL to help search engines find your content
Test your robots.txt using the robots.txt report in Google Search Console
Be careful with wildcards (*) as they can unintentionally block content
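Before deploying, you can sanity-check your rules locally. A minimal sketch using Python's standard-library `urllib.robotparser`, with a hypothetical domain and paths:

```python
import urllib.robotparser

# A sample generated robots.txt, parsed from its lines (no network needed).
rules = """\
User-agent: *
Disallow: /private/
Crawl-delay: 10

Sitemap: https://example.com/sitemap.xml
""".splitlines()

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules)

# Check which URLs the rules block or allow for any user agent.
print(rp.can_fetch("*", "https://example.com/private/page.html"))  # blocked: False
print(rp.can_fetch("*", "https://example.com/blog/post"))          # allowed: True
print(rp.crawl_delay("*"))                                         # 10
```

This catches overbroad Disallow or wildcard rules before they reach production.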