Robots.txt Generator
Control which parts of your site search engine crawlers can access
Configuration
Default Access (All Robots)
Allow Everything
Disallow Everything
Crawl-Delay (Seconds)
No Delay
5 Seconds
10 Seconds
20 Seconds
60 Seconds
Sitemap URL (Optional)
Specific Disallowed Paths (One per line)
Refresh Output
Generated Output
User-agent: *
Allow: /
Copy to Clipboard
Download robots.txt
How to use:
Set default access to "Allow Everything" for most sites.
List any paths you want to hide from crawlers (such as login or admin pages) in the disallowed-paths box, one per line.
Add your sitemap URL to help search engines like Google discover your pages.
Download the file and place it in your website's root folder so it is served at the root (e.g., example.com/robots.txt).
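Following the steps above, a generated file might look like this (the disallowed paths and sitemap URL are placeholders for illustration):

```
User-agent: *
Allow: /
Disallow: /admin/
Disallow: /login/
Crawl-delay: 10

Sitemap: https://example.com/sitemap.xml
```

Note that Crawl-delay is not part of the original robots.txt standard and is ignored by some crawlers, including Googlebot; use Google Search Console to manage crawl rate for Google instead.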