robots.txt Usage Guide
Installation
- Download the generated robots.txt file
- Upload it to your website's root directory; it must sit at the root, not in a subdirectory (see the example below)
- Confirm it is reachable at yoursite.com/robots.txt
- Test it in Google Search Console
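For reference, a generated file served from the root might look like the following sketch. The domain example.com, the /admin/ path, and the sitemap URL are all placeholders:

```
# Served from https://example.com/robots.txt — the domain root
User-agent: *
Disallow: /admin/

Sitemap: https://example.com/sitemap.xml
```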
Best Practices
- Keep rules simple and clear; complex patterns are easy to get wrong
- Test thoroughly before deploying
- Include your sitemap URLs so crawlers can discover them (see the sketch below)
- Monitor crawl behavior regularly, for example via Search Console's crawl stats
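A sketch of these practices combined: a short, readable rule set with sitemaps declared alongside it. The paths and URLs are placeholders:

```
# Simple, explicit rules are easier to verify and debug
User-agent: *
Disallow: /cart/
Disallow: /search/

# Declare every sitemap with an absolute URL
Sitemap: https://example.com/sitemap.xml
Sitemap: https://example.com/news-sitemap.xml
```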
Important Notes
- robots.txt is publicly accessible; anyone can read it
- Don't rely on it for security: a Disallow line advertises a path without protecting it (illustrated below)
- Some bots ignore the rules entirely; the file is advisory, not enforced
- Validate the file with the robots.txt report in Google Search Console
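To illustrate the security caveat: because the file is public, a Disallow line reveals the very path it names. The /private-reports/ path below is hypothetical:

```
# Anyone can read this file at /robots.txt, so this line
# advertises the path while doing nothing to protect it:
User-agent: *
Disallow: /private-reports/

# Use authentication (or noindex) for genuinely sensitive
# content; robots.txt only requests polite behavior.
```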
robots.txt Reference
Basic Directives
- User-agent: names the crawler(s) the rules that follow apply to
- Disallow: blocks crawling of matching paths
- Allow: explicitly permits paths, typically as exceptions within a disallowed section
- Sitemap: declares a sitemap's absolute URL
- Crawl-delay: requests a pause between visits (non-standard; Bing honors it, Google ignores it); all five appear together in the sketch below
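A sketch showing all five directives in one file. The URLs and paths are placeholders, and Crawl-delay support varies by crawler:

```
User-agent: *            # the rules below apply to every bot
Disallow: /admin/        # block this path prefix
Allow: /admin/public/    # carve out an exception inside it
Crawl-delay: 10          # ask for 10s between visits (Google ignores this)

Sitemap: https://example.com/sitemap.xml
```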
Common User Agents
- *: any crawler (wildcard; a bot's own section overrides it, as sketched below)
- Googlebot: Google search
- Bingbot: Microsoft Bing
- Slurp: Yahoo search
- DuckDuckBot: DuckDuckGo
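Per-bot sections override the wildcard group: a crawler that matches its own section ignores the * rules entirely, so shared rules must be repeated. A sketch with placeholder paths:

```
# Googlebot uses only this section, not the * group below
User-agent: Googlebot
Disallow: /beta/

# Bingbot likewise uses only its own section
User-agent: Bingbot
Crawl-delay: 5

# Every other bot falls through to the wildcard group
User-agent: *
Disallow: /beta/
Disallow: /staging/
```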
Path Patterns
- /: the entire site (every URL path begins with /)
- /admin/: the /admin/ directory and everything under it
- /*.pdf$: URLs ending in .pdf ($ anchors the match to the end of the URL)
- /api/: API endpoints; rules are prefix matches, so a trailing * is redundant
- /*?: URLs containing query parameters (combined in the sketch below)
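A sketch combining these patterns, with placeholder paths; * matches any sequence of characters and $ anchors the end of the URL:

```
User-agent: *
Disallow: /admin/       # the directory and everything under it
Disallow: /*.pdf$       # any URL ending in .pdf
Disallow: /api/         # prefix match; no trailing * needed
Disallow: /*?           # any URL containing a query string
```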