robots.txt Generator

Create SEO-friendly robots.txt files to control search engine crawling

Quick Presets

Start with a common robots.txt configuration
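
For example, the two most common presets are "allow everything" and "block everything"; an empty Disallow permits all crawling, while Disallow: / blocks the whole site:

  # Allow all crawlers
  User-agent: *
  Disallow:

  # Block all crawlers
  User-agent: *
  Disallow: /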

User Agent Rules

Define crawling rules for different search engines and bots

  • Allow rules: paths crawlers are explicitly permitted to fetch
  • Disallow rules: paths crawlers should not crawl
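
As an illustration, the group below blocks every crawler from a hypothetical /admin/ area while carving out one exception; Allow is most useful for re-opening a path inside a broader Disallow (the paths are placeholders):

  User-agent: *
  Disallow: /admin/
  Allow: /admin/help/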

Advanced Settings

Configure crawl delays, host preferences, and sitemaps

  • Crawl-delay: delay between requests (optional)
  • Host: preferred domain version
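
A sketch of these settings as directives, assuming a 10-second delay and www as the preferred domain; note that Crawl-delay is non-standard and ignored by Googlebot, and Host has historically been recognized only by some engines such as Yandex:

  User-agent: *
  Crawl-delay: 10

  Host: https://www.yoursite.com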

Sitemap URLs

Tell search engines where to find your sitemaps
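
Sitemap lines are independent of any User-agent group and may be listed more than once; the URLs below are placeholders:

  Sitemap: https://yoursite.com/sitemap.xml
  Sitemap: https://yoursite.com/sitemap-news.xml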

Custom Rules

Add any additional custom robots.txt directives
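
For example, custom directives might include comments or engine-specific extensions; the Clean-param line below is a Yandex-only directive and is shown purely as an illustration with placeholder values:

  # Manually added rules
  User-agent: Yandex
  Clean-param: sessionid /catalog/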

robots.txt Usage Guide

Installation

  1. Download the generated robots.txt file
  2. Upload it to your website's root directory
  3. Access via: yoursite.com/robots.txt
  4. Test with Google Search Console

Best Practices

  • Keep rules simple and clear
  • Test thoroughly before deploying
  • Include your sitemap URLs
  • Monitor crawl behavior regularly

Important Notes

  • robots.txt is publicly accessible
  • Don't rely on it for security
  • Some bots may ignore the rules
  • Validate with the robots.txt report in Google Search Console

robots.txt Reference

Basic Directives

  • User-agent: Name the crawler a group of rules applies to
  • Disallow: Block crawling of matching paths
  • Allow: Explicitly permit paths, overriding a broader Disallow
  • Sitemap: Declare sitemap URLs
  • Crawl-delay: Request a delay between visits (ignored by Googlebot)
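
Putting the five directives together, a minimal file might look like the sketch below (all paths and URLs are placeholders):

  User-agent: *
  Disallow: /private/
  Allow: /private/press/
  Crawl-delay: 5

  Sitemap: https://yoursite.com/sitemap.xml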

Common User Agents

  • *: All crawlers
  • Googlebot: Google search
  • Bingbot: Microsoft Bing
  • Slurp: Yahoo search
  • DuckDuckBot: DuckDuckGo
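
Each crawler can be given its own group of rules; a bot follows the most specific group that names it and falls back to the * group otherwise. A sketch with placeholder paths:

  User-agent: Googlebot
  Disallow: /search/

  User-agent: Bingbot
  Disallow: /drafts/

  User-agent: *
  Disallow: /tmp/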

Path Patterns

  • /: The entire site (every path under the root)
  • /admin/: Everything under the /admin/ folder
  • /*.pdf$: All PDF files ($ anchors the end of the URL)
  • /api/*: API endpoints under /api/
  • /*?: URLs containing query parameters
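
Combined in Disallow rules, these patterns look like the following; the * wildcard matches any sequence of characters and $ anchors the end of the URL, both supported by major crawlers such as Googlebot and Bingbot (paths are placeholders):

  User-agent: *
  Disallow: /admin/
  Disallow: /*.pdf$
  Disallow: /api/
  Disallow: /*?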