Robots.txt Generator

Generate a robots.txt file to control how search engines crawl your website.

Website URL

Used to auto-complete sitemap URLs

User Agents

Crawl Delay

Number of seconds crawlers should wait between requests (honored by Bing and Yandex; Googlebot ignores Crawl-delay)
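
For example, requesting a 10-second delay for all crawlers produces a group like this sketch:

    User-agent: *
    Crawl-delay: 10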

Sitemaps

Custom Directives

Any additional directives to include at the end of the file
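
As an illustration, custom directives can target crawlers the form above does not cover; the group below (using OpenAI's GPTBot as an example agent) would simply be appended verbatim to the end of the file:

    # Keep this example AI crawler out entirely
    User-agent: GPTBot
    Disallow: /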

Generated robots.txt

Configure the settings above to generate your robots.txt file...
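
A typical result might look like this sketch (example.com and the blocked path are placeholders, not generator defaults):

    User-agent: *
    Disallow: /admin/
    Crawl-delay: 10

    Sitemap: https://example.com/sitemap.xml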

Robots.txt Guide

🤖 User-agent

  • * - All crawlers
  • Googlebot - Google's crawler
  • Bingbot - Bing's crawler
  • facebookexternalhit - Facebook's link preview crawler
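
Each User-agent line opens a new rule group, and a crawler follows the most specific group that matches its name, ignoring the rest. In this sketch (paths are illustrative), Googlebot obeys only its own group, so it may still crawl /search/:

    User-agent: Googlebot
    Disallow: /drafts/

    User-agent: *
    Disallow: /drafts/
    Disallow: /search/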

📁 Path Patterns

  • / - Root directory
  • /admin/ - Admin directory
  • /*.pdf$ - All PDF files ($ anchors the end of the URL)
  • /*? - URLs with query parameters
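
Patterns take effect when combined with Allow or Disallow; for example (paths are illustrative):

    User-agent: *
    Disallow: /admin/        # the directory and everything under it
    Disallow: /*.pdf$        # any URL ending in .pdf
    Disallow: /*?            # any URL containing a query string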

⚙️ Directives

  • Allow - Permit crawling
  • Disallow - Block crawling
  • Crawl-delay - Delay between requests (in seconds)
  • Sitemap - Sitemap location
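
Allow is mostly used to carve an exception out of a broader Disallow; Google and Bing apply the longest (most specific) matching rule. A sketch with illustrative paths:

    User-agent: *
    Disallow: /admin/
    Allow: /admin/help/      # longer match wins, so /admin/help/ stays crawlable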

💡 Best Practices

  • Place at website root (/robots.txt)
  • Allow CSS/JS for proper rendering
  • Block admin and private areas
  • Include sitemap references
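
Once deployed, the file can be sanity-checked programmatically. This sketch uses Python's standard-library urllib.robotparser; the domain, paths, and user agents are placeholders:

    from urllib.robotparser import RobotFileParser

    # Fetch and parse the live file from the site root (placeholder domain).
    rp = RobotFileParser("https://example.com/robots.txt")
    rp.read()

    # Ask whether a given crawler may fetch a given URL.
    print(rp.can_fetch("Googlebot", "https://example.com/admin/"))  # False if /admin/ is disallowed
    print(rp.can_fetch("*", "https://example.com/"))                # True unless the root is blocked

    # Crawl-delay (Python 3.6+) and sitemap references (Python 3.8+), if declared.
    print(rp.crawl_delay("*"))
    print(rp.site_maps())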