Robots.txt Generator

Create a valid robots.txt file to control how search engines crawl your website.

About robots.txt

The robots.txt file tells search engine crawlers which URLs they can access on your site. It's placed in your website's root directory (e.g., https://example.com/robots.txt).
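A minimal example file, using the hypothetical `example.com` domain and illustrative paths, might look like this:

```text
User-agent: *
Disallow: /admin/
Allow: /admin/public/

Sitemap: https://example.com/sitemap.xml
```

This blocks all crawlers from `/admin/` while carving out an exception for `/admin/public/`, and points crawlers to the XML sitemap.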

Common Directives

  • User-agent — Specifies which crawler the rules apply to (* = all)
  • Allow — Permits crawling of a specific path
  • Disallow — Blocks crawling of a specific path
  • Sitemap — Points crawlers to your XML sitemap
  • Crawl-delay — Minimum time (seconds) between requests; non-standard, honored by Bing and Yandex but ignored by Google
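You can sanity-check how these directives combine using Python's standard-library `urllib.robotparser`. The sketch below parses a hypothetical rule set (the paths are illustrative, not from a real site) and asks whether specific URLs may be fetched. Note that Python's parser applies rules in file order, first match wins, so the `Allow` exception is listed before the broader `Disallow`:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules: Allow exception listed first,
# since Python's parser uses first-match-wins ordering.
rules = """\
User-agent: *
Allow: /admin/public/
Disallow: /admin/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# /admin/secret matches Disallow: /admin/ -> blocked
print(parser.can_fetch("*", "https://example.com/admin/secret"))
# /admin/public/page matches the earlier Allow rule -> permitted
print(parser.can_fetch("*", "https://example.com/admin/public/page"))
# /blog/post matches no rule -> permitted by default
print(parser.can_fetch("*", "https://example.com/blog/post"))
```

This is a quick way to verify a generated file before deploying it to your site's root directory.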