Robots.txt Generator is a free online tool that builds a valid robots.txt file from your custom rules. Define user-agent directives, disallow and allow paths, crawl delays, and sitemap URLs without writing code.

A robots.txt file is a plain text file placed at the root of a website that instructs search engine crawlers which pages or sections to crawl or skip. It is part of the Robots Exclusion Protocol (standardized as RFC 9309) and is read by well-behaved bots like Googlebot before they crawl a site.
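Compliant crawlers parse these rules before fetching any page. You can check how a given robots.txt file will be interpreted with Python's standard-library `urllib.robotparser`; the rules below are illustrative, not from any real site:

```python
from urllib import robotparser

# Illustrative robots.txt rules (hypothetical example)
rules = [
    "User-agent: *",
    "Disallow: /admin/",
]

rp = robotparser.RobotFileParser()
rp.parse(rules)  # parse() accepts the file contents as a list of lines

print(rp.can_fetch("*", "/admin/login"))  # False: blocked by Disallow
print(rp.can_fetch("*", "/blog/post-1"))  # True: no rule matches
```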

Robots.txt Generator

Generate a robots.txt file for your website with custom rules for search engine crawlers.

How to Use

  1. Set the User-agent field (use * for all bots or a specific bot name)
  2. Enter paths to disallow, one per line (e.g. /admin/)
  3. Optionally add allowed paths and a crawl delay
  4. Add your sitemap URL if available
  5. Click Generate robots.txt and copy the output to your site root
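The steps above can be sketched in a few lines of Python. This is a minimal illustration of the file the generator produces, not the tool's actual implementation, and the parameter names are assumptions:

```python
def build_robots_txt(user_agent="*", disallow=(), allow=(),
                     crawl_delay=None, sitemap=None):
    """Assemble robots.txt text from a single rule group plus an optional sitemap."""
    lines = [f"User-agent: {user_agent}"]
    lines += [f"Disallow: {path}" for path in disallow]
    lines += [f"Allow: {path}" for path in allow]
    if crawl_delay is not None:
        lines.append(f"Crawl-delay: {crawl_delay}")
    if sitemap:
        lines += ["", f"Sitemap: {sitemap}"]
    return "\n".join(lines) + "\n"

print(build_robots_txt(disallow=["/admin/"],
                       sitemap="https://example.com/sitemap.xml"))
```

Running this reproduces the output shown in the Example section below.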

Example

Input

User-agent: * | Disallow: /admin/ | Sitemap: https://example.com/sitemap.xml

Output

User-agent: *
Disallow: /admin/

Sitemap: https://example.com/sitemap.xml

FAQ

What is a robots.txt file?
A robots.txt file tells search engine crawlers which pages they are allowed or not allowed to crawl on your website.
Does robots.txt block all bots?
No. robots.txt is a guideline, not a security measure. Well-behaved crawlers honor it, but malicious bots may ignore it entirely. Use server-level access controls, such as authentication or IP restrictions, to truly block access.
Is this tool free?
Yes, it is completely free with no registration required. Everything runs in your browser.

Related Tools