Robots.txt Generator

Create simple robots.txt rules to guide search crawlers on which parts of a site they may crawl.

robots.txt

User-agent: *
Disallow: /admin
Disallow: /api/private

Sitemap: https://example.com/sitemap.xml

How the robots.txt generator works

Enter allow or disallow rules, add an optional sitemap URL, and copy the generated robots.txt file.
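The assembly step is straightforward string building. Here is a minimal sketch in Python; `build_robots_txt` is a hypothetical helper for illustration, not the tool's actual code:

```python
# Hypothetical sketch of what a robots.txt generator assembles:
# one User-agent line, then Allow/Disallow rules, then an optional Sitemap.

def build_robots_txt(user_agent="*", allow=(), disallow=(), sitemap=None):
    lines = [f"User-agent: {user_agent}"]
    lines += [f"Allow: {path}" for path in allow]
    lines += [f"Disallow: {path}" for path in disallow]
    if sitemap:
        lines += ["", f"Sitemap: {sitemap}"]
    return "\n".join(lines) + "\n"

print(build_robots_txt(
    disallow=["/admin", "/api/private"],
    sitemap="https://example.com/sitemap.xml",
))
```

Running this prints the same file shown in the example above.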

Robots.txt gives crawler instructions, but it is not a security or access-control feature.

Privacy

Developer inputs are handled locally for supported tools. Avoid pasting production secrets, private keys, or credentials into any tool unless you understand the risk.

Limitations

  • Robots.txt is public and should not reveal private paths you want to hide.
  • Well-behaved crawlers follow robots.txt, but malicious bots may ignore it.
  • Disallowing a page in robots.txt does not remove it from search results; search engines can still index its URL if other pages link to it.
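To see what a well-behaved crawler does with a set of rules, you can check them locally with Python's standard urllib.robotparser; this only simulates compliant behavior and enforces nothing:

```python
# Parse robots.txt rules locally and ask which URLs a compliant
# crawler would be allowed to fetch.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /admin
Disallow: /api/private
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

print(parser.can_fetch("*", "https://example.com/admin"))       # False
print(parser.can_fetch("*", "https://example.com/index.html"))  # True
```

A crawler that ignores robots.txt can still request /admin; the file is advisory only.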

FAQs

Where does robots.txt go?

Place it at the root of your site, such as https://example.com/robots.txt.

Can robots.txt protect private pages?

No. Use authentication or proper access controls for private content.

Should I include a sitemap?

Yes, adding your sitemap URL helps crawlers discover important pages.

Can I block all crawlers?

You can add rules asking every crawler to stay out, but compliance is voluntary; only well-behaved crawlers will honor them.
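A minimal robots.txt that asks all crawlers to skip the entire site looks like this:

User-agent: *
Disallow: /

The single "/" path matches every URL on the site. Note this still does not restrict access; use authentication for content that must stay private.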