How can we help you?
What should you watch out for when using the Crawler Control tool?
The golden rules:
- Always test before deploying changes. A single mistake can block your entire site from search engines.
- Expect delays. Search engines may take days or weeks to notice changes.
- Back up your existing robots.txt before editing it.
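The backup step can be as simple as one copy command. A minimal sketch (the scratch directory and the sample file contents are only for demonstration; in practice, run the `cp` line wherever your real robots.txt lives):

```shell
# Demo in a scratch directory so nothing real is touched.
cd "$(mktemp -d)"
printf 'User-agent: *\nDisallow:\n' > robots.txt   # stand-in for your real file

# Copy the current robots.txt to a timestamped backup before editing.
cp robots.txt "robots.txt.bak.$(date +%Y%m%d)"
ls robots.txt.bak.*
```

Keeping the date in the backup name lets you accumulate a history of versions instead of overwriting a single backup file.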
Best practices
- Start conservative. It’s easier to allow more later than to recover from blocking too much.
- Make sure your sitemap link actually works.
- Use the path testing feature to verify rules before going live.
- Monitor rankings and traffic after changes.
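Putting these practices together, a conservative robots.txt keeps the site open to crawlers, blocks only a few clearly private areas, and declares the sitemap. The paths and domain below are hypothetical placeholders:

```
# Hypothetical conservative starting point: allow everything
# except a few private sections, and declare the sitemap.
User-agent: *
Disallow: /admin/
Disallow: /cart/

Sitemap: https://www.example.com/sitemap.xml
```

Before publishing, run each of the disallowed paths (and a few important public pages) through the path testing feature, and open the Sitemap URL in a browser to confirm it returns your sitemap rather than an error page.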
Resources you should never block
Blocking CSS, JavaScript, or image files from Googlebot prevents it from rendering your pages properly and can hurt mobile-first indexing.
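For example, rules like the following (with hypothetical paths) would cut Googlebot off from rendering resources and should be avoided:

```
# Do NOT do this: blocking asset files breaks page rendering for Googlebot.
User-agent: Googlebot
Disallow: /*.css$
Disallow: /*.js$
Disallow: /images/
```

If an existing rule like this is already in place, removing it (or adding explicit Allow rules for asset paths) is usually the fix.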
Blocking AI training crawlers safely
If you want to prevent AI companies from training on your content, block their training bots while still allowing search engines like Googlebot and Bingbot.
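The sketch below shows one way to do this with per-user-agent rules. GPTBot, CCBot, and Google-Extended are real AI-training user-agent tokens at the time of writing, but the list of AI crawlers changes over time, so treat it as a starting point rather than a complete inventory:

```
# Block common AI training crawlers.
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Google-Extended
Disallow: /

# Search crawlers remain fully allowed.
User-agent: Googlebot
Allow: /

User-agent: Bingbot
Allow: /
```

Because crawlers follow the most specific user-agent group that matches them, the explicit Googlebot and Bingbot groups keep normal search indexing unaffected by the blocks above.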