
Site Load Tester – version 1.3.2
New: Crawler Control Tool
This release adds a Crawler Control tool to help you understand and manage how crawlers interact with your site. It pairs a single shared URL input with a two‑tab interface: one tab for robots.txt analysis and one for llms.txt.
robots.txt
- Automatically fetches and parses robots.txt from any URL
- Lists all detected user‑agents and their rules
- Shows Allow/Disallow rules, crawl delays, and sitemap directives
- Highlights blocked crawlers and their restrictions
- Flags syntax errors, warnings, and best‑practice issues
- Lets you test specific URLs to see whether they are allowed or blocked (see the sketch after this list)
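
If you want to reproduce the URL test outside the tool, Python's standard urllib.robotparser covers the same ground: fetch and parse robots.txt, check a URL per user‑agent, and read crawl‑delay and sitemap directives. The sketch below is illustrative only; the site URL, user agents, and paths are placeholder assumptions, and the tool's own parser may behave differently.

```python
# Minimal sketch of a robots.txt URL test using the standard library.
# SITE, the user agents, and the paths are placeholders, not values
# taken from Site Load Tester itself.
from urllib.robotparser import RobotFileParser

SITE = "https://example.com"  # placeholder site

robots = RobotFileParser()
robots.set_url(f"{SITE}/robots.txt")
robots.read()  # fetch and parse the live file

for agent in ("Googlebot", "GPTBot", "*"):
    for path in ("/", "/private/page.html"):
        allowed = robots.can_fetch(agent, f"{SITE}{path}")
        print(f"{agent:10} {path:22} {'allowed' if allowed else 'blocked'}")
    delay = robots.crawl_delay(agent)  # None if no Crawl-delay directive
    if delay:
        print(f"{agent}: crawl delay {delay}s")

print("Sitemaps:", robots.site_maps())  # Sitemap: directives, or None
```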
llms.txt
- Detects AI and LLM crawlers like GPTBot, Claude‑Web, and Anthropic‑AI
- Shows file metadata (size, last modified, content type); see the sketch after this list
- Provides a raw content preview
- Offers recommendations for better AI crawler control
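
The metadata and preview checks amount to ordinary HTTP requests: a HEAD request for the headers the tool reports, then a short GET for the raw content preview. Here is a minimal sketch using Python's urllib; the site URL, timeout, and preview length are placeholder assumptions, not the tool's actual values.

```python
# Hedged sketch of the llms.txt metadata check and content preview.
# SITE, the 10-second timeout, and the 2 KB preview cap are placeholders.
import urllib.request

SITE = "https://example.com"  # placeholder site

head = urllib.request.Request(f"{SITE}/llms.txt", method="HEAD")
with urllib.request.urlopen(head, timeout=10) as resp:
    print("Size:         ", resp.headers.get("Content-Length", "unknown"))
    print("Last modified:", resp.headers.get("Last-Modified", "unknown"))
    print("Content type: ", resp.headers.get("Content-Type", "unknown"))

with urllib.request.urlopen(f"{SITE}/llms.txt", timeout=10) as resp:
    preview = resp.read(2048).decode("utf-8", errors="replace")
    print("--- preview ---")
    print(preview)
```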
UI improvements
- Tab‑based navigation between robots.txt and llms.txt
- Color‑coded status badges for quick scanning
- Scrollable result sections for easier review