About LLMs.txt Generator
A purpose-built tool for AI marketing agencies to generate spec-compliant llms.txt files for client sites.
What is LLMs.txt?
LLMs.txt is a standardized format that lets websites present their content in a clean, structured way optimized for AI assistants and language models. As AI becomes a primary way people discover and interact with information, having an llms.txt file ensures your clients' content is accurately found and cited by those systems.
The format follows the official llms.txt specification, ensuring compatibility across AI systems and applications.
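To make the format concrete, here is a minimal sketch of an llms.txt file following the spec's conventions (an H1 title, an optional blockquote summary, and H2 sections containing link lists); the site name and URLs are illustrative, not from any real client:

```markdown
# Example Client Co

> A short one-line summary of what the site offers.

## Docs

- [Getting Started](https://example.com/docs/start): How to set up an account
- [API Reference](https://example.com/docs/api): Endpoints and authentication

## Optional

- [Blog](https://example.com/blog): Company news and announcements
```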
How the Pipeline Works
Jobs run fully in the background through a four-step pipeline — so you can submit a job for one client and immediately move on to the next.
1. Map — Discovers every URL on the site (up to 10,000 pages).
2. Scrape — Crawls each page and extracts clean content, stripping navigation, ads, and boilerplate. Failed pages are automatically retried.
3. AI Describe — Optionally generates AI-written titles and descriptions for pages that are missing metadata.
4. AI Format — Optionally uses AI to organize pages into logical sections, producing a polished, structured llms.txt file.
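The four steps above compose into a simple linear flow. The sketch below illustrates that shape in Python with stubbed stages; all function names and data fields are hypothetical and do not reflect the product's actual API:

```python
# Illustrative sketch of the four-step pipeline. Each stage is stubbed;
# a real implementation would crawl the site and call an AI model.

def map_site(base_url, limit=10_000):
    """Step 1: discover URLs on the site, capped at `limit` pages (stubbed)."""
    urls = [f"{base_url}/page-{i}" for i in range(3)]
    return urls[:limit]

def scrape(url):
    """Step 2: fetch a page and strip boilerplate (stubbed: no metadata found)."""
    return {"url": url, "title": None, "description": None}

def ai_describe(page):
    """Step 3: fill in missing titles/descriptions with AI-generated text (stubbed)."""
    page = dict(page)
    page["title"] = page["title"] or f"Title for {page['url']}"
    page["description"] = page["description"] or "AI-generated summary."
    return page

def ai_format(pages):
    """Step 4: organize pages into sections and emit llms.txt-style markdown."""
    lines = ["# Example Site", "", "## Pages"]
    for p in pages:
        lines.append(f"- [{p['title']}]({p['url']}): {p['description']}")
    return "\n".join(lines)

def run_job(base_url):
    """Run all four stages for one site and return the finished file text."""
    pages = [ai_describe(scrape(u)) for u in map_site(base_url)]
    return ai_format(pages)

print(run_job("https://example.com"))
```

Because each stage only consumes the previous stage's output, jobs for different clients can run side by side in the background, which is what lets you queue one job and immediately start the next.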
You're emailed when the job completes. Every job stays in your history — re-download any client file at any time without re-running the job.
Output Formats
Every job produces up to three versions of the llms.txt file depending on which features are enabled:
Raw — Scraped page titles and descriptions exactly as found on the site. Available on all plans including free.
AI Titles & Descriptions — Same structure as raw, but pages missing metadata get AI-generated titles and descriptions. Paid plans only.
AI Formatted — Pages are organized into logical sections with a clean hierarchy — the most polished deliverable for clients. Paid plans only.
All three versions are available to download from your job history.
Standalone Tools
Format an existing file: Already have an llms.txt file from a previous generation or another tool? Use the standalone formatting tool to organize it into sections with a custom prompt.
Retry failed URLs: If a job completed but some pages failed to scrape, use the retry tool to reprocess only those URLs and merge the results back into your existing file.
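The merge step in the retry tool can be pictured as replacing stale link entries by URL and appending any newly recovered pages. This is a rough sketch under assumed data shapes (a dict per page with `url`, `title`, and `description` keys), not the tool's actual implementation:

```python
import re

# Matches llms.txt-style link lines: "- [Title](url): description"
LINK_RE = re.compile(r"^- \[(?P<title>[^\]]*)\]\((?P<url>[^)]*)\)")

def merge_retries(llms_txt, retried_pages):
    """Merge re-scraped pages back into an existing llms.txt document.

    Entries whose URL matches a retried page are replaced in place;
    retried URLs not already present are appended at the end.
    """
    by_url = {p["url"]: p for p in retried_pages}
    out, seen = [], set()
    for line in llms_txt.splitlines():
        m = LINK_RE.match(line)
        if m and m.group("url") in by_url:
            p = by_url[m.group("url")]
            out.append(f"- [{p['title']}]({p['url']}): {p['description']}")
            seen.add(p["url"])
        else:
            out.append(line)
    for url, p in by_url.items():
        if url not in seen:
            out.append(f"- [{p['title']}]({url}): {p['description']}")
    return "\n".join(out)
```

Keying the merge on URLs means the rest of the file, including its section structure, is left untouched, which is why only the failed pages need to be reprocessed.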