Free Tool — No Signup Required

Generate your llms.txt in 30 seconds

We crawl your site, classify your pages, and write contextual descriptions automatically. Deploy the result and AI crawlers such as GPTBot, ClaudeBot, and PerplexityBot know exactly how to read your content.

Version 1.0 | Published 15 April 2026 | Last verified: 15 April 2026 | Source: citedbyai.info AI Visibility Intelligence

Enter your root domain — no https:// needed. We crawl your sitemap automatically.

Required. One free generation per domain. We will send you the files.


How to deploy

1
Upload llms.txt to your site root so it is accessible at https://yourdomain.com/llms.txt
2
Upload llms-full.txt to the same location — accessible at https://yourdomain.com/llms-full.txt
3
Add this line to your robots.txt: LLMs-txt: https://yourdomain.com/llms.txt
4
Verify both files load as plain text in your browser. GPTBot, ClaudeBot, and PerplexityBot will discover them within 24-48 hours.
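For reference, here is a minimal llms.txt following the llmstxt.org conventions (an H1 site name, a blockquote summary, then H2 sections of annotated links). The company name, URLs, and descriptions below are placeholders, not output from the generator:

```markdown
# Example Company

> One-sentence summary of what the site offers and who it serves.

## Services

- [Web Design](https://yourdomain.com/services/web-design): Custom design for small-business sites
- [SEO](https://yourdomain.com/services/seo): Technical and on-page search optimisation

## FAQ

- [Pricing FAQ](https://yourdomain.com/faq/pricing): Answers to common pricing questions
```

Your deployed file should load exactly like this as plain text at https://yourdomain.com/llms.txt.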

This is the infrastructure layer. The audit is the intelligence layer.

llms.txt tells AI crawlers where to look. A full CPS® audit tells you which of those pages will actually be cited — and fixes the ones that won't.

Get the Full CPS® Audit →

How it works

01 —

Live Crawl

We fetch your sitemap.xml server-side. No CORS issues. No manual URL entry required.
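The sitemap step can be sketched with Python's standard library. This is an illustrative parser, not our production crawler; in production the XML is fetched server-side from your domain, so here a small inline sample stands in for the fetched document:

```python
import xml.etree.ElementTree as ET

# Inline stand-in for a fetched https://yourdomain.com/sitemap.xml
SITEMAP_XML = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://yourdomain.com/services/web-design</loc></url>
  <url><loc>https://yourdomain.com/about</loc></url>
</urlset>"""

def extract_urls(xml_text: str) -> list[str]:
    """Pull every <loc> URL out of a standard sitemap.xml document."""
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.findall("sm:url/sm:loc", ns)]

urls = extract_urls(SITEMAP_XML)
print(urls)  # the two URLs from the sample sitemap
```

Because the fetch happens server-side, none of this runs in your browser, which is why there are no CORS issues.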

02 —

URL Classification

Each URL is classified into services, about, locations, case studies, FAQs, blog, or legal — exactly how AI crawlers expect content to be organised.
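A simple keyword-on-path classifier conveys the idea; our actual classification logic may be more involved, and the keyword lists below are illustrative assumptions. The section names mirror the categories listed above:

```python
from urllib.parse import urlparse

# Hypothetical keyword lists: first matching section wins
SECTION_KEYWORDS = {
    "services": ("service",),
    "about": ("about", "team"),
    "locations": ("location",),
    "case studies": ("case-stud", "portfolio"),
    "faqs": ("faq",),
    "blog": ("blog", "news"),
    "legal": ("privacy", "terms", "legal"),
}

def classify(url: str) -> str:
    """Assign a URL to a section based on keywords in its path."""
    path = urlparse(url).path.lower()
    for section, keywords in SECTION_KEYWORDS.items():
        if any(k in path for k in keywords):
            return section
    return "other"

print(classify("https://yourdomain.com/services/seo"))  # services
print(classify("https://yourdomain.com/blog/post-1"))   # blog
```

Grouping URLs this way is what lets the generator emit the H2 sections AI crawlers expect.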

03 —

AI Descriptions

Claude Haiku writes a one-sentence contextual description for each priority page — not a generic label but a real description.

04 —

Two Files

llms.txt covers priority content. llms-full.txt covers every indexable page. Both follow the llmstxt.org spec exactly.