llms.txt Generator
Paste a URL, fetch its content via a CORS proxy, and generate llms.txt or llms-full.txt
What is llms.txt?
llms.txt is a Markdown text file placed in the website root directory (similar to robots.txt), specifically designed for large language models (LLMs) and AI crawlers. It summarizes the website's core information and page links in a structured, readable way, helping AI systems like ChatGPT, Claude, and Perplexity quickly understand the site's content structure and purpose.
This specification was proposed by Jeremy Howard of Answer.AI in 2024, aiming to provide a standardized 'website manual' for the AI era.
llms.txt:
- Contains the website name and a one-line description
- Lists links to, and summaries of, all important pages
- Small enough for an AI to read in full within its context window
- Suitable for everyday use on most websites

llms-full.txt:
- Contains the full Markdown body content of each page
- Lets an AI get all the information without visiting the original pages
- Larger file size; suited to documentation sites or sites with few pages
- Ideal when an AI needs a deep understanding of each page's details
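To make the structure concrete, here is a minimal llms.txt sketch following the proposed spec (the site name, section names, and URLs are placeholders, not output of this tool):

```markdown
# Example Site

> One-line description of what the site offers.

## Docs

- [Getting Started](https://example.com/docs/start): Installation and setup
- [API Reference](https://example.com/docs/api): Endpoint documentation

## Optional

- [Changelog](https://example.com/changelog): Release history
```

An H1 title, a blockquoted summary, and H2 sections of link lists are the core elements; an "Optional" section marks pages an AI may skip when context is tight.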
What is it for?
🔍 Boost AI Search Visibility
When users ask questions in AI search engines like ChatGPT or Perplexity, llms.txt helps AI more accurately understand and cite your website content.
⚡ Lower Crawling Costs
AI crawlers can read structured Markdown directly without parsing HTML page by page, reducing server load and improving indexing efficiency.
📋 Unified Content Entry
Provides a standardized content entry point for AI agents and RAG systems, making it easy to integrate into various AI workflows.
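As a sketch of that integration, the snippet below parses the link entries of an llms.txt file into (title, url, description) tuples that a RAG pipeline could ingest. The parsing function and the sample text are illustrative assumptions, not part of any standard library for the format:

```python
import re

def parse_llms_txt(text: str):
    """Extract (title, url, description) entries from llms.txt Markdown.

    Pages are listed as Markdown links, optionally followed by a
    colon-separated summary, e.g. "- [Docs](https://example.com/docs): API docs".
    """
    pattern = re.compile(r"-\s*\[([^\]]+)\]\(([^)]+)\)(?::\s*(.*))?")
    entries = []
    for line in text.splitlines():
        m = pattern.match(line.strip())
        if m:
            title, url, desc = m.groups()
            entries.append((title, url, desc or ""))
    return entries

sample = """# Example Site
> A short summary.

## Docs
- [Getting Started](https://example.com/start): Setup guide
- [API Reference](https://example.com/api)
"""

for entry in parse_llms_txt(sample):
    print(entry)
```

Because the format is plain Markdown with a predictable shape, a few lines of regex are enough; no HTML parsing is involved.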
Usage tip: After generation, upload the llms.txt file to your website's root directory (e.g., https://yoursite.com/llms.txt) and add Sitemap: https://yoursite.com/llms.txt to your robots.txt to improve discoverability (the Sitemap directive requires an absolute URL). Update the file whenever your site's content changes.
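Following that tip, a robots.txt referencing the generated file might look like this (the domain is a placeholder; reusing the Sitemap directive for llms.txt is the convention suggested above, not part of the robots.txt standard):

```
User-agent: *
Allow: /

Sitemap: https://yoursite.com/llms.txt
```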
