How to Create llms.txt
This tutorial walks you through creating and deploying an llms.txt file for your website. By the end, AI systems will be able to quickly understand what your site contains.
To create llms.txt, place a Markdown-formatted file at your site root (/llms.txt) containing your site name, description, content sections, and key page links with descriptions. Keep it concise and update it when content changes.
Overview
What you'll build: A machine-readable content index for AI systems.
Who this is for: Developers, technical SEOs, and site owners who want AI systems to better understand their site.
Time required: 15–30 minutes for initial setup.
Requirements
- Access to your site's root directory or public folder
- A text editor
- Knowledge of your site's most important pages
- Basic Markdown familiarity
Step-by-Step
Step 1: Create the file
Create a new file named llms.txt in your site's root directory (or the public folder your framework serves from).
Next.js / React:
your-project/
├── public/
│ └── llms.txt ← create here
WordPress:
Upload to your root directory (same level as wp-config.php), or use a plugin.
Static sites:
Place in your build output directory root.
Step 2: Write the header
Start with your site name and description:
# Your Site Name
> A brief description of your site, what it covers,
> and who it's for. Keep this to 1-2 sentences.
The # heading is your site's identity for AI systems. The > blockquote provides context.
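If you script your build, this header can be generated. Below is a minimal Python sketch; the site name, description, and the public/ output path are placeholders to replace with your own:

```python
from pathlib import Path

# Placeholder site details -- replace with your own.
SITE_NAME = "Your Site Name"
SITE_DESCRIPTION = (
    "A brief description of your site, what it covers, "
    "and who it's for."
)

def write_llms_txt_header(root: Path) -> Path:
    """Write a starter llms.txt with the # heading and > blockquote."""
    root.mkdir(parents=True, exist_ok=True)
    path = root / "llms.txt"
    lines = [f"# {SITE_NAME}", f"> {SITE_DESCRIPTION}", ""]
    path.write_text("\n".join(lines), encoding="utf-8")
    return path

# "public" assumes a Next.js-style layout; adjust for your framework.
path = write_llms_txt_header(Path("public"))
print(path.read_text(encoding="utf-8"))
```

You can then append the ## sections from the later steps to the same file.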
Step 3: Add content sections
Group your pages by topic using ## headings:
## Getting started
- [Introduction](https://yoursite.com/intro): What this
product does and why it exists.
- [Quick Start](https://yoursite.com/quickstart): Get
running in under 5 minutes.
## Core concepts
- [Authentication](https://yoursite.com/auth): How user
authentication and API keys work.
- [Data Models](https://yoursite.com/models): Core data
structures and relationships.
Step 4: Add page entries
Each page entry follows this format:
- [Page Title](Full URL): One-sentence description
Rules for page entries:
- Use full URLs (not relative paths)
- Keep descriptions to one sentence
- Front-load the most important pages
- Describe what the page covers, not what format it's in
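These rules lend themselves to a quick automated check. The sketch below is illustrative rather than part of any spec: the regex and the one-sentence heuristic are rough approximations.

```python
import re

# Matches "- [Title](URL): description"
ENTRY_RE = re.compile(r"^- \[(?P<title>[^\]]+)\]\((?P<url>[^)]+)\): (?P<desc>.+)$")

def check_entry(line: str) -> list[str]:
    """Return a list of problems with one page-entry line (empty if OK)."""
    m = ENTRY_RE.match(line)
    if not m:
        return ["does not match '- [Title](URL): description'"]
    problems = []
    if not m.group("url").startswith(("http://", "https://")):
        problems.append("URL is not absolute")
    # Rough one-sentence check: no period other than a trailing one.
    if "." in m.group("desc").rstrip("."):
        problems.append("description looks like more than one sentence")
    return problems

print(check_entry("- [Quick Start](https://yoursite.com/quickstart): Get running in under 5 minutes."))  # []
print(check_entry("- [Intro](/intro): What this product does. And why."))
```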
Good:
- [What Is GEO?](https://geodocs.dev/geo/what-is-geo): Canonical
definition of Generative Engine Optimization.
Avoid:
- [Blog Post #47](https://geodocs.dev/blog/47): A blog post
about GEO published in April 2025.
Step 5: Add a usage policy (optional but recommended)
## Usage policy
Content from this site may be used by AI systems for
generating responses. Attribution to [Your Site Name]
(https://yoursite.com) is requested when citing specific
content. For full terms, see our Terms of Service.
Step 6: Deploy and verify
Deploy your site and verify the file is accessible:
https://yoursite.com/llms.txt
Visit this URL directly. You should see your raw Markdown served as plain text, not rendered as HTML.
Complete Example
# Acme Developer Docs
> Acme provides a REST API for building payment
> integrations. These docs cover authentication,
> endpoints, webhooks, and SDKs.
## Getting started
- [Quick Start](https://docs.acme.com/quickstart):
Set up your first payment integration in 5 minutes.
- [Authentication](https://docs.acme.com/auth): API
key setup and OAuth 2.0 configuration.
## Core API
- [Payments](https://docs.acme.com/payments): Create,
capture, and refund payments.
- [Webhooks](https://docs.acme.com/webhooks): Receive
real-time event notifications.
- [Error Handling](https://docs.acme.com/errors): Error
codes, retry logic, and debugging.
## SDKs
- [Node.js SDK](https://docs.acme.com/sdk/node):
JavaScript/TypeScript library for server-side use.
- [Python SDK](https://docs.acme.com/sdk/python):
Python library for backend integrations.
## Usage policy
Content may be referenced by AI systems. Attribution
to Acme (https://acme.com) is required for direct quotes.
Validation
After deploying, verify your llms.txt:
- Accessibility: Visit https://yoursite.com/llms.txt and confirm it loads as plain text
- Format: Check that headings, links, and descriptions parse correctly
- Completeness: Ensure all major content areas are represented
- URLs: Click through each link to verify they resolve correctly
- Crawler access: Check that robots.txt doesn't block /llms.txt
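Parts of this checklist can be automated. Here is a sketch using only Python's standard library; the specific checks and the size threshold are suggestions, not requirements of the format.

```python
from urllib.request import urlopen

def check_llms_txt(body: str) -> list[str]:
    """Run basic format checks on llms.txt content; return problems found."""
    problems = []
    if not body.lstrip().startswith("# "):
        problems.append("missing # site-name heading")
    if "](http" not in body:
        problems.append("no absolute links found")
    if len(body) > 50_000:  # arbitrary cutoff: curate, don't dump a sitemap
        problems.append("file is unusually large")
    return problems

def verify_llms_txt(base_url: str) -> list[str]:
    """Fetch /llms.txt from a deployed site and check it."""
    url = base_url.rstrip("/") + "/llms.txt"
    with urlopen(url, timeout=10) as resp:
        return check_llms_txt(resp.read().decode("utf-8"))

# verify_llms_txt("https://yoursite.com")  # run against your deployed site
print(check_llms_txt("# Acme Docs\n> API docs.\n- [Auth](https://docs.acme.com/auth): API keys."))  # []
```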
Common Mistakes
Including every single page. llms.txt should be a curated index of your most important content, not a complete sitemap. Keep it under 100 entries.
Using relative URLs. Always use full, absolute URLs so AI systems don't need to resolve paths.
Writing vague descriptions. "An article about our product" tells AI nothing. Write specific descriptions: "How to configure webhooks for real-time payment notifications."
Forgetting to update. Set a reminder to update llms.txt when you add or restructure significant content. Stale files reduce AI system trust.
Blocking AI crawlers in robots.txt. If your robots.txt blocks GPTBot, PerplexityBot, or ClaudeBot from accessing llms.txt, the file is useless. Ensure AI crawlers can access it.
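You can test crawler access before deploying with Python's standard-library robots.txt parser. The sample robots.txt below deliberately blocks GPTBot to show a failing case:

```python
from urllib.robotparser import RobotFileParser

def ai_crawlers_allowed(robots_txt: str, path: str = "/llms.txt") -> dict[str, bool]:
    """Check whether common AI crawlers may fetch a path under this robots.txt."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    bots = ["GPTBot", "PerplexityBot", "ClaudeBot"]
    return {bot: parser.can_fetch(bot, path) for bot in bots}

robots = """User-agent: GPTBot
Disallow: /llms.txt

User-agent: *
Disallow:
"""
print(ai_crawlers_allowed(robots))
```

If any crawler comes back False, adjust the corresponding Disallow rules before relying on llms.txt.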
Framework-Specific Notes
Next.js (App Router)
Place llms.txt in the public/ directory. It will be served at the root automatically.
Gatsby
Place in the static/ directory.
WordPress
Upload via FTP to the root directory, or use a plugin that manages root files.
Custom server
Configure your server to serve llms.txt as text/plain at the root path.
Related References
- llms.txt Reference — Full specification and format details
- ai.txt Reference — AI agent access policy standard
- What Is GEO? — Why llms.txt matters for AI visibility