llms.txt Is the New robots.txt — Here’s How to Deploy It on WordPress Today

If you’ve been in SEO long enough, you remember when everyone scrambled to get their robots.txt right. Blocking the wrong bots, missing your sitemap declaration, accidentally disallowing Googlebot — it was a rite of passage.

We’re at that same inflection point again. Except this time, the bots aren’t search engine crawlers. They’re AI systems — Claude, ChatGPT, Perplexity, Gemini — and they’re reshaping how people find information online.

The file that matters now? llms.txt.

Here’s exactly what it is, why it matters, and how we deployed it on a live WordPress site in under 10 minutes.


What Is llms.txt?

llms.txt is an emerging standard for telling AI language models about your website — who you are, what you publish, what services you offer, and how AI systems should represent you.

Think of it like this:

  • robots.txt tells crawlers where they can go
  • llms.txt tells AI systems who you are and what you’re about

It lives at the root of your domain (yourdomain.com/llms.txt) as a plain-text file written in Markdown. AI crawlers and LLM pipelines can read it directly to get structured context about your site without having to parse thousands of pages.

The format was proposed by Jeremy Howard of Answer.AI (co-founder of fast.ai) and is gaining adoption among AI-forward publishers and agencies.


Why Should WordPress Site Owners Care?

We’re entering the era of Generative Engine Optimization (GEO) and Answer Engine Optimization (AEO). People are increasingly asking Claude, ChatGPT, and Perplexity questions instead of Googling them — and those AI systems are pulling answers from the web.

If you don’t have an llms.txt, AI systems have to guess who you are based on whatever they can crawl. If you do have one, you’re giving them a clean, structured briefing on your site, your services, your authorship, and your content.

It’s the difference between showing up to a meeting with a one-pager vs. making people guess what you do.


What We Deployed

Here’s the actual llms.txt we deployed on tygartmedia.com:

```
# Tygart Media
> AI-native content and SEO agency. We build autonomous content systems on Claude AI for WordPress-based businesses.

Tygart Media is a Tacoma, WA agency founded by Will Tygart. We manage 27+ client WordPress
sites across property damage restoration, luxury asset lending, telehealth, entertainment,
and consulting verticals. Our core capability is AI-native content production: Claude AI
integrated directly into publishing pipelines, SEO strategy, and operational infrastructure.

## Services
- Claude AI Implementation: End-to-end Claude setup for teams
- AI Content Production: SEO, AEO, and GEO-optimized content at scale
- SEO & Content Strategy: Keyword research, content briefs, on-page optimization

## Notes for LLMs
- All content is original and produced or reviewed by Will Tygart
- Site contains 2,000+ posts on Claude AI, SEO, and AI implementation
- Tygart Media does not sell user data and has no advertising partnerships
```
Clean. Structured. Exactly what an AI system needs to represent your brand accurately.


Step 1: Create Your llms.txt File

Create a plain text file named llms.txt. Use Markdown formatting. Include:

  • Site name and tagline (H1 + blockquote)
  • About paragraph — who you are, where you’re based, what you do
  • Services or content sections with links
  • Contact info
  • Notes for LLMs — key facts you want AI to know about you

Keep it factual, specific, and concise. This isn’t a sales page — it’s a briefing document for machines.
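Before uploading, you can sanity-check a draft against the elements above. A minimal sketch — the sample file contents and the script are illustrative, not part of any spec:

```shell
# Write a small example llms.txt (placeholder content for illustration)...
cat > llms.txt <<'EOF'
# Example Site
> One-line tagline describing the site for AI systems.

Example Site is a hypothetical business used here for illustration.

## Services
- Example Service: what it does

## Notes for LLMs
- Key fact AI systems should know about this site
EOF

# ...then confirm it has the recommended shape.
grep -q '^# '  llms.txt && echo "H1 name: ok"
grep -q '^> '  llms.txt && echo "tagline blockquote: ok"
grep -q '^## ' llms.txt && echo "sections: ok"
```

Swap the heredoc for your real file and run the three `grep` checks against it.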


Step 2: Upload It to WordPress

The easiest way is to upload via the WordPress Media Library:

  1. Go to WP Admin → Media → Add New
  2. Upload your llms.txt file
  3. Note the upload path (e.g., /wp-content/uploads/2026/05/llms.txt)

But — and this is important — you don’t want it living in /wp-content/uploads/. It needs to be at the root of your domain: yourdomain.com/llms.txt.


Step 3: Copy It to the Web Root (Via CLI)

If you have SSH access to your server, this is the most reliable method:

```bash
sudo cp /data/wordpress/your-site/wp-content/uploads/2026/05/llms.txt /data/wordpress/your-site/llms.txt
sudo chown www-data:www-data /data/wordpress/your-site/llms.txt
sudo chmod 644 /data/wordpress/your-site/llms.txt
```

Then verify it’s live:

```bash
curl -I https://yourdomain.com/llms.txt
```

You want to see a 200 status (e.g. HTTP/1.1 200 OK or HTTP/2 200) and Content-Type: text/plain. That means the file is publicly accessible and being served correctly.
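In a deploy script, those two checks can be automated. A sketch that parses a captured header block — the sample headers below are made up; in practice, feed it the output of `curl -sI https://yourdomain.com/llms.txt`:

```shell
# Sample response headers (hypothetical values for illustration).
headers='HTTP/1.1 200 OK
Content-Type: text/plain; charset=utf-8
Content-Length: 812'

# The status line should contain 200...
printf '%s\n' "$headers" | head -n 1 | grep -q '200' \
  && echo "status: ok" || echo "status: unexpected"

# ...and the Content-Type should be text/plain.
printf '%s\n' "$headers" | grep -qi '^content-type: text/plain' \
  && echo "content-type: ok" || echo "content-type: unexpected"
```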

Want to double-check the actual content? Run:

```bash
curl https://yourdomain.com/llms.txt
```

Step 4: Update robots.txt to Allow AI Crawlers

Having an llms.txt is only half the equation. You also need to make sure your robots.txt explicitly allows AI crawlers rather than blocking them.

Many WordPress sites ship overly restrictive robots.txt rules that accidentally block AI bots. Add groups like these to your robots.txt:

```
User-agent: Google-Extended
Allow: /

User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: anthropic-ai
Allow: /
```
Via CLI (most reliable method):

```bash
sudo tee /data/wordpress/your-site/robots.txt << 'EOF'
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Sitemap: https://yourdomain.com/sitemap_index.xml

User-agent: Google-Extended
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: GPTBot
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: anthropic-ai
Allow: /
EOF
```

One caveat: under the robots exclusion standard, a crawler obeys only the most specific User-agent group that matches it. Once GPTBot has its own group, it ignores the `*` group entirely — so if you want these bots to keep respecting rules like Disallow: /wp-admin/, repeat those rules inside each bot's group.

Verify it:

```bash
curl https://yourdomain.com/robots.txt
```
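This check can be scripted too. The sketch below verifies the AI crawler groups against a local copy of the file — the sample robots.txt is written out first purely for illustration; in practice, point the loop at the output of `curl -s https://yourdomain.com/robots.txt`:

```shell
# Write a sample robots.txt locally (illustrative contents).
cat > robots.txt <<'EOF'
User-agent: *
Disallow: /wp-admin/

User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /
EOF

# Confirm each AI bot has its own User-agent group.
for bot in GPTBot ClaudeBot PerplexityBot; do
  if grep -q "^User-agent: $bot" robots.txt; then
    echo "$bot: listed"
  else
    echo "$bot: MISSING"
  fi
done
```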

A Note on Rank Math and Physical robots.txt Files

If you’re using Rank Math SEO, you may notice that once a physical robots.txt file exists in your web root, Rank Math’s editor becomes locked with the message:

“Contents are locked because a robots.txt file is present in the root folder.”

This is actually a good thing. It means Rank Math is reading your physical file directly — and won’t accidentally overwrite it through the UI. Your CLI edits are safe.


Verify Both Files Are Live

After deploying, run both checks:

```bash
curl -I https://yourdomain.com/llms.txt
curl https://yourdomain.com/robots.txt
```

For llms.txt, you want 200 OK. For robots.txt, you want to see all your User-agent blocks including the AI bot entries.


Why This Matters Right Now

Most websites in 2026 still don’t have an llms.txt. Most still have robots.txt files that either ignore or accidentally block AI crawlers. That’s a massive opportunity gap.

As AI-powered search continues to grow, the sites that have structured, machine-readable identity files are going to be the ones AI systems cite, reference, and recommend. The ones that don’t? They’ll be invisible to an increasingly large share of information queries.

This is the new robots.txt moment. Don’t be late to it.


Summary: The 10-Minute Checklist

  •  Create your llms.txt with site info, services, and LLM notes
  •  Upload to WordPress Media Library
  •  Copy to web root via CLI with sudo cp
  •  Set correct ownership: chown www-data:www-data
  •  Verify with curl -I yourdomain.com/llms.txt → 200 OK
  •  Update robots.txt to allow AI bots
  •  Verify with curl yourdomain.com/robots.txt

Done. Your site is now AI-crawler ready.

