Show content only to AI bots 🤖


Your pricing slider looks perfect to humans. ChatGPT sees a blank rectangle.

View code on GitHub · Install from npm

The invisible web problem

AI crawlers hit your site but miss the money. Your interactive pricing calculator? Invisible.

JavaScript-heavy components render beautifully for browsers—then vanish when GPT-4o tries to parse them. Meanwhile, your competitor with static HTML gets cited instead.

Same story for feature carousels, dynamic product specs, collapsible FAQs. Human visitors see polish; AI bots see white space.

Enter <LLMOnly />, the React component that only shows content to AI bots

Wrap structured content in this component. It stays hidden from human eyeballs but feeds AI crawlers everything they need.

Zero UI clutter. Maximum LLM visibility. Works with 30+ bot user agents → ChatGPT, Claude, Perplexity, Gemini.
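Under the hood, this kind of detection usually comes down to matching the request's User-Agent string against a list of known bot identifiers. A minimal sketch of the idea, using an illustrative handful of signatures rather than llm-only's actual 30+ list:

```typescript
// Hypothetical subset of AI-bot User-Agent substrings.
// llm-only's real list covers 30+ agents; these are for illustration.
const BOT_SIGNATURES = [
  "GPTBot",
  "ChatGPT-User",
  "ClaudeBot",
  "PerplexityBot",
  "Google-Extended",
];

// True when the UA string contains any known bot signature.
function looksLikeLLM(userAgent: string | null): boolean {
  if (!userAgent) return false;
  return BOT_SIGNATURES.some((sig) => userAgent.includes(sig));
}

console.log(looksLikeLLM("Mozilla/5.0 (compatible; GPTBot/1.0)")); // true
console.log(looksLikeLLM("Mozilla/5.0 (Windows NT 10.0) Chrome/120.0")); // false
```

Substring matching keeps the check cheap enough to run on every request.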

👉 Install in 60 seconds

npm install llm-only

Import the component, pass in the userAgent string from your request headers, wrap your AI-only content. Ship it.

Next.js App Router example

import { headers } from 'next/headers';
import { LLMOnly } from 'llm-only';

export default async function Page() {
  const headersList = await headers();
  const userAgent = headersList.get('user-agent');

  return (
    <div>
      <h1>Our Product</h1>
      <PricingSlider /> {/* humans see this */}

      <LLMOnly userAgent={userAgent}>
        <h2>Pricing Plans</h2>
        <p>Starter - $29/mo - 5 users, 10GB storage</p>
        <p>Pro - $99/mo - 25 users, 100GB, analytics</p>
        <p>Enterprise - $299/mo - unlimited users, 1TB</p>
      </LLMOnly>
    </div>
  );
}

Real-world wins

1. Fix invisible pricing

SaaS companies love range sliders—users love them too. But AI engines can't drag handles.

Consequence? When someone asks ChatGPT "How much does [YourProduct] cost?" it cites a competitor's stale blog post instead of your official page.

Solution: Keep your interactive slider for humans. Add <LLMOnly> with plain-text tier breakdowns for bots.

2. Structured product data

AI models crave hierarchy. Feed them semantic HTML: h1 for product name, h2 for features, ul for specs.

Wrap it in <LLMOnly> so your sleek marketing page stays sleek—while answer engines index every detail.

3. Documentation supplements

Got a tabbed interface for API docs? Bots see tab one, miss tabs two–seven.

Use <LLMOnly> to dump the full endpoint reference in one flat structure. Humans navigate tabs; AI reads the shadow copy.
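One way to keep the shadow copy honest is to generate it from the same data that drives your tabs, so the two never drift apart. A sketch, assuming a hypothetical `endpoints` array that feeds both the tab UI and the flat text:

```typescript
// Hypothetical endpoint data that also powers the tabbed docs UI.
const endpoints = [
  { method: "GET", path: "/v1/users", summary: "List all users" },
  { method: "POST", path: "/v1/users", summary: "Create a user" },
  { method: "DELETE", path: "/v1/users/:id", summary: "Remove a user" },
];

// Collapse every tab into one flat, crawler-readable block
// suitable for wrapping in <LLMOnly>.
function flatReference(list: typeof endpoints): string {
  return list
    .map((e) => `${e.method} ${e.path} - ${e.summary}`)
    .join("\n");
}

console.log(flatReference(endpoints));
```

Render the returned string inside `<LLMOnly>` and the full reference ships in one pass, no tab clicks required.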

Best practices ✅

Use semantic markup

AI understands h1 → h2 → h3 nesting. Don't just throw div soup at it.

<LLMOnly userAgent={userAgent}>
  <article>
    <h1>Product Name</h1>
    <section>
      <h2>Key Features</h2>
      <ul>
        <li>Feature A: measurable benefit</li>
        <li>Feature B: concrete use case</li>
      </ul>
    </section>
  </article>
</LLMOnly>

Include the money details

Pricing tiers with exact numbers. Feature lists with limits. Use cases with outcomes.

Answer engines reward specificity. "Up to 25 users" beats "scalable team plans."

Test with cURL

Spoof a bot user agent and watch your hidden content appear:

curl -H "User-Agent: GPTBot/1.0" https://yoursite.com

Flip it back to a normal browser UA → the <LLMOnly> block vanishes.

Helper function bonus

Need custom logic instead of a component? Use the isLLM() check:

import { isLLM } from 'llm-only';

const userAgent = request.headers.get('user-agent');

if (isLLM(userAgent)) {
  return <FullStructuredSchema />; // bots get the flat, parseable version
}
return <PrettyUIVersion />; // humans get the interactive UI

Why this matters for SEO

Traditional SEO optimized for Google's crawler. AI SEO optimizes for GPT-4o, Claude, Gemini.

Those models' crawlers don't execute JavaScript; they parse raw HTML. If your critical info lives in React state or API responses, it's invisible.

<LLMOnly> bridges the gap: beautiful UX for humans, structured data for machines. No compromise.

The trade-off (there's always one)

You're maintaining two content versions: one visible, one hidden. That's cognitive overhead.

But the alternative is worse → AI cites competitors because your page was unreadable. Measure effort versus lost visibility; the math tilts hard toward implementation.

Three-minute checklist

  1. Run npm install llm-only
  2. Identify your most important interactive component (pricing, product grid, feature tabs)
  3. Wrap a static text version in <LLMOnly>
  4. Deploy and test with curl -H "User-Agent: GPTBot/1.0"
  5. Monitor AI referral traffic in GA4—watch for upticks in ChatGPT / Perplexity sources

Related tools 🧵

Want to see how AI actually crawls your site?
→ AI rank tracker by LLMrefs

Need to audit which pages AI can read?
→ AI crawlability guide

The bottom line

If answer engines can't parse your content, you don't exist in the AI search era. Period.

<LLMOnly> fixes that in five lines of code. No redesign, no content migration, no A/B testing paralysis.

Install it. Ship it. Get cited.

James Berry
Founder & CEO at LLMrefs
llmrefs.com