Here's a question that should concern every business owner: when someone asks ChatGPT for a recommendation in your industry and your city, does your business show up?
For most businesses, the answer is no. Not because you're not qualified. Not because you don't have great reviews or excellent service. But because AI literally cannot see your website. The crawlers that power ChatGPT, Claude, Perplexity, and every other AI platform are being blocked from reading your content — usually without your knowledge, by systems you set up to protect your site.
This isn't a marketing problem. It's a technical one. And it's costing businesses thousands of potential customers every month. The good news: it's fixable, often in a single afternoon. The bad news: every day you wait is a day your competitors are getting the AI recommendations that should be yours.
Think about how people find businesses today. Five years ago, it was Google. Two years ago, it was Google plus reviews on Yelp, Google Business Profile, and industry directories. Today, an increasing number of people are bypassing all of that and typing their questions directly into ChatGPT, Claude, Perplexity, Gemini, or Google AI Overviews.
The behavior shift is significant and accelerating. When someone asks AI "who's the best accountant near me" or "recommend a good restaurant in Austin for a business dinner," the AI doesn't show ten results. It names one to three businesses. The businesses that appear in those recommendations get calls. Everyone else is invisible — not ranked lower, not on page 2, but completely absent from the conversation.
This new field is called Generative Engine Optimization (GEO), and it's about to become as important as SEO was 15 years ago. The businesses that recognize this shift now have a massive first-mover advantage. The ones that wait will face entrenched competitors who've already built their AI visibility moat.
The biggest blocker, and the most frustrating because it happens silently, is your CDN or security layer. If your website uses Cloudflare, Sucuri, Wordfence, or any CDN/security service (and most modern websites do), there's a good chance the default settings are blocking AI crawlers entirely.
Cloudflare's "Bot Fight Mode" is the most common culprit. It was designed to block malicious bots — scrapers, DDoS attacks, credential stuffers. The problem is that it doesn't distinguish between malicious bots and the AI crawlers that power ChatGPT and Claude. When Bot Fight Mode is enabled, GPTBot gets a 403 Forbidden error. ClaudeBot gets a challenge page. PerplexityBot gets nothing. From AI's perspective, your business doesn't exist.
Your robots.txt file tells AI crawlers what they're allowed to read on your site. It's the first file every crawler checks. Most websites either don't have one (in which case crawlers generally assume everything is allowed), or have one that was auto-generated by WordPress or a website builder with overly restrictive rules.
The most dangerous pattern is a blanket block that was set up years ago and forgotten: "User-agent: *" followed by "Disallow: /" — which blocks everything, including AI crawlers. Another common problem is having explicit block rules for AI crawlers that were added by a security plugin without the site owner's knowledge.
Here's what a properly configured robots.txt looks like for AI visibility:
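A representative configuration is sketched below. The sitemap URL and the disallowed path are placeholders; adapt them to your site, and keep any existing rules your platform genuinely needs.

```
# Explicitly allow the major AI crawlers
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: Google-Extended
Allow: /

# Default policy for all other crawlers
User-agent: *
Allow: /
Disallow: /wp-admin/

Sitemap: https://yoursite.com/sitemap.xml
```

The key is the explicit Allow rules for each AI user agent: they override broader restrictions and remove any ambiguity about whether those crawlers are welcome.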
There's a new proposed standard file called llms.txt that acts as a README specifically for AI. Think of it as your business's resume, written in a format that language models are designed to read. It tells ChatGPT, Claude, and other AI systems exactly what your business does, who you serve, where you operate, and what makes you credible.
Without llms.txt, AI has to guess what your business does based on whatever fragments of your website it can piece together from raw HTML. With it, you're giving AI a clear, authoritative, structured description that it can use when deciding whether to recommend you.
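A minimal sketch of what an llms.txt might look like, using a hypothetical business (every name, URL, and detail below is a placeholder). The proposed format is plain markdown: a title, a short blockquote summary, then sections of annotated links.

```
# Acme Accounting

> Acme Accounting is a CPA firm serving small businesses in Austin, TX,
> specializing in tax preparation, bookkeeping, and payroll.

## Services
- [Tax Preparation](https://yoursite.com/tax-prep): Individual and business returns
- [Bookkeeping](https://yoursite.com/bookkeeping): Monthly bookkeeping for small businesses

## About
- [Credentials](https://yoursite.com/about): Licensed CPAs, serving Austin since 2010
```

The file lives at the root of your domain (yoursite.com/llms.txt), alongside robots.txt.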
Almost no businesses have llms.txt yet — over 97% of websites lack one. This means the businesses that add it first get a disproportionate visibility advantage. It's the lowest-effort, highest-impact GEO improvement available today.
Schema markup — specifically JSON-LD structured data — is how you tell AI systems machine-readable facts about your business: your name, address, services, hours, credentials, service area, and more. AI platforms heavily weight this structured data when deciding who to recommend because it provides facts with certainty, not interpretations from parsing messy HTML.
Most websites either have no schema markup at all, or have the basic auto-generated schema from Yoast or RankMath that only covers business name and address. That's not enough for competitive AI visibility. GEO-optimized schema includes fields like speakable (content AI should quote), audience (who you serve), areaServed (geographic coverage), and knowsAbout (your expertise topics) that AI specifically looks for when assembling recommendations.
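A hedged sketch of what GEO-oriented JSON-LD might look like for a hypothetical local business — all names, addresses, and values are placeholders, and you should validate your real markup against schema.org before deploying:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Acme Accounting",
  "url": "https://yoursite.com",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Austin",
    "addressRegion": "TX",
    "postalCode": "78701"
  },
  "areaServed": { "@type": "City", "name": "Austin" },
  "audience": { "@type": "Audience", "audienceType": "Small business owners" },
  "knowsAbout": ["Tax preparation", "Bookkeeping", "Payroll"],
  "openingHours": "Mo-Fr 09:00-17:00"
}
</script>
```

Note that the speakable property attaches to page-level types like WebPage or Article rather than LocalBusiness, so it typically goes in a separate block on the pages whose content you want AI to quote.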
AI crawlers can download JavaScript files but cannot execute them. This is a critical distinction. If your website is built with React, Vue, Angular, or any JavaScript framework — and the content isn't server-side rendered — AI crawlers may see nothing but an empty <div id="app"></div> tag. Your beautiful website, from AI's perspective, is a blank page.
This is increasingly common with modern website builders and templates. The site looks gorgeous to humans (because their browser executes the JavaScript), but it's literally empty to AI. Pages with JavaScript rendering dependencies may have only 70-75% of their content visible to AI crawlers — meaning nearly a third of what you've written is invisible.
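A simplified illustration of the difference. The first snippet is everything a non-executing crawler sees on a client-rendered page; the second is the same page with server-side rendering, where the content exists in the HTML itself:

```html
<!-- Client-rendered page: this is ALL a non-executing crawler sees -->
<body>
  <div id="app"></div>
  <script src="/bundle.js"></script>
</body>

<!-- Server-rendered page: the same content, visible without JavaScript -->
<body>
  <div id="app">
    <h1>Acme Accounting, Austin CPA Firm</h1>
    <p>Tax preparation and bookkeeping for small businesses.</p>
  </div>
  <script src="/bundle.js"></script>
</body>
```

A quick sanity check: view your page's source (not the browser's rendered inspector) and search for a sentence from your homepage. If it isn't there, AI crawlers can't see it either.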
The fastest way: run a free scan at Faneros. In 60 seconds, you get a 9-point AI visibility audit that checks everything above and more.
The manual way: open your browser, go to yoursite.com/robots.txt, and look for lines that mention GPTBot, ClaudeBot, or PerplexityBot. If you see Disallow: / next to any of them, that AI platform cannot see your site. But this only checks one of the five blockers — which is why automated auditing exists.
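If you'd rather test robots.txt rules programmatically, Python's standard library can evaluate them for any user agent. This sketch parses a sample policy (the rules shown are illustrative — paste in your site's actual robots.txt content):

```python
from urllib.robotparser import RobotFileParser

# Sample robots.txt content -- replace with your site's actual file
robots_txt = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Allow: /
"""

def crawler_allowed(robots_content: str, user_agent: str, path: str = "/") -> bool:
    """Return True if the given crawler may fetch the path under these rules."""
    rp = RobotFileParser()
    rp.parse(robots_content.splitlines())
    return rp.can_fetch(user_agent, path)

for bot in ("GPTBot", "ClaudeBot", "PerplexityBot"):
    status = "allowed" if crawler_allowed(robots_txt, bot) else "BLOCKED"
    print(f"{bot}: {status}")
```

With the sample rules above, GPTBot reports BLOCKED (it has its own Disallow group) while ClaudeBot and PerplexityBot fall through to the permissive wildcard rule.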
Faneros scans 7 AI platforms in 60 seconds. Find out if ChatGPT, Claude, and Perplexity can see your business.
Scan My Site →

If you run the audit and score below a B, here's what to do, in order of impact. Each step builds on the previous one. Don't skip ahead.
This is the single highest-impact fix. Check your CDN settings (especially Cloudflare Bot Fight Mode), update your robots.txt to explicitly allow GPTBot, ClaudeBot, PerplexityBot, and other AI crawlers, and verify access. This fix alone can take you from F to C. If AI can't reach your site, nothing else you do matters.
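One rough way to verify access is to request your homepage while identifying as each AI crawler and see what status code comes back. A sketch, with caveats: the user-agent strings below are simplified placeholders (real crawler UAs differ), and some CDNs also verify crawlers by IP range, so a 200 here isn't a guarantee — but a 403 is a strong signal of bot blocking.

```python
import urllib.request
import urllib.error

# Simplified stand-ins for the real crawler user-agent strings
AI_CRAWLERS = {
    "GPTBot": "Mozilla/5.0 (compatible; GPTBot/1.0)",
    "ClaudeBot": "Mozilla/5.0 (compatible; ClaudeBot/1.0)",
    "PerplexityBot": "Mozilla/5.0 (compatible; PerplexityBot/1.0)",
}

def classify(status: int) -> str:
    """Interpret an HTTP status code from a crawler's point of view."""
    if status == 403:
        return "blocked (likely CDN/bot protection)"
    if status in (200, 301, 302):
        return "reachable"
    return f"check manually (HTTP {status})"

def check_site(url: str) -> None:
    for name, ua in AI_CRAWLERS.items():
        req = urllib.request.Request(url, headers={"User-Agent": ua})
        try:
            with urllib.request.urlopen(req, timeout=10) as resp:
                status = resp.status
        except urllib.error.HTTPError as e:
            status = e.code
        print(f"{name}: {classify(status)}")

if __name__ == "__main__":
    check_site("https://yoursite.com/")  # replace with your domain
```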
Create an llms.txt file that describes your business in detail and deploy GEO-optimized JSON-LD schema on every important page. These two files are how you tell AI "here's exactly what my business does, where we operate, who we serve, and why you should recommend us." Faneros generates both as part of its 18-deliverable output.
Add missing security headers (HSTS, CSP, X-Frame-Options). Create or fix your XML sitemap with proper priority scoring. Optimize meta tags for AI extraction. If your site has JavaScript rendering issues, talk to your developer about server-side rendering or static HTML fallbacks for critical content pages.
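To audit the headers part of this step, a small helper like the following (a sketch — pair it with any HTTP client that returns response headers as a dict) flags which of the recommended headers are absent:

```python
def missing_security_headers(headers: dict) -> list:
    """Return the recommended security headers absent from a response."""
    recommended = [
        "Strict-Transport-Security",  # HSTS
        "Content-Security-Policy",
        "X-Frame-Options",
    ]
    # Header names are case-insensitive, so compare in lowercase
    present = {k.lower() for k in headers}
    return [h for h in recommended if h.lower() not in present]

# Example: a response that only sets HSTS
example = {"Strict-Transport-Security": "max-age=31536000", "Content-Type": "text/html"}
print(missing_security_headers(example))
# -> ['Content-Security-Policy', 'X-Frame-Options']
```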
Once the technical foundation is solid, start publishing content that AI platforms will reference: FAQ pages with detailed, authoritative answers to the questions people ask AI about your industry. Service pages with local specificity and clear credentials. Blog posts targeting the conversational queries where you're currently invisible. This is the ongoing investment that compounds over time.
GEO is where SEO was in 2010. The competitive landscape is wide open. The businesses that figure this out now — while their competitors are still focused exclusively on Google rankings — will build an AI visibility moat that's extremely hard to overcome once established.
Here's why first-mover advantage matters so much in GEO: AI platforms learn from patterns. The businesses that are visible early get recommended, which generates engagement signals, which makes AI recommend them more frequently, which builds a feedback loop that late entrants have to fight against. The gap between early movers and late followers will be measured in years of compounding advantage.
Every day your business stays invisible to AI is a day your competitors are building that moat. The good news: most businesses haven't started yet. The window is open. But it's closing faster than most people realize.
AI visibility monitoring from $99/month. 18 ready-to-deploy deliverables. 7 platforms. No contracts.
See Plans & Pricing →