GEO Case Studies

Real GEO infrastructure implementations with verified metrics. Every number on this page comes from production telemetry, not projections.

CASE STUDY Completed April 2026

Top10Lists.us — Professional Directory Platform

Top10Lists.us is a professional directory platform listing top-rated service providers across US metro areas. The platform helps users find vetted professionals in categories spanning legal, financial, home services, healthcare, and more — organized by city and specialty.

The Challenge

Top10Lists.us was completely invisible to AI systems. The site was built as a JavaScript single-page application (React SPA), which meant AI crawlers saw nothing but an empty <div id="root"></div> when they visited. The specific problems:

  • JavaScript SPA rendering — bots could not see any content
  • No structured data (JSON-LD, microdata, or RDFa) on any page
  • No AI-specific content files (no llms.txt, no ai-content-index.json, no MCP manifest)
  • AI crawlers explicitly blocked in robots.txt
  • Zero bot crawl telemetry — no way to measure AI visibility

When users asked ChatGPT, Claude, or Perplexity about top professionals in any city, Top10Lists.us was never mentioned. The platform had valuable, curated content that AI systems simply could not access.

The Solution: Full 8-Signal GEO Buildout

Geogroup implemented a complete GEO infrastructure overhaul using the same architecture deployed for all clients — a CDN edge layer routing bot traffic to Supabase edge functions serving clean-room HTML.

Clean-Room HTML Rendering

Implemented an edge proxy that detects AI crawler user-agents and routes them to Supabase edge functions instead of the React SPA. These edge functions serve 49 bot-facing pages as pure semantic HTML — identical content to the SPA, zero JavaScript dependencies. Human visitors continue to use the React application unchanged.
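
For illustration, the routing decision can be sketched as a small fetch handler at the edge. Everything below is a sketch under assumptions: the crawler list, the bots.example.com origin, and the handler shape are placeholders, not Geogroup's production proxy.

```ts
// Sketch only: route known AI crawler user-agents to a clean-room HTML origin,
// and let human traffic fall through to the React SPA unchanged.
// The crawler list and "bots.example.com" origin are illustrative assumptions.
const AI_CRAWLERS = [
  "GPTBot",
  "ClaudeBot",
  "PerplexityBot",
  "Google-Extended",
  "Bingbot",
  "Applebot",
  "Amazonbot",
  "Meta-ExternalAgent",
];

function isAiCrawler(userAgent: string): boolean {
  return AI_CRAWLERS.some((bot) => userAgent.includes(bot));
}

export default {
  async fetch(request: Request): Promise<Response> {
    const ua = request.headers.get("user-agent") ?? "";
    const url = new URL(request.url);

    if (isAiCrawler(ua)) {
      // Bots get the pre-built semantic HTML served by the edge functions.
      const botOrigin = new URL(url.pathname + url.search, "https://bots.example.com");
      return fetch(botOrigin.toString(), {
        headers: { "x-forwarded-user-agent": ua },
      });
    }

    // Humans fall through to the origin serving the React SPA.
    return fetch(request);
  },
};
```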

Structured Data on Every Page

Added JSON-LD structured data to all 49 pages: LocalBusiness schema for provider listings, Organization schema for the platform itself, WebSite schema with SearchAction for sitelinks, and domain-specific markup for each service category. Every page provides machine-readable context about its content.
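
To illustrate what that markup looks like on a listing page, here is a minimal LocalBusiness block with placeholder values; the live pages carry fuller, listing-specific data.

```html
<!-- Illustrative only: every value below is a placeholder, not actual listing data. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Family Law Firm",
  "url": "https://www.example.com",
  "address": {
    "@type": "PostalAddress",
    "addressLocality": "Austin",
    "addressRegion": "TX",
    "addressCountry": "US"
  },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.8",
    "reviewCount": "120"
  }
}
</script>
```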

AI Surface Files

Published the complete AI surface layer: llms.txt describing the platform's content and structure, ai-content-index.json providing a machine-readable content inventory, .well-known/mcp.json declaring available data endpoints, and an updated sitemap.xml with accurate lastmod timestamps for all 49 pages.
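
As a format illustration only, an abbreviated llms.txt might look like the sketch below; the section names, page paths, and descriptions are placeholders rather than the contents of the live file.

```
# Top10Lists.us

> Curated top-10 lists of vetted service providers across US metro areas,
> organized by city and specialty: legal, financial, home services, healthcare, and more.

## Listings (placeholder paths)
- [Top 10 Family Lawyers in Austin, TX](https://top10lists.us/austin/family-lawyers): curated, ranked provider list
- [Top 10 Financial Advisors in Denver, CO](https://top10lists.us/denver/financial-advisors): curated, ranked provider list

## Machine-readable surfaces
- [ai-content-index.json](https://top10lists.us/ai-content-index.json): content inventory for AI systems
- [sitemap.xml](https://top10lists.us/sitemap.xml): all bot-facing pages with lastmod timestamps
```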

Robots.txt Overhaul

Replaced the restrictive robots.txt with explicit Allow directives for all major AI crawlers: GPTBot, ClaudeBot, PerplexityBot, Google-Extended, Bingbot, Applebot, Amazonbot, Meta-ExternalAgent, and others. Crawl-delay directives ensure sustainable access without rate-limiting issues.
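
An excerpt of that kind of robots.txt is sketched below; the crawl-delay value and sitemap URL are illustrative assumptions, and the deployed file covers additional crawlers.

```
# Illustrative excerpt, not the production file.
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /
Crawl-delay: 2

User-agent: Google-Extended
Allow: /

Sitemap: https://top10lists.us/sitemap.xml
```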

Edge Function Performance

Achieved sub-200ms time-to-first-byte (TTFB) across all bot-facing pages by serving HTML from Supabase edge functions at the CDN layer. No server-side rendering pipeline, no database queries at request time — just pre-built HTML templates with edge-level caching.
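
A minimal sketch of that serving pattern, assuming a Deno-based edge function with placeholder paths, content, and cache TTLs:

```ts
// Sketch only: pre-built HTML is held in memory and returned with cache headers,
// so most bot requests become CDN cache HITs and the rest never touch a database.
// Paths, page content, and TTL values are illustrative assumptions.
const PAGES = new Map<string, string>([
  [
    "/austin/family-lawyers",
    "<!doctype html><html><head><title>Top 10 Family Lawyers in Austin, TX</title></head><body><main><h1>Top 10 Family Lawyers in Austin, TX</h1></main></body></html>",
  ],
]);

Deno.serve((req: Request): Response => {
  const path = new URL(req.url).pathname;
  const html = PAGES.get(path);
  if (!html) {
    return new Response("Not found", { status: 404 });
  }
  return new Response(html, {
    headers: {
      "content-type": "text/html; charset=utf-8",
      // Let the CDN serve cached copies so repeat crawls never re-invoke the function.
      "cache-control": "public, max-age=3600, s-maxage=86400",
    },
  });
});
```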

Bot Crawl Telemetry

Deployed middleware-level bot crawl telemetry with 100% capture rate, including CDN cache HITs that server-side logs miss entirely. Every bot visit is logged with crawler identity, page path, full user-agent string, and timestamp. Hourly aggregation via increment_bot_crawl RPC with daily rollups for trend analysis.
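
A sketch of what that logging path could look like with supabase-js is shown below. The increment_bot_crawl RPC name comes from this case study; the table name, column names, RPC parameters, and environment variable names are assumptions added for the sketch.

```ts
// Sketch only: log every bot visit (crawler identity, path, full user-agent, timestamp)
// and bump an hourly counter via the increment_bot_crawl RPC.
// Table, column, parameter, and env var names are illustrative assumptions.
import { createClient } from "https://esm.sh/@supabase/supabase-js@2";

const supabase = createClient(
  Deno.env.get("SUPABASE_URL")!,
  Deno.env.get("SUPABASE_SERVICE_ROLE_KEY")!,
);

export async function logBotCrawl(req: Request, crawler: string): Promise<void> {
  const path = new URL(req.url).pathname;

  // Raw event row for per-visit detail.
  await supabase.from("bot_crawl_events").insert({
    crawler,
    path,
    user_agent: req.headers.get("user-agent") ?? "",
    crawled_at: new Date().toISOString(),
  });

  // Hourly aggregate used for daily rollups and trend analysis.
  await supabase.rpc("increment_bot_crawl", {
    p_crawler: crawler,
    p_path: path,
  });
}
```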

Verified Results

  • GEO Composite Score (out of 100): 97.15
  • Bot crawls per month: 2M+
  • Human-initiated retrievals (30 days): 90K+
  • Domain age at time of measurement: 4.5 months
  • Measurable GEO signals passing: 7 / 7
  • Bot-facing pages deployed: 49

4.5 Months Old

Top10Lists.us was a brand-new domain with no backlink history, no domain authority score, and no years of SEO compounding. Traditional SEO doctrine says authority takes years to build. GEO infrastructure built citation authority in weeks — because AI systems evaluate technical signals and content quality, not domain age. The 90,000+ human-initiated retrievals in a single month prove that AI systems are actively choosing this site as a trusted source, even though the domain is less than five months old.

Signal-by-Signal Results

Signal 1: Structured Data (JSON-LD) PASS
Signal 2: Crawlability (Clean-Room HTML) PASS
Signal 3: Bot Crawl Activity PASS
Signal 4: Content Authority PASS
Signal 5: Citation Presence PASS
Signal 6: Performance (TTFB) PASS
Signal 7: AI Surface Files PASS
Signal 8: Protocol Support (HTTP/3) N/A

AI Crawler Activity

  • Total bot crawl volume: 2 million+ crawls per month across all AI and search crawlers
  • Human-initiated AI retrievals: 90,000+ in a single 30-day period — real users asked AI systems questions and the AI fetched Top10Lists.us to answer them
  • ClaudeBot (Anthropic): 239,000+ crawls per month — the most active AI crawler on the site
  • GPTBot (OpenAI): Active crawling across all 49 bot-facing pages
  • PerplexityBot: Regular indexing with consistent weekly crawl patterns
  • Googlebot / Google-Extended: Standard crawl frequency maintained

Citation Gravity

The 90,000+ human-initiated retrievals demonstrate what we call citation gravity — a self-reinforcing cycle where GEO infrastructure creates the conditions for AI systems to crawl more frequently, index more deeply, and cite more confidently. When AI discovers that your content is clean, structured, and consistently available, it returns more often. More returns mean more citations. More citations mean more user-initiated retrievals, which signal to AI systems that your content is worth prioritizing.

Most sites get zero human-initiated AI retrievals. Not a low number — zero. They have no citation gravity because there is no GEO infrastructure for AI to discover. Content optimization alone cannot create this flywheel. You cannot write your way into a gravity well. You have to build it.

AI Citation Verification

Top10Lists.us is now actively cited by ChatGPT, Claude, and Perplexity when users ask about top-rated professionals in the metro areas the platform covers. The site went from zero AI visibility to being a recommended source across all three major AI assistants. The citation gravity flywheel is measurable in real time through the bot crawl telemetry dashboard.

Timeline

The full GEO buildout was completed in four weeks. Week one covered audit and architecture planning. Weeks two and three focused on clean-room HTML implementation, structured data deployment, and AI surface file creation. Week four covered telemetry deployment, validation testing, and initial monitoring setup.

Architecture

Top10Lists.us runs on the same infrastructure Geogroup deploys for all clients: edge proxy for bot detection, Supabase edge functions for clean-room HTML rendering, middleware-level bot crawl telemetry, and automated daily GEO scoring. This architecture is proven, repeatable, and scales to any site size.

CASE STUDY Completed March 2026

Food Industry Leader — National Restaurant & Hospitality Brand

A nationally recognized food industry brand with an established web presence and strong traditional SEO rankings. Despite high domain authority and significant organic traffic, the brand was virtually invisible to AI systems — scoring just 12 out of 100 on the GEO composite framework.

The Challenge

This brand had strong SEO performance that masked complete AI invisibility. Leadership assumed that high organic rankings meant AI systems could also find and cite them; they could not. The specific problems:

  • Strong SEO performance masking complete AI invisibility
  • JavaScript-rendered content invisible to AI crawlers
  • No AI-specific infrastructure (no llms.txt, no AI content feeds, no MCP)
  • Robots.txt blocking major AI crawlers
  • Leadership assumed SEO success meant AI visibility — it did not

The Solution: Full 8-Signal GEO Buildout

Geogroup deployed the same proven GEO infrastructure pattern used across all client engagements.

Clean-Room HTML Edge Functions

Implemented clean-room HTML edge functions serving semantic content to AI crawlers. Bot user-agents are detected at the edge and routed to pre-built HTML pages with full semantic markup — while human visitors continue using the existing website unchanged.

Comprehensive Structured Data

Deployed JSON-LD structured data across all bot-facing pages: Restaurant, FoodEstablishment, Menu, and LocalBusiness schemas providing machine-readable context about every location, menu offering, and service detail.
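
For illustration, a single location page might combine those types roughly as sketched below; every value shown is a placeholder rather than client data.

```html
<!-- Illustrative only: all values are placeholders, not the client's data. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Restaurant",
  "name": "Example Restaurant (Downtown)",
  "servesCuisine": "American",
  "address": {
    "@type": "PostalAddress",
    "addressLocality": "Chicago",
    "addressRegion": "IL",
    "addressCountry": "US"
  },
  "hasMenu": {
    "@type": "Menu",
    "hasMenuSection": {
      "@type": "MenuSection",
      "name": "Entrees",
      "hasMenuItem": {
        "@type": "MenuItem",
        "name": "Example Entree",
        "offers": { "@type": "Offer", "price": "14.00", "priceCurrency": "USD" }
      }
    }
  }
}
</script>
```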

AI Surface Files

Published the complete AI surface layer: llms.txt describing the brand's content and structure, ai-content-index.json providing a machine-readable content inventory, and .well-known/mcp.json declaring available data endpoints.

Robots.txt Overhaul

Replaced the restrictive robots.txt with explicit Allow directives welcoming all major AI crawlers: GPTBot, ClaudeBot, PerplexityBot, Google-Extended, and others.

Bot Crawl Telemetry

Deployed bot crawl telemetry for ongoing measurement of AI crawler activity, providing real-time visibility into which AI systems are accessing the content and how frequently.

Zero Impact on Human Experience

Every change was invisible to human visitors. The brand's website — its design, functionality, user experience, and conversion flows — remained completely untouched. The GEO infrastructure runs as a parallel layer that only AI systems interact with.

Verified Results

  • GEO signals passing: 7 / 8
  • Time from start to finish: 6 days
  • Client team involvement: Minimal
  • Visual changes to existing site: 0 px

Six Days, Minimal Disruption

The entire GEO buildout — from initial audit to fully deployed infrastructure with all seven passing signals — was completed in six calendar days. The client's team involvement was limited to initial Q&A about their business and content priorities. No code review cycles. No deployment coordination. No staging environment provisioning. The GEO infrastructure was built as a parallel layer alongside their existing site, invisible to their customers.

⚠️ Post-Handoff Lesson

After handoff, the client's internal team made modifications to the site without understanding the GEO signal framework. Those changes reduced their composite score. This is why we offer Ongoing GEO Management — GEO infrastructure is precise, and well-intentioned changes made without signal awareness can undo the work. The architecture we delivered was scoring 7/8 at handoff. Maintaining that score requires understanding what each signal depends on.

What This Proves

Two properties. Two completely different industries. The same 8-signal framework, the same infrastructure pattern, the same measurable results. A professional directory — a brand-new domain, less than five months old — went from invisible to a 97+ composite with 2 million crawls per month and 90,000+ human-initiated AI retrievals. A food industry leader had its entire GEO infrastructure deployed in six days with minimal client team involvement. In both cases, human visitors noticed nothing — because there was nothing to notice. The GEO layer is invisible to everyone except AI systems.

This is the difference between GEO infrastructure and GEO theater. Other firms promise AI visibility. We deliver it, measure it, and publish the receipts.

More Case Studies Coming

We publish case studies only when we can back every number with production telemetry. No projections. No estimates. No "expected improvements." Just verified data from live systems.

Start your GEO engagement →   |   View our services →