AI Visibility Score: How to Measure and Improve It
Your site has a score that determines whether AI systems recommend you or your competitors. The problem is you have never been able to see it – until now. Botjar's AI Visibility Score gives you a single number that captures how well AI crawlers can find, understand, and cite your content.
What is AI Visibility Score
AI Visibility Score is a composite metric from 0 to 100 that measures how accessible, understandable, and citable your content is to AI crawlers and the systems they feed. It is not a ranking factor in the traditional SEO sense. It is a readiness score – how prepared your site is to be surfaced by ChatGPT, Claude, Perplexity, and the growing wave of AI-powered answer engines.
Think of it as your Lighthouse score for bots. Lighthouse tells you how well your site performs for human visitors. AI Visibility Score tells you how well it performs for the other 51% of your traffic.
The score is calculated per page and aggregated at the site level. Each page gets its own score because AI crawlers evaluate content at the page level – your homepage might score 82 while your product pages average 34. Knowing where the gaps are is the first step to closing them.
How Botjar calculates it
The score is built from six weighted factors. Each factor is independently measurable and independently improvable. Here is the breakdown.
| Factor | Weight | What it measures |
|---|---|---|
| Schema Coverage | 25% | Completeness and accuracy of structured data on the page |
| Crawl Access | 20% | Whether robots.txt allows AI crawlers and response codes are clean |
| Response Time | 15% | How fast the page responds to crawler requests (TTFB and full load) |
| Content Quality | 20% | Heading structure, content depth, readability, and topical authority signals |
| Internal Linking | 10% | How well the page connects to related content via crawlable links |
| Freshness | 10% | Recency of content updates and whether metadata reflects current state |
Each factor produces a sub-score from 0 to 100. The weighted sum gives you the final AI Visibility Score. A page scoring 70 or above is considered well-optimized. Below 40 means AI crawlers are likely misunderstanding or skipping your content entirely.
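To make the arithmetic concrete, take a hypothetical page with sub-scores of 80 for Schema Coverage, 100 for Crawl Access, 60 for Response Time, 70 for Content Quality, 50 for Internal Linking, and 40 for Freshness. The weighted sum is 0.25 × 80 + 0.20 × 100 + 0.15 × 60 + 0.20 × 70 + 0.10 × 50 + 0.10 × 40 = 72, comfortably above the 70 threshold despite the weak freshness signal.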
The weights are not arbitrary. They reflect the actual behavior patterns Botjar has observed across millions of crawl events. Schema Coverage carries the highest weight because it has the strongest correlation with AI citation rates. Freshness carries the lowest weight because it matters less for evergreen content but significantly more for product and news pages.
Factors that improve your score
Here is what moves each factor and the specific actions you can take.
Schema Coverage (25%)
Add complete, accurate JSON-LD structured data to every indexable page. Product pages need Product schema. FAQ sections need FAQPage schema. Your homepage needs Organization schema. Do not stop at the required fields – fill every relevant property.
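As an illustration, a minimal Product snippet might look like the sketch below, placed in a script tag with type="application/ld+json". The values are placeholders; adapt the property list to whatever your catalog actually supports.

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Trail Running Shoe",
  "description": "Lightweight trail shoe with a breathable mesh upper.",
  "sku": "TRS-042",
  "brand": { "@type": "Brand", "name": "Example Brand" },
  "offers": {
    "@type": "Offer",
    "price": "129.00",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock",
    "url": "https://www.example.com/products/trail-running-shoe"
  }
}
```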
Read our schema markup implementation guide for JSON-LD examples and common mistakes to avoid.
Crawl Access (20%)
Ensure your robots.txt does not block the AI crawlers you want indexing your content. GPTBot, ClaudeBot, and PerplexityBot each respect robots.txt directives. Blocking them is the most common reason for a zero AI Visibility Score.
Beyond robots.txt, verify that your pages return 200 status codes to crawler requests, not 403s or redirects. Some WAFs and CDNs inadvertently block bot traffic. Check our robots.txt configuration guide for details.
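As a reference point, a robots.txt that explicitly welcomes these three crawlers can be as simple as the sketch below; the Disallow paths are placeholders, so adjust them to your own private sections.

```
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: *
Disallow: /cart/
Disallow: /checkout/
```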
Response Time (15%)
AI crawlers have timeout thresholds. If your page takes more than five seconds to respond, some crawlers will abort the request and move on. Target a time-to-first-byte (TTFB) under 500ms. Full page load under two seconds. Server-side rendering helps because crawlers get complete content without waiting for JavaScript hydration.
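One quick way to spot-check TTFB from the command line is curl's built-in timing variables; replace the placeholder URL with the page you want to test.

```
curl -o /dev/null -s -w "TTFB: %{time_starttransfer}s\n" https://www.example.com/
```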
Content Quality (20%)
Use clear heading hierarchies (H1 through H3). Write at a depth that demonstrates expertise – thin content gets lower quality scores. Include specific facts, numbers, and actionable information. AI systems prefer content they can extract clean, factual statements from. Avoid walls of marketing copy with no substance.
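A well-structured page reads like an outline even before the body copy. The headings below are purely illustrative; the point is that each level nests cleanly under the one above it.

```html
<h1>Trail Running Shoes</h1>
  <h2>How to choose the right pair</h2>
    <h3>Drop and cushioning</h3>
    <h3>Grip and outsole</h3>
  <h2>Sizing and fit</h2>
```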
Internal Linking (10%)
Link related pages to each other with descriptive anchor text. Crawlers use internal links to discover content and understand topical relationships. A product page that links to its category page, size guide, and related products gives crawlers a richer map of your content.
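Descriptive anchor text matters as much as the link itself. A crawler learns far more about the destination from the first example below than from the second (the URL is a placeholder).

```html
<a href="/guides/trail-shoe-sizing">Trail running shoe size guide</a>
<a href="/guides/trail-shoe-sizing">Click here</a>
```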
Freshness (10%)
Keep your content and metadata current. Update dateModified in your schema when content changes. Ensure product availability and pricing are accurate in real time. Stale metadata is worse than no metadata – it teaches AI systems not to trust your data.
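In practice that means bumping dateModified in the page's JSON-LD every time the content meaningfully changes. The dates and headline below are placeholders.

```json
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "How to choose trail running shoes",
  "datePublished": "2024-09-12",
  "dateModified": "2025-05-30"
}
```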
Benchmarks by industry
AI Visibility Score benchmarks vary significantly by industry because each sector has different content structures, schema adoption rates, and technical infrastructure. Here is what Botjar observes across common verticals.
| Industry | Avg Score | Top 10% | Primary gap |
|---|---|---|---|
| Ecommerce | 38 | 74 | Product schema incomplete or missing |
| SaaS / Tech | 52 | 85 | JS-heavy pages with slow TTFB |
| Publishing / Media | 61 | 89 | Missing Article and Author schema |
| Local Services | 29 | 62 | No structured data at all |
| B2B / Enterprise | 44 | 78 | Gated content blocking crawlers |
The most important takeaway: no industry is doing this well yet. Ecommerce averages 38 out of 100. Local services average 29. This means the window of opportunity is wide open. Getting to a 70 puts you in the top tier of almost any industry.
Publishing and media score highest because they have the longest history of implementing structured data for Google News and AMP. Ecommerce scores low despite the massive financial incentive because product schema implementation on platforms like Shopify is often incomplete or generated by apps that cut corners.
How to track improvement over time
AI Visibility Score is not a set-and-forget metric. Your score changes as crawlers evolve, competitors improve, and your content ages. Here is how to build a sustainable tracking workflow.
Baseline your current state
Run a Botjar audit on your site to establish baseline scores for every indexable page. Export the results. This is your starting point. Without a baseline, you cannot measure improvement. Focus on your highest-traffic pages first – they are where AI recommendations will have the most revenue impact.
Prioritize by impact
Sort your pages by a combination of current traffic and current score. A high-traffic page with a low score is your biggest opportunity. A low-traffic page with a high score is already optimized. Botjar's actions dashboard automatically surfaces the highest-impact fixes in priority order.
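If you want to replicate that prioritization outside the dashboard, a rough heuristic is to weight each page's traffic by how far it sits below the 70-point target. A minimal sketch, assuming you have exported per-page traffic and scores (the sample data is illustrative):

```python
# Rough prioritization heuristic: high traffic + low score = biggest opportunity.
# Replace the sample data with your own export.
pages = [
    {"url": "/", "traffic": 12000, "score": 82},
    {"url": "/products/trail-shoe", "traffic": 8500, "score": 34},
    {"url": "/blog/fit-guide", "traffic": 900, "score": 71},
]

TARGET = 70  # the "well-optimized" threshold from the scoring model

for page in pages:
    # Pages already at or above the target contribute zero opportunity.
    page["opportunity"] = page["traffic"] * max(0, TARGET - page["score"])

for page in sorted(pages, key=lambda p: p["opportunity"], reverse=True):
    print(f'{page["url"]}: opportunity {page["opportunity"]}')
```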
Implement in sprints
Tackle one factor at a time. Week one: fix schema coverage across your top twenty pages. Week two: audit and update robots.txt configuration. Week three: optimize response times. This approach lets you isolate the impact of each change and avoid overwhelming your development team.
Monitor weekly
Set up weekly score checks in Botjar. The platform tracks score history per page and per factor, so you can see exactly which changes moved the needle. Watch for regressions – deployments that break schema, CDN changes that increase response times, or CMS updates that alter your robots.txt.
Correlate with outcomes
The ultimate metric is not the score itself but the outcomes it predicts: AI citation rates, referral traffic from AI platforms, and position in AI-generated recommendations. Botjar's analytics connect score improvements to measurable traffic changes so you can tie bot CRO work to revenue.
Get your AI Visibility Score
See your score per page, understand what is dragging it down, and get a prioritized fix list – all in under two minutes.
Get Your Free Bot Audit