Agentic Commerce · 8 min read

Preparing Your Store for AI-Powered Product Discovery

Botjar Team

The Discovery Shift Is Already Here

Product discovery is changing. Instead of typing keywords into Google, a growing number of shoppers describe what they need to an AI assistant and receive curated recommendations. "Find me a waterproof jacket for hiking in the Pacific Northwest, under $200" returns a personalized shortlist of specific products – no browsing required.

This shift means your products need to be discoverable by AI systems, not just search engines. The good news: preparing for AI-powered discovery is largely about executing fundamentals well. The bad news: most ecommerce stores are not executing the fundamentals well enough.

Step 1: Audit Your Current AI Visibility

Before optimizing, assess where you stand. Check these five areas:

Robots.txt Configuration

Open your robots.txt file and check whether major AI crawlers (GPTBot, ClaudeBot, PerplexityBot) are blocked. If they are, your products cannot be indexed for AI recommendations. This is the single most impactful check you can make.
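This check can be automated with Python's standard-library robots.txt parser. The sketch below (the function name, bot list, and sample file are illustrative) reports which major AI crawlers a given robots.txt blocks from a product URL:

```python
from urllib.robotparser import RobotFileParser

AI_BOTS = ["GPTBot", "ClaudeBot", "ChatGPT-User", "PerplexityBot"]

def blocked_ai_bots(robots_txt: str,
                    test_url: str = "https://example.com/products/widget") -> list[str]:
    """Return the AI crawler names that this robots.txt blocks from a product URL."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return [bot for bot in AI_BOTS if not parser.can_fetch(bot, test_url)]

# Illustrative robots.txt that blocks one AI crawler site-wide.
sample = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Disallow: /cart/
"""
print(blocked_ai_bots(sample))  # → ['GPTBot']
```

In practice you would fetch your live robots.txt (e.g. with `urllib.request`) and pass its text in, using one of your real product URLs as `test_url`.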

Schema Markup Coverage

Spot-check 10 product pages across different categories. How many have complete Product schema with name, description, price, availability, and reviews? If fewer than 8 out of 10, you have a coverage gap.
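A spot-check like this can be scripted. The sketch below (regex-based extraction and the required-field list are simplifying assumptions; a production check would use an HTML parser and handle `@graph` arrays) pulls JSON-LD out of a page and reports which required Product fields are missing:

```python
import json
import re

REQUIRED = {"name", "description", "offers", "aggregateRating"}

def missing_product_fields(html: str) -> set[str]:
    """Find a JSON-LD Product block in a page and report absent required fields.
    Returns all of REQUIRED if no Product schema exists at all."""
    pattern = r'<script type="application/ld\+json">(.*?)</script>'
    for block in re.findall(pattern, html, re.DOTALL):
        data = json.loads(block)
        if data.get("@type") == "Product":
            return REQUIRED - data.keys()
    return set(REQUIRED)

# Illustrative page with an incomplete Product schema.
page = """<html><head>
<script type="application/ld+json">
{"@context": "https://schema.org", "@type": "Product",
 "name": "Trail Jacket", "offers": {"@type": "Offer", "price": "149.00"}}
</script>
</head></html>"""
print(missing_product_fields(page))  # description and aggregateRating are missing
```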

Server Response to Bots

Test how your server responds to bot user agents. Some ecommerce platforms serve different content (or block entirely) based on user agent strings. Verify that AI crawlers receive the same content as Googlebot.

Content Depth

Read your top 5 product descriptions as if you could not see the images. Do they explain what the product is, who it is for, and why it is worth buying? If the description only makes sense alongside photos, it fails for AI crawlers that process text only.

Review Quality

Check your customer reviews. Are they genuine, detailed, and recent? AI assistants cite specific review feedback when making recommendations. Generic or sparse reviews reduce your recommendation probability.

Step 2: Fix Your Robots.txt

If you found AI crawlers are blocked in step 1, fix this first. It is the highest-impact change with the lowest effort:

  • Remove any Disallow: / rules for GPTBot, ClaudeBot, ChatGPT-User, and PerplexityBot
  • Allow access to product pages, category pages, and content pages
  • Keep blocking sensitive areas: cart, checkout, account, admin
  • Add your sitemap URL so crawlers can discover your full product catalog

Crawlers cache robots.txt, so after updating the file expect changes to take effect within roughly 24-48 hours as each crawler re-fetches it.
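Putting those rules together, a robots.txt might look like the following sketch (paths and the sitemap URL are placeholders for your own). Note that most crawlers obey only the most specific matching group, so the sensitive-area rules are repeated in the AI crawler group rather than relying on the `*` group:

```
# Allow AI crawlers, but keep sensitive areas off-limits
User-agent: GPTBot
User-agent: ClaudeBot
User-agent: ChatGPT-User
User-agent: PerplexityBot
Disallow: /cart/
Disallow: /checkout/
Disallow: /account/
Disallow: /admin/
Allow: /

# Same restrictions for everyone else
User-agent: *
Disallow: /cart/
Disallow: /checkout/
Disallow: /account/
Disallow: /admin/

Sitemap: https://www.example.com/sitemap.xml
```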

Step 3: Upgrade Your Schema Markup

Focus on your top products first – the 20% of products that drive 80% of revenue:

  • Ensure every product page has JSON-LD Product schema
  • Include at minimum: name, description, offers (price, currency, availability), aggregateRating, brand, image, and sku
  • Add individual review objects with author, rating, date, and review text
  • Include additionalProperty for key specifications (weight, dimensions, material, etc.)
  • Validate all schema with Google's Rich Results Test

Expanding schema across your full catalog can be done incrementally. Start with your top products and expand over time.

Step 4: Enrich Product Content

AI crawlers need substantive text to understand your products. For each priority product page:

  • Expand descriptions to 200+ words covering features, use cases, materials, and ideal customer profiles
  • Add comparison context – how does this product compare to previous versions or competitors?
  • Include use-case specifics – "ideal for trail running on wet surfaces" is more valuable than "great for outdoor activities"
  • Structure with headings – proper H2/H3 hierarchy helps crawlers parse content sections
  • Add FAQ content – common questions about the product with detailed answers, wrapped in FAQPage schema
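The FAQ item in the last bullet would be marked up like this sketch (the question and answer are illustrative):

```json
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "Is this jacket fully waterproof or just water-resistant?",
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "Fully waterproof: it has taped seams and a 20,000 mm hydrostatic head rating."
    }
  }]
}
```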

Step 5: Optimize Server Performance for Bots

AI crawlers evaluate your site's technical performance:

  • Target TTFB (time to first byte) under 200ms for bot requests
  • Verify bot user agents are not rate-limited unnecessarily by your CDN or WAF
  • Ensure product data is in the initial HTML – not loaded via JavaScript after page load
  • Return proper HTTP status codes – 200 for live products, 404 (or 410 Gone) for discontinued products, 301 for moved products
  • Keep server uptime above 99.9% – intermittent errors during crawling lead to crawl deprioritization
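TTFB for a bot user agent can be approximated with the standard library alone. The sketch below (the local demo server, port number, and user-agent string are assumptions for illustration) times how long a request takes to receive the response status line and headers; point it at your own host to measure real pages:

```python
import http.client
import http.server
import threading
import time

def measure_ttfb(host: str, port: int, path: str = "/",
                 user_agent: str = "GPTBot") -> float:
    """Approximate TTFB: seconds from sending the request until the
    status line and headers arrive."""
    conn = http.client.HTTPConnection(host, port, timeout=10)
    start = time.perf_counter()
    conn.request("GET", path, headers={"User-Agent": user_agent})
    resp = conn.getresponse()  # returns once status line + headers are read
    ttfb = time.perf_counter() - start
    resp.read()
    conn.close()
    return ttfb

# Demo against a throwaway local server (assumes port 8043 is free).
server = http.server.HTTPServer(("127.0.0.1", 8043),
                                http.server.SimpleHTTPRequestHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
print(f"TTFB: {measure_ttfb('127.0.0.1', 8043) * 1000:.1f} ms")
server.shutdown()
```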

Step 6: Set Up Monitoring

Optimization without monitoring is guesswork. Establish ongoing tracking for:

  • AI crawler activity: which crawlers visit, how often, which pages they access
  • AI referral traffic: traffic from chat.openai.com, claude.ai, perplexity.ai, and Google AI Overviews
  • AI Visibility Score: a per-page score reflecting how well each page is optimized for AI consumption
  • Schema validation: continuous checks for broken or invalid schema markup
  • Response time to bots: server performance specifically for bot user agents

Botjar provides all of these metrics in a single dashboard, updated in real time. Track your AI Visibility Score across every page and see exactly where to focus your optimization efforts.

The Preparation Checklist

Summarizing the action items in priority order:

  • Unblock AI crawlers in robots.txt (immediate impact)
  • Add or complete Product schema on your top 20 products (1-2 weeks)
  • Enrich product descriptions on priority pages (2-4 weeks)
  • Add FAQ schema to product and category pages (1-2 weeks)
  • Verify server performance for bot requests (1 day)
  • Set up monitoring with Botjar (10 minutes)

Start your AI discovery preparation today. Botjar gives you a complete AI readiness audit in 60 seconds – robots.txt analysis, schema coverage, crawler activity, and an AI Visibility Score for every page. Get your free bot audit →
