Your product pages are built for humans. A growing share of your buyers will never see them.
AI agents are already visiting ecommerce sites, reading product data, and shaping buying decisions before a human ever opens a browser. They don’t render your hero banners. They don’t follow your carefully designed UX flows. They parse your data, judge its quality, and decide whether to recommend you.
Most commerce teams are still optimising for human traffic. That’s the gap.
Agentic commerce is the shift from human-browsed buying to AI-mediated product discovery, where autonomous agents research, compare, and recommend products before a buyer ever visits your site.
Key takeaway: The data layer is now a competitive surface. If your competitors have cleaner, more structured product data, their products get recommended by AI agents. Yours don’t.
Agentic commerce is AI-driven buying where autonomous agents research, compare, and recommend products on behalf of a human buyer. Instead of a customer browsing your site, an AI agent evaluates your product data, checks it against competing options, and makes (or shapes) the purchase decision upstream.
This is different from chatbots or recommendation widgets. Those live on your site, inside your control. Agentic commerce happens before your site, often without your knowledge. The buyer’s first interaction with your catalogue might be an AI system reading your structured data and deciding whether you make the shortlist.
Every major interface shift in ecommerce – desktop to mobile, mobile to social, social to marketplace – changed where and how customers engaged. This one is different. The customer journey now starts before any human interaction, mediated by AI agents that curate, filter, and recommend products upstream.
The numbers back this up. Meta’s ExternalAgent – the crawler powering AI experiences across Facebook, Instagram, and WhatsApp – grew its share of global AI bot traffic from 8.5% to 11.6% in January 2026 alone. A 36% jump in thirty days. Meta is positioning itself as a primary discovery engine for billions of users, and your product catalogue is either ready for that or it isn’t.
These crawlers don’t match keywords. They convert product content into semantic representations – descriptions, attributes, technical specs, reviews – and map them into meaning. Traditional search matches “waterproof” to “waterproof.” AI discovery understands that “Gore-Tex shell designed for heavy rain” satisfies the same intent, even without the exact word.
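The difference between keyword matching and semantic matching can be shown with a toy sketch. The four-dimensional "embedding" vectors below are invented for illustration (real systems use hundreds or thousands of dimensions from a trained model), but the mechanism is the same: relevance is a similarity score between meanings, not a string match.

```python
import math

def cosine(a, b):
    # Cosine similarity: close to 1.0 = same meaning, close to 0.0 = unrelated.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy 4-dimensional "embeddings" (illustrative values, not real model output).
query     = [0.9, 0.1, 0.8, 0.0]  # "waterproof jacket for heavy rain"
gore_tex  = [0.8, 0.2, 0.9, 0.1]  # "Gore-Tex shell designed for heavy rain"
desk_lamp = [0.0, 0.9, 0.1, 0.8]  # an unrelated product

# The Gore-Tex shell scores far closer to the query than the unrelated
# product, even though neither description contains the word "waterproof".
print(cosine(query, gore_tex) > cosine(query, desk_lamp))  # True
```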
If your data is fragmented or unstructured, AI can’t map your products accurately. The result isn’t a ranking penalty. You simply don’t get mentioned.
Traditional commerce projects focus on hero banners, UX flows, and conversion optimisation. In an AI-mediated world, structured product attributes determine whether AI systems can confidently reference your products at all.
Take a query like: “Find a commercial HVAC controller compatible with BACnet and Modbus under $500.” A marketing description won’t cut it. The AI can’t guess compatibility. It needs verified, discrete data fields to confirm the match.
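What those discrete fields might look like in practice: the sketch below builds schema.org Product markup for a hypothetical controller (the name, SKU, and price are invented for illustration) and shows how an agent can confirm the match with a few field lookups instead of guessing from prose. `Product`, `PropertyValue`, and `Offer` are standard schema.org types.

```python
import json

# Hypothetical product; the point is discrete, verifiable fields.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "HVAC Controller X100",  # invented for this example
    "sku": "X100-COM",
    "additionalProperty": [
        {"@type": "PropertyValue", "name": "protocol", "value": "BACnet"},
        {"@type": "PropertyValue", "name": "protocol", "value": "Modbus"},
    ],
    "offers": {
        "@type": "Offer",
        "price": "449.00",
        "priceCurrency": "USD",
        "availability": "https://schema.org/InStock",
    },
}

# An agent answering "BACnet and Modbus under $500" needs three lookups:
protocols = {p["value"] for p in product["additionalProperty"]}
matches = ({"BACnet", "Modbus"} <= protocols
           and float(product["offers"]["price"]) < 500)
print(matches)  # True
```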
This starts upstream in your PIM. Your Product Information Management platform has to serve as the structured source of truth – not just for humans, but for machines. What that looks like differs between B2C and B2B:
In B2C, AI evaluates contextual signals – customer sentiment, fit feedback, delivery experience. Reviews matter. Structured schemas for Review and AggregateRating are essential.
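A minimal sketch of those review signals as JSON-LD, embedded the way a product page would carry them. The product and review text are invented for illustration; `AggregateRating` and `Review` are standard schema.org types.

```python
import json

# Hypothetical B2C product with structured review signals.
rating_markup = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Trail Runner Shoe",  # invented for this example
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.6",
        "reviewCount": "212",
    },
    "review": [{
        "@type": "Review",
        "reviewRating": {"@type": "Rating", "ratingValue": "5"},
        "reviewBody": "Runs true to size and stayed dry in heavy rain.",
    }],
}

# Embedded in the page head, this is directly machine-readable:
html_snippet = (
    '<script type="application/ld+json">'
    + json.dumps(rating_markup)
    + "</script>"
)
```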
In B2B, the AI acts as a technical gatekeeper. It evaluates protocol compatibility, tier pricing, lead times, and industrial specs. If critical technical data is hidden behind “Request a Quote” forms with no publicly structured summary, your catalogue may never enter the decision set.
The impact of AI in retail goes beyond chatbots answering customer questions. AI agents are now the first touchpoint in the buying journey for a growing number of transactions. They scan catalogues, evaluate data quality, and curate shortlists before a human ever sees a product page.
This changes what “discoverable” means. A product that ranks well on Google might be invisible to an AI agent if its structured data is incomplete. A product with rich, machine-readable attributes might surface in AI recommendations even without strong organic search rankings.
For retailers and B2B suppliers alike, the implication is the same: your data layer is now a competitive surface. If your competitors have cleaner, more structured product data, their products get recommended. Yours don’t.
Rich structured data – particularly JSON-LD – creates a machine-readable data layer embedded directly in your product pages. This goes well beyond meta tags.
What it provides AI systems is discrete, verifiable facts – name, identifiers, price, availability, specifications – rather than prose that has to be interpreted. The less interpretation required, the higher the trust score.
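To make "machine-readable data layer" concrete, here is roughly what a crawler does with such a page: pull the JSON-LD out and read facts from it, ignoring the surrounding presentation. The page content is invented for illustration, and real crawlers use proper HTML parsers rather than a regex, but the principle is the same.

```python
import json
import re

# A minimal product page: a marketing headline for humans,
# a JSON-LD block for machines. Content is illustrative.
html = """
<html><head>
<script type="application/ld+json">
{"@context": "https://schema.org", "@type": "Product",
 "name": "Rain Shell", "offers": {"@type": "Offer", "price": "129.00",
 "priceCurrency": "EUR", "availability": "https://schema.org/InStock"}}
</script>
</head><body><h1>Our best shell ever!</h1></body></html>
"""

# Extract and parse the structured data layer (regex kept simple on purpose).
match = re.search(
    r'<script type="application/ld\+json">(.*?)</script>', html, re.DOTALL
)
data = json.loads(match.group(1))

# Discrete facts, no interpretation of the marketing copy required:
print(data["name"], data["offers"]["price"])  # Rain Shell 129.00
```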
Just as robots.txt communicates crawl permissions, llms.txt is an emerging convention for communicating context to large language models. A markdown file in your root directory that summarises your organisation, expertise, and product scope.
For large-scale commerce deployments, an expanded llms-full.txt can deliver deeper documentation optimised for AI ingestion. Think of it as a curated briefing document – a structured introduction that reduces ambiguity for any AI system trying to understand what you sell and why it should recommend you.
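As a sketch of the shape such a file might take – the company, URLs, and claims below are entirely hypothetical, and the llms.txt convention is still emerging, so treat the exact structure as illustrative:

```
# Acme Industrial Supply

> B2B distributor of HVAC controllers and building-automation
> components for commercial contractors.

## Catalogue
- [Controllers](https://example.com/controllers): BACnet- and Modbus-compatible units
- [Sensors](https://example.com/sensors): temperature, humidity, and CO2 sensors

## Company
- [About](https://example.com/about): who we are and which markets we serve
```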
Adobe Commerce’s extensibility makes it well suited for this: applying advanced SEO tools not just for Google compliance but for structured AI ingestion, exposing structured data through JSON-LD, and connecting your PIM pipeline to machine-readable outputs. The platform supports it – the question is whether your implementation does.
If AI agents are becoming intermediaries, you need to measure their impact. GA4 allows traffic segmentation by user agent. Monitor visits from known AI crawlers (GPTBot, CCBot, Meta-ExternalAgent) and track which catalogue areas they index and revisit.
A pattern we see often: crawler activity increases but AI-referred sessions don’t. That signals a data quality problem. The AI found your content but didn’t consider it complete or reliable enough to reference.
Crawlable doesn’t mean recommendable.
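Server logs give you the same signal outside GA4. A minimal sketch: classify log lines by the AI crawler user agents named above. The log entries here are invented for illustration, and the agent-string list is deliberately short – extend it as new crawlers appear.

```python
# Known AI crawler user-agent substrings (the ones named in this article).
AI_CRAWLERS = ("GPTBot", "CCBot", "Meta-ExternalAgent")

# Illustrative access-log lines, not real traffic.
log_lines = [
    '203.0.113.7 "GET /products/x100 HTTP/1.1" 200 "Mozilla/5.0; GPTBot/1.0"',
    '198.51.100.2 "GET /products/x100 HTTP/1.1" 200 "Mozilla/5.0 (iPhone)"',
    '203.0.113.9 "GET /catalogue HTTP/1.1" 200 "CCBot/2.0"',
]

def crawler_name(line):
    # Return the matching crawler name, or None for human/unknown traffic.
    return next((bot for bot in AI_CRAWLERS if bot in line), None)

hits = {}
for line in log_lines:
    bot = crawler_name(line)
    if bot:
        hits[bot] = hits.get(bot, 0) + 1

print(hits)  # {'GPTBot': 1, 'CCBot': 1}
```

Compare these counts over time against AI-referred sessions: rising crawler hits with flat referrals is the data-quality signal described above.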
Enterprise clients raise a valid concern: how do you enable AI visibility without exposing proprietary logic? For B2B organisations with negotiated pricing or sensitive technical IP, this needs governance.
Practical controls include per-crawler robots.txt rules, public structured summaries that exclude negotiated pricing and sensitive specs, and rate limits on crawler traffic.
The goal is balance. Preserve performance for human visitors while staying discoverable for AI systems. Blocking AI crawlers outright is rarely the right move.
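That balance can be expressed directly in robots.txt: selective access per crawler rather than a blanket block. The paths below are illustrative, and the user-agent tokens should be checked against each crawler's published documentation.

```
# Let AI crawlers read the public catalogue, keep them out of
# checkout and account areas. Paths are illustrative.
User-agent: GPTBot
Allow: /products/
Disallow: /checkout/
Disallow: /account/

User-agent: Meta-ExternalAgent
Allow: /products/
Disallow: /checkout/
Disallow: /account/

User-agent: *
Disallow: /checkout/
```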
Five steps, in order:

1. Audit your PIM so product attributes are complete, discrete, and consistent.
2. Expose that data as JSON-LD structured markup on your product pages.
3. Publish an llms.txt (and, at scale, an llms-full.txt) to give AI systems context.
4. Monitor AI crawler traffic and track which catalogue areas they index and revisit.
5. Put governance controls in place so visibility doesn’t expose sensitive data.
AI-mediated commerce isn’t a future scenario. It’s already shaping buyer journeys. The question is whether your catalogue is part of that conversation or invisible to it.
Agentic commerce is AI-driven buying where autonomous agents research, compare, and recommend products on behalf of a human buyer. Instead of browsing a website, an AI agent evaluates product data, compares options, and shapes the purchase decision before a human is involved.
AI crawlers like GPTBot, CCBot, and Meta-ExternalAgent visit product pages and convert content into semantic representations. If your product data is structured and complete, AI systems can confidently recommend your products. If it isn’t, you don’t get a ranking penalty – you simply don’t get mentioned.
Traditional SEO optimises for search engine rankings based on keywords, backlinks, and page authority. AI ecommerce optimisation (sometimes called AEO) focuses on structured data quality, machine-readable attributes, and semantic clarity so that AI agents can accurately understand and recommend your products.
llms.txt is a markdown file placed in your site’s root directory that provides context to large language models. It summarises your organisation, expertise, and product scope – like a curated briefing document that helps AI systems understand what you sell and whether to recommend you.
Use GA4 to segment traffic by user agent. Monitor visits from known AI crawlers (GPTBot, CCBot, Meta-ExternalAgent) and track which catalogue areas they index. If crawler activity rises but AI-referred sessions don’t, that signals a data quality problem – the AI found your content but didn’t trust it enough to reference.
If you want to talk about where your commerce architecture sits, get in touch.