Read-only · No GA4 data stored
ChatGPT, Gemini, Claude, Perplexity — AI referral traffic is real and it's growing. Most GA4 setups can't see it clearly. This report breaks it down by platform, engagement rate, and key events so you know exactly where it's coming from and whether it converts.
Free. No credit card. Connects via Google OAuth.
| AI / LLM Sessions | 1,847 | from 4 platforms |
| Share of Total | 2.8% | of 65,892 total sessions |
| Avg Engagement | 72.4% | across AI platforms |
By Platform

| Platform | Sessions | Engaged | Events |
| | 1,012 | 68.4% | 89 |
| | 423 | 71.2% | 37 |
| | 285 | 76.8% | 31 |
| | 127 | 79.5% | 14 |
Trend Over Time
↑ Growing

Top Landing Pages from AI Traffic
| Page | Sessions |
| /blog/ai-seo-guide | 312 |
| /blog/chatgpt-prompts-seo | 198 |
| /resources/ga4-setup | 143 |
AI platforms refer traffic in inconsistent ways. Some pass referrer headers cleanly — click a link in a ChatGPT response and chatgpt.com appears as the session source in GA4. Others strip the referrer entirely before the click lands on your site, so the session registers as direct/none with no attribution signal whatsoever.
Even when the referrer passes cleanly, GA4's default channel grouping has no native “AI” channel. Traffic from ChatGPT, Gemini, and similar platforms gets classified as “Referral” — buried among hundreds of other referral sources. Without filtering specifically for known AI domains, it's invisible in your channel reports.
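As a minimal sketch of what that filtering looks like, the core of a custom channel rule is just matching session source against a list of known AI referrer domains. The domain list below is illustrative, not exhaustive, and the regex is the kind a GA4 custom channel group "matches regex" condition accepts:

```python
import re

# Illustrative AI referrer domains (not exhaustive; platforms change
# their link handling over time).
AI_DOMAINS = [
    "chatgpt.com",
    "chat.openai.com",
    "gemini.google.com",
    "claude.ai",
    "perplexity.ai",
    "copilot.microsoft.com",
]

# One regex usable as a "session source matches regex" condition.
AI_SOURCE_PATTERN = re.compile(
    r"^(" + "|".join(re.escape(d) for d in AI_DOMAINS) + r")$"
)

def is_ai_referral(session_source: str) -> bool:
    """True if a GA4 session_source value belongs to a known AI platform."""
    return bool(AI_SOURCE_PATTERN.match(session_source.lower()))

print(is_ai_referral("chatgpt.com"))  # True
print(is_ai_referral("t.co"))         # False
```

Anchoring the pattern (`^...$`) avoids false positives from lookalike domains; the trade-off is that new AI subdomains won't match until you add them.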
Without a custom channel group or a dedicated AI traffic report, LLM referral sessions get mixed into referral and direct/none in proportions that shift as AI platforms evolve their link-handling behavior. What looks like a spike in referral traffic might be a platform update that started passing referrers more consistently. What looks like growing dark traffic might be a new Claude or Perplexity interface that strips them. You can't tell without isolating the signal.
AI referral traffic is small for most sites — typically under 3% of sessions — but it's growing fast, and the composition is different from organic or social. Users arriving from an AI response are usually further along in their research. Engagement rates are often higher. The sites that understand this channel early will have a head start in GEO (generative engine optimization) — the practice of optimizing content to be cited and recommended by AI platforms. You can't optimize what you can't measure.
Terminology
People searching for “AI traffic”, “ChatGPT traffic”, or “LLM referral traffic” are often conflating a few different things. Here's what you're actually looking at.
A user clicks a link inside a ChatGPT, Gemini, or Perplexity response and the platform passes the referrer header. GA4 records session_source = chatgpt.com (or the equivalent for each platform). These sessions are straightforward to identify and aggregate by platform.
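A rough sketch of that aggregation, assuming you've exported rows with a session_source dimension and a sessions metric — the source-to-platform mapping here is illustrative, since platforms have used more than one referrer domain:

```python
from collections import defaultdict

# Illustrative mapping from session_source values to platform names.
PLATFORM_BY_SOURCE = {
    "chatgpt.com": "ChatGPT",
    "chat.openai.com": "ChatGPT",
    "gemini.google.com": "Gemini",
    "claude.ai": "Claude",
    "perplexity.ai": "Perplexity",
}

def sessions_by_platform(rows):
    """Aggregate (session_source, sessions) pairs into per-platform totals."""
    totals = defaultdict(int)
    for source, sessions in rows:
        platform = PLATFORM_BY_SOURCE.get(source.lower())
        if platform:  # ignore non-AI referral sources
            totals[platform] += sessions
    return dict(totals)

print(sessions_by_platform([
    ("chatgpt.com", 900),
    ("chat.openai.com", 112),
    ("perplexity.ai", 285),
]))
# {'ChatGPT': 1012, 'Perplexity': 285}
```

Note that two OpenAI referrer domains collapse into one ChatGPT row — without that mapping, the same platform shows up as two separate referral sources.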
Some AI interfaces strip the referrer header before passing the click. The session lands in GA4 as direct/none — indistinguishable from someone typing your URL directly. This is common in some Claude and Perplexity flows, particularly in API-based or app-embedded interfaces. The AI Traffic Report captures clean referrals; the Dark Traffic Analyzer handles the rest.
AI crawlers (GPTBot, ClaudeBot, PerplexityBot, etc.) are automated bots that index your content for training or retrieval — they never trigger GA4 events because no browser JavaScript runs. This report tracks human sessions only: real people who clicked a link in an AI response and landed on your site. Those are two entirely different things.
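For contrast, crawler hits are identified server-side, in logs rather than GA4, by user-agent token. A sketch, with an illustrative token list:

```python
# Illustrative user-agent tokens for AI crawlers. These bots fetch pages
# server-side and never execute GA4's JavaScript, so they appear in
# server logs but never as GA4 sessions.
AI_CRAWLER_TOKENS = ("GPTBot", "ClaudeBot", "PerplexityBot")

def is_ai_crawler(user_agent: str) -> bool:
    """True if a server-log user agent belongs to a known AI crawler."""
    return any(token in user_agent for token in AI_CRAWLER_TOKENS)

print(is_ai_crawler("Mozilla/5.0 (compatible; GPTBot/1.2)"))  # True
print(is_ai_crawler("Mozilla/5.0 (Windows NT 10.0) Chrome/120.0"))  # False
```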
Connect your GA4 property and get this breakdown in seconds.
Total sessions referred from AI platforms in your selected date range — ChatGPT, Gemini, Claude, Perplexity, Microsoft Copilot, and others. One number, clearly separated from everything else.
AI traffic as a percentage of all sessions, so you can contextualize it. 0.5% looks very different from 5%. Knowing your baseline is the first step to knowing if GEO efforts are working.
Sessions, engagement rate, and key events broken out per AI platform. Not just a count — the quality metrics that tell you whether the traffic is worth pursuing more of.
Which pages AI platforms are sending people to, with session counts. Usually reveals that a handful of content pages are driving most AI referrals — and flags what to write more of.
How your AI referral traffic is growing (or not) over your selected date range. The trend line matters more than the absolute number at this stage of AI search development.
Counting AI referral sessions is a vanity metric if you stop there. A platform sending 50 sessions at 80% engagement can be worth more per session than one sending 500 at 20%. Engagement rate tells you whether the traffic is actually interested in what you're saying, or bouncing because the AI sent them to the wrong page.
Key events are the next level: did AI-referred visitors complete a signup, a download, a contact form submission? If a particular platform consistently outperforms on key events, that's a signal about the intent of the audience it's sending. Platform-level quality data is where AI traffic reporting becomes genuinely useful, versus just counting referral rows in a spreadsheet.
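To make the comparison concrete (all numbers hypothetical): engaged sessions = sessions × engagement rate, and key-event rate = key events ÷ sessions.

```python
# Hypothetical platform rows: (name, sessions, engagement_rate, key_events)
platforms = [
    ("Platform A", 50, 0.80, 6),
    ("Platform B", 500, 0.20, 10),
]

for name, sessions, eng_rate, key_events in platforms:
    engaged = sessions * eng_rate          # engaged sessions
    key_event_rate = key_events / sessions  # conversions per session
    print(f"{name}: {engaged:.0f} engaged sessions, "
          f"{key_event_rate:.1%} key-event rate")
```

Platform A converts at six times Platform B's rate despite a tenth of the volume — exactly the kind of per-session quality signal raw session counts hide.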
This is also where GEO strategy comes in. If ChatGPT is sending high-engagement visitors to your product pages, that's evidence your product content is getting cited in relevant queries. If Perplexity is sending high-engagement visitors to your long-form guides, that's a signal about what type of content Perplexity tends to surface. You can act on that.
Straight answers to what people are actually searching for.
Connect your GA4 property and get your AI traffic breakdown by platform in seconds. Free, read-only access — we never modify your data.
Run My AI Traffic Report →