AI Insights NEW

Don't just record users.
Diagnose what's broken.

Inspectlet AI Insights reads every session and synthesizes what it sees into a ranked diagnosis — which funnels are regressing, which buttons are broken, which pages are hemorrhaging users, and how many people each problem is hurting. No more hunting through recordings to guess at causes.

AI Insights BETA · Insights · Page Elements · Journeys · Funnels
18% frustration rate · 24,318 sessions · 4,377 frustrated · 24h 7d 30d 90d

Conversion on your Checkout funnel regressed this week — conversion on /checkout/address fell from 22% to 17%, impacting 1,182 users. Evidence: rage clicks on the Continue button spiked 3.4×, and 62% of abandons hit a validation error on the ZIP field.

Checkout · 7,218 entered · 1,154 converted · 16% ▼ -6 pts · 2▼
1 /cart 7,218
2 /checkout/address 4,123 ▼ -5 pts (conversion dropped 22% → 17%)
3 /checkout/payment 1,891
4 /checkout/confirm 1,154 ▲ +1 pt

Session replay shows what happened.
It doesn't tell you what to fix.

Recordings are raw evidence, not analysis. Figuring out whether a drop-off on checkout comes from a broken button, a validation error, a copy change, or something else entirely is still your job. Every week. For every page.

You can't watch them all

A busy site produces thousands of sessions a week. Even the best team watches under 1% — and the important ones almost always fall in the other 99%.

Metrics show the drop, not the cause

GA4 says checkout conversion fell 6 points this week. Great. Which step? Which element? Which segment? A metric is an alarm, not a diagnosis.

By the time you find it, it's history

Manual triage takes days. Revenue leaks for a full sprint before anyone notices the real cause. You need something that runs continuously, not a person sampling recordings.

AI reads every session, then writes you a paragraph.

Every few hours, AI Insights synthesizes what it saw across every recording and produces a short diagnosis — the kind of paragraph a senior analyst would write if they had time to watch all 24,000 sessions. No dashboards to interpret, no charts to assemble.

  • Named problems — "address-form validation", not "friction event spike"
  • Impact numbers baked in — exactly how many users, on which routes
  • Evidence links jump straight into representative replays

Executive Summary · just now

Your site's biggest user-experience issue this week is the checkout address step — abandonment rose from 78% to 83%, costing roughly 1,182 users on the path to purchase.

The dominant cause is validation errors on the ZIP field: 62% of the users who abandoned saw an inline error, and many of them then rage-clicked the Continue button 3–7 times before leaving. Mobile Safari is disproportionately affected (48% of abandons vs 31% of traffic).

A separate, lower-priority issue: the hero "Start Free" CTA on /pricing is dead-clicked 140+ times a day by pricing-curious traffic; the link was likely removed in last Thursday's deploy.

Watch affected sessions · Open funnel
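
For teams consuming these summaries programmatically, the diagnosis above implies a fairly small record. A minimal TypeScript sketch of what one might carry; every field name here is an illustrative assumption, not a documented Inspectlet API:

```typescript
// Sketch of a single diagnosis record, inferred from the summary above.
// Every field name is an illustrative assumption, not a documented API.
interface Diagnosis {
  headline: string;             // the one-paragraph, human-readable cause
  cause: "validation" | "dead_click" | "rage_click" | "js_error";
  route: string;                // e.g. "/checkout/address"
  affectedUsers: number;        // e.g. 1182
  deviceSkew?: {
    segment: string;            // e.g. "Mobile Safari"
    shareOfAbandons: number;    // e.g. 0.48
    shareOfTraffic: number;     // e.g. 0.31
  };
  evidenceSessionIds: string[]; // jump straight into representative replays
}
```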

Week-over-week regression detection at the stage level.

Define funnels the way you already think about them — signup, checkout, onboarding — and AI Insights watches each stage individually. When any stage slips, you get the delta in percentage points, the affected session count, and the inline explanation of why.

  • Stage-level deltas — not just overall conversion; AI Insights knows which step slipped
  • AI-detected funnels — if you haven't defined one, the system proposes the paths users actually take
  • Per-day sparklines so you see the trend, not just the number
  • Regression pill on each card (2▼, 1▲) for at-a-glance triage across funnels

Conversion Funnels · tracked weekly

Checkout · 7,218 entered · 1,154 converted · 16% ▼ -6 pts · 2▼
1 /cart
2 /checkout/address ▼ -5 pts (conversion dropped 22% → 17%)
3 /checkout/payment ▼ -2 pts

Trial Signup · 3,812 entered · 1,724 converted · 45% ▲ +3 pts

Onboarding Complete · 1,724 entered · 1,241 completed · 72%
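
Mechanically, catching a slipped stage needs nothing more than per-stage counts for two comparable windows. A minimal sketch of the idea in TypeScript, where the FunnelStage shape, the detectRegressions helper, and the 2-point threshold are all assumptions for illustration rather than Inspectlet's actual implementation:

```typescript
// Minimal sketch of stage-level regression detection. FunnelStage,
// detectRegressions, and the 2-point threshold are illustrative
// assumptions, not Inspectlet's actual API.

interface FunnelStage {
  path: string;     // e.g. "/checkout/address"
  entered: number;  // sessions that reached this stage
  advanced: number; // sessions that went on to the next stage
}

interface StageRegression {
  path: string;
  lastWeekPct: number;
  thisWeekPct: number;
  deltaPts: number;         // negative = regression
  affectedSessions: number; // sessions lost vs. last week's rate
}

const pct = (s: FunnelStage) =>
  s.entered > 0 ? (s.advanced / s.entered) * 100 : 0;

function detectRegressions(
  lastWeek: FunnelStage[],
  thisWeek: FunnelStage[],
  thresholdPts = 2,
): StageRegression[] {
  const prev = new Map(lastWeek.map((s) => [s.path, s] as const));
  const out: StageRegression[] = [];
  for (const stage of thisWeek) {
    const before = prev.get(stage.path);
    if (!before) continue;
    const deltaPts = pct(stage) - pct(before);
    if (deltaPts <= -thresholdPts) {
      out.push({
        path: stage.path,
        lastWeekPct: pct(before),
        thisWeekPct: pct(stage),
        deltaPts,
        // users who would have advanced at last week's rate but didn't
        affectedSessions: Math.round(stage.entered * (-deltaPts / 100)),
      });
    }
  }
  // worst slip first, mirroring the card's at-a-glance ordering
  return out.sort((a, b) => a.deltaPts - b.deltaPts);
}
```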

The specific buttons, fields, and links that are broken.

AI Insights labels every interaction across your site — rage clicks, dead clicks, validation errors, JavaScript errors — and aggregates them per element. You get a ranked list of which specific thing needs attention, with impact numbers, not a generic "events" tab.

  • Named elements — "Continue button on /checkout/address", not "div.btn-pri-3"
  • Severity-ranked so the top of the list is always the next thing to fix
  • Typed signals (rage / dead / validation / error) per element, not just a count
  • Spike alerts fire when an element breaks — e.g. right after a deploy

Page Elements · ranked by impact · All 28 · High 6 · Med 14

Rage clicks on "Continue" button (/checkout/address): 847 rage · 412 validation · 1,182 sessions
Dead clicks on hero "Start Free" link (/pricing): 628 dead · spike 3.4× · 704 sessions
Validation errors on "ZIP code" field (/checkout/address): 391 validation · 184 rage · 488 sessions
JS error in PaymentService.init() (/checkout/payment): 219 errors · 219 sessions
Dead clicks on "Try live demo" link (/): 92 dead · 92 sessions
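
A ranking like the one above can be approximated with a weighted score per element. A hedged sketch follows; the signal weights and the log scaling by distinct sessions are assumptions chosen for illustration, not Inspectlet's real severity model:

```typescript
// Sketch of per-element aggregation and severity ranking. The weights
// and the log scaling are illustrative assumptions; Inspectlet's real
// scoring model is not published here.

type Signal = "rage" | "dead" | "validation" | "error";

interface ElementStats {
  label: string;                  // e.g. 'Rage clicks on "Continue" button'
  page: string;                   // e.g. "/checkout/address"
  counts: Record<Signal, number>; // typed signals, not just a raw total
  sessions: number;               // distinct sessions affected
}

// Assumed weights: errors and dead clicks usually block the user
// outright, so they count for more than a rage click.
const WEIGHTS: Record<Signal, number> = {
  error: 3,
  dead: 2.5,
  validation: 2,
  rage: 1.5,
};

function severity(e: ElementStats): number {
  const signalScore = (Object.keys(e.counts) as Signal[]).reduce(
    (sum, s) => sum + e.counts[s] * WEIGHTS[s],
    0,
  );
  // Scale by distinct sessions so one user clicking 500 times doesn't
  // outrank a bug that hits 500 users once each.
  return signalScore * Math.log1p(e.sessions);
}

const rankElements = (els: ElementStats[]): ElementStats[] =>
  [...els].sort((a, b) => severity(b) - severity(a));
```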

See where users actually go — and where they actually leave.

The Journeys view maps real navigation flows, step by step, with exit percentages on every page. Click a page to explore where it leads next. The pages bleeding traffic light up immediately.

  • Step-by-step exploration — click a node, see the next set of pages
  • Exit-rate highlighting so high-exit pages jump out visually
  • Route-level session counts for every branch, computed from real traffic

User Journeys · Entry → where users actually go

Entry: / 8,241 · /pricing 3,412 · /blog/* 2,118
Step 2: /pricing 3,918 (8% exit) · /features 2,104 (14% exit) · EXIT 1,104 (13%)
Step 3: /signup 812 (6% exit) · EXIT 924 (44%) · /docs 368 (17% exit)
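
Conceptually, each column of this view is a group-by over ordered session paths. A minimal sketch, assuming sessions have already been reduced to arrays of pages (journeyStep is a hypothetical helper):

```typescript
// Sketch of building one column of the journey view from raw session
// paths. Sessions are assumed to already be reduced to ordered page
// arrays; journeyStep is a hypothetical helper, not Inspectlet's API.

interface JourneyNode {
  page: string;
  sessions: number; // sessions that hit this page at this step
  exits: number;    // sessions whose recording ended here
}

function journeyStep(paths: string[][], step: number): JourneyNode[] {
  const nodes = new Map<string, JourneyNode>();
  for (const path of paths) {
    if (path.length <= step) continue;
    const page = path[step];
    const node = nodes.get(page) ?? { page, sessions: 0, exits: 0 };
    node.sessions += 1;
    if (path.length === step + 1) node.exits += 1; // last page = exit
    nodes.set(page, node);
  }
  return [...nodes.values()].sort((a, b) => b.sessions - a.sessions);
}

// Example: exit rates for the pages users hit as their second step.
const paths = [
  ["/", "/pricing", "/signup"],
  ["/", "/pricing"], // exited on /pricing
  ["/blog/hello", "/features"],
];
for (const n of journeyStep(paths, 1)) {
  const exitPct = Math.round((n.exits / n.sessions) * 100);
  console.log(`${n.page}: ${n.sessions} sessions, ${exitPct}% exit`);
}
```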

From reactive triage to proactive diagnosis

Teams already using session replay still spend days figuring out why a number moved. AI Insights collapses that work into a paragraph.

Without AI Insights

Days of manual investigation

  • GA4 shows conversion dropped — but which step?
  • Pull a list of sessions that reached the funnel, sort by duration, watch 30–40 recordings
  • Guess at the common pattern; maybe you see it, maybe you don't
  • Cross-reference with Sentry, Datadog, and your deploy log manually
  • Write up the root cause on day 4; the leak has been bleeding all week

With AI Insights

Minutes to a named, quantified diagnosis

  • Open AI Insights — the regression is already at the top, with the exact stage and delta
  • Read the one-paragraph diagnosis and see the dominant cause (validation / dead click / JS error)
  • Jump into representative sessions with a single click — each one pre-tagged with the signal
  • Know the device/browser skew and session count before you even open the recording
  • Ship the fix the same day; stop the bleed on day 1, not day 5

Other tools count events.
Inspectlet explains causes.

Dashboards that just visualize metrics don't know why anything changed. Inspectlet labels every interaction semantically — rage click, dead click, validation error, JS error — and correlates those labels with business outcomes. That's the layer that lets AI produce real diagnoses, not just prettier charts.

Semantic action labeling

Every click, scroll, form interaction, and error is classified by what it means, not just what it looks like in the page structure. That's the foundation that makes diagnosis possible.
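
To make that concrete, here is a minimal sketch of how raw clicks might be labeled. The thresholds (three clicks on one element within 700 ms for a rage click, no visible effect for a dead click) are common heuristics assumed for illustration, not Inspectlet's published rules:

```typescript
// Sketch of semantic click labeling on a simplified event shape. The
// thresholds (3 clicks in 700 ms = rage; no visible effect = dead) are
// heuristics assumed for illustration, not Inspectlet's rules.

interface RawClick {
  selector: string;      // the element that was clicked
  timestamp: number;     // ms since session start
  causedEffect: boolean; // did the DOM change or a request fire after it?
}

type Label = "rage_click" | "dead_click" | "ordinary_click";

function labelClicks(clicks: RawClick[]): { click: RawClick; label: Label }[] {
  return clicks.map((click) => {
    // Rage: a burst of 3+ clicks on the same element within 700 ms.
    const burst = clicks.filter(
      (c) =>
        c.selector === click.selector &&
        Math.abs(c.timestamp - click.timestamp) <= 700,
    ).length;
    if (burst >= 3) return { click, label: "rage_click" };
    // Dead: the click visibly did nothing.
    if (!click.causedEffect) return { click, label: "dead_click" };
    return { click, label: "ordinary_click" };
  });
}
```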

Replay is proof, not evidence

Most products make you watch recordings to form a hypothesis. AI Insights starts from the hypothesis and uses replay to back it up — one click, you're on the exact moment.

Continuous, not batched

Analysis runs on a rolling window with cached results that refresh every few hours. No nightly batch, no "check again Monday" — the diagnosis is current when you open it.
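
The serving pattern this describes is stale-while-revalidate: answer from cache instantly, recompute in the background once the result ages out. A sketch assuming a roughly four-hour window; DiagnosisCache and its internals are hypothetical:

```typescript
// Sketch of the serving pattern: cached diagnosis, refreshed in the
// background once it's older than the window. DiagnosisCache and the
// ~4-hour refresh interval are assumptions for illustration.

const REFRESH_MS = 4 * 60 * 60 * 1000;

class DiagnosisCache<T> {
  private value: T | null = null;
  private computedAt = 0;
  private refreshing: Promise<T> | null = null;

  constructor(private compute: () => Promise<T>) {}

  async get(): Promise<T> {
    const fresh = Date.now() - this.computedAt <= REFRESH_MS;
    if (this.value !== null && fresh) return this.value;

    // Coalesce concurrent callers into a single recomputation.
    let pending = this.refreshing;
    if (!pending) {
      pending = this.compute()
        .then((v) => {
          this.value = v;
          this.computedAt = Date.now();
          return v;
        })
        .finally(() => {
          this.refreshing = null;
        });
      this.refreshing = pending;
    }
    // Stale-while-revalidate: serve the previous diagnosis immediately
    // if one exists; only the very first call has to wait.
    return this.value ?? pending;
  }
}
```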

Sits on top of your replay data

No new snippet, no instrumentation, no tagging. If you're already recording with Inspectlet, AI Insights reads that data directly and starts producing diagnoses on day one.

Know what to fix next — before your users do.

Free plan available. No credit card, no tagging, no configuration. If you're already recording with Inspectlet, AI Insights starts producing diagnoses the first time you open it.