A/B test idea: Try adding an exit popup

A/B test adding an exit-intent popup to recover leaving visitors

Exit popups sit in a weird spot in CRO: they either lift revenue per visitor by 20–60% or erode trust and inflate bounce rate. Wisepops' 2025 ecommerce playbook shows countdown-timer popups lifting checkout completion by roughly 30%, while generic "subscribe to our newsletter" popups often hurt returning-visitor conversion. The question isn't whether to add a popup; it's which trigger, offer, and audience combination wins for your product.

This test matters most for two traffic types: paid visitors you already paid to acquire, and organic visitors mid-funnel (pricing, comparison, feature pages) weighing a decision. A well-targeted exit popup can capture 2–5% of abandoners who would otherwise leave forever.


The hypothesis

If we show a targeted exit-intent popup with a relevant offer to visitors leaving pricing-adjacent pages, trial signups will increase 10–30% without materially hurting main CTA clicks — because we're only interrupting users who had already decided to leave.

Test setup

Run two variants, not three. More variations fragment your sample.

  • Control: No exit popup
  • Variant A: Exit-intent popup with a concrete offer (for example, "Get a 14-day extended trial" or "Book a 15-min demo instead")

Triggering rules to configure:

  • Desktop: Cursor moves above the top of the viewport
  • Mobile: Scroll-up after 50% page depth, or 30 seconds idle — mouse-exit doesn't exist on mobile
  • Frequency cap: Once per visitor per 30 days
  • Page targeting: Pricing, features, comparison pages — not blog or first-visit homepage
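The targeting and capping rules above reduce to a small decision function. A minimal sketch, assuming timestamps are stored client-side (for example in localStorage); the function name, field names, and page list are illustrative, not from any popup library:

```javascript
// Decide whether the exit popup is allowed to fire for this pageview.
// Page targeting + 30-day frequency cap + dismissal respect, per the rules above.
const THIRTY_DAYS_MS = 30 * 24 * 60 * 60 * 1000;

function shouldShowPopup({ path, lastShownAt, dismissedAt, now }) {
  // Page targeting: pricing-adjacent pages only, never blog or homepage.
  const targetPages = ["/pricing", "/features", "/compare"];
  const onTargetPage = targetPages.some((p) => path.startsWith(p));

  // Frequency cap: once per visitor per 30 days.
  const capOk = lastShownAt == null || now - lastShownAt >= THIRTY_DAYS_MS;

  // Respect an explicit dismissal the same way.
  const notDismissed = dismissedAt == null || now - dismissedAt >= THIRTY_DAYS_MS;

  return onTargetPage && capOk && notDismissed;
}
```

In the browser you would gate the actual triggers on this check: on desktop, a `mouseout` listener that fires when `event.clientY <= 0` (cursor crossing the top of the viewport); on mobile, a scroll-up handler past 50% depth or a 30-second idle timer, since mouse-exit doesn't exist there.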

Primary and guardrail metrics

Most popup tests fail here — teams track popup conversion in isolation and miss the full-funnel damage.

  • Primary: Trial signups per session, full-funnel — not popup-only captures
  • Guardrail 1: Main CTA click rate — if "Start free trial" clicks drop 10%+, the popup is stealing your primary conversion
  • Guardrail 2: Returning visitor rate at 7 and 30 days
  • Guardrail 3: Bounce rate on popup-targeted pages
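A quick sketch of the full-funnel readout, assuming you can pull aggregated counts per arm from your analytics; the function and field names are illustrative:

```javascript
// Primary metric (trial signups per session) plus the main-CTA guardrail.
// Pass aggregated counts for each arm, e.g. from your analytics export.
function evaluateTest(control, variant) {
  const rate = (num, den) => num / den;

  // Primary: relative lift in full-funnel signups per session.
  const primaryLift =
    rate(variant.signups, variant.sessions) / rate(control.signups, control.sessions) - 1;

  // Guardrail 1: relative drop in main CTA click rate.
  const ctaDrop =
    1 - rate(variant.ctaClicks, variant.sessions) / rate(control.ctaClicks, control.sessions);

  return {
    primaryLift,
    ctaGuardrailBreached: ctaDrop >= 0.10, // 10%+ drop: popup is cannibalizing the CTA
  };
}
```

A variant can clear the primary metric and still breach the guardrail, which is exactly the full-funnel damage this section warns about.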

Sample size and duration

A pricing page with ~2,000 visitors per week (about 1,000 per variant) and a baseline 3% signup rate needs about 24,000 visitors per variant to detect a 15% relative lift at 80% power and α = 0.05. That's roughly 24 weeks of runtime on a two-variant test. With less traffic or less patience, target a 25%+ MDE (about 9,000 visitors per variant, roughly 9 weeks) or stack the test across multiple page types.
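The figures come from the standard two-proportion normal approximation. A quick sketch with the z-values hardcoded for α = 0.05 (two-sided) and 80% power:

```javascript
// Per-variant sample size for detecting a relative lift on a conversion rate
// (two-proportion z-test, normal approximation; 1.96 = alpha 0.05 two-sided,
// 0.84 = 80% power).
function sampleSizePerVariant(baselineRate, relativeLift) {
  const p1 = baselineRate;
  const p2 = baselineRate * (1 + relativeLift);
  const z = 1.96 + 0.84;
  const variance = p1 * (1 - p1) + p2 * (1 - p2);
  return Math.ceil((z * z * variance) / ((p2 - p1) ** 2));
}

// e.g. sampleSizePerVariant(0.03, 0.15) -> about 24,000 per variant
```

Swap in your own baseline and MDE; the per-variant number divided by your weekly per-variant traffic gives the runtime in weeks.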

Variations to try after the baseline wins

  • Offer type: Discount vs extended trial vs content asset. For B2B SaaS, content offers usually beat discounts.
  • Copy frame: Loss aversion ("You're about to miss…") vs curiosity ("One thing before you go…")
  • Format: Modal overlay vs slide-in vs top banner. Slide-ins feel gentler but get 40–60% lower engagement.
  • Countdown timer: A visible 10-minute countdown lifts claim rate but hurts trust if the deadline is fake.
  • Two-step popup: Yes/no question first, email second. Foot-in-the-door lifts capture 15–25%.

Common mistakes

  • Generic newsletter offers on pricing pages. Intent mismatch. Match the offer to the page the visitor is leaving.
  • Firing on first page view. Require 15s on page or 40% scroll before the popup can trigger.
  • Ignoring mobile. No mouse-exit means you need scroll-up, idle, or back-button triggers — or you lose 50%+ of traffic.
  • Not capping frequency. Three popups per returning visitor destroys trust. Cap once per 30 days; respect dismissals.
  • Measuring only popup engagement. A popup with 20% engagement that drops main CTA clicks 15% is a net loss.
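The last mistake is worth putting numbers on. A rough sketch per 1,000 sessions, assuming the popup fires on 25% of sessions and 8% of captured emails convert to trials; both figures are illustrative, not from the test above:

```javascript
// Net signups per 1,000 sessions: a popup can "win" on engagement and still lose.
// The 25% fire rate and 8% email-to-trial rate are illustrative assumptions.
const sessions = 1000;
const baselineCtaRate = 0.03; // 3% of sessions click "Start free trial"

const controlSignups = sessions * baselineCtaRate; // 30 signups, no popup

const popupFires = sessions * 0.25;                          // popup shown 250 times
const emailsCaptured = popupFires * 0.20;                    // 20% engagement -> 50 emails
const popupSignups = emailsCaptured * 0.08;                  // 4 trials from captures
const variantCtaSignups = sessions * baselineCtaRate * 0.85; // 15% CTA drop -> 25.5
const variantSignups = variantCtaSignups + popupSignups;     // 29.5 total

// Despite 20% popup engagement, the variant converts fewer trials overall.
console.log(variantSignups < controlSignups);
```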