What 50 User Onboarding Interviews Revealed

Kris Carter · 10 min read

I spent 6 weeks interviewing users about their first experience with our product. What I learned completely changed our onboarding strategy, and with it our activation rate. Most of what we thought we knew was wrong.

Our activation rate was stuck at 47%. We'd tried everything: new onboarding flows, better tutorials, in-app guides, email sequences. Nothing moved the needle.

The product team insisted the issue was user quality. "We're attracting the wrong ICP," they said.

I didn't believe that. So I did something simple that nobody had done: I asked users what happened during their first week.

I scheduled 50 interviews over 6 weeks:

  • 20 users who activated successfully
  • 20 users who signed up but never activated
  • 10 users who activated late (after 7+ days)

What I learned completely changed our onboarding strategy.

The insights weren't in our analytics. They were in the stories users told me about their first experience with our product.

Interview 1: The Moment I Realized We Were Wrong

My first interview was with Sarah, a marketing manager who'd signed up 3 weeks earlier and never activated.

I expected her to say she didn't need the product or couldn't figure out how to use it.

Instead, she said: "Oh, I love what your product does. I've been meaning to set it up but I haven't had time."

Me: "What would 'setting it up' entail?"

Sarah: "Well, I need to export my data from [Tool X], clean it up, figure out your CSV format, upload it... It's on my list but it's not urgent enough to prioritize."

She wanted to use the product. She understood the value. But the activation path required work she didn't have time for.

This wasn't a product problem or an education problem. It was a friction problem we'd never identified.

I heard variations of this story in 14 of my 20 interviews with non-activated users.

The Five Patterns That Emerged

After 50 interviews, clear patterns emerged. These weren't the problems we'd expected.

Pattern 1: "I Wanted to Try It, But Life Got In the Way"

How often I heard this: 14 out of 20 non-activated users

The story was always similar:

  • User signed up with intent to use the product
  • First session went fine—they understood the concept
  • Setup required some work (export data, configure settings, connect integrations)
  • User planned to come back and finish setup "later"
  • "Later" never happened

Representative quote: "I signed up on a Tuesday afternoon. The setup looked like it would take 30-45 minutes and I had a meeting in 20 minutes. I figured I'd come back Thursday. Then I forgot about it until I saw your email 2 weeks later, and by then I'd moved on."

What we learned: The gap between signup and activation is a danger zone. Any friction that causes users to leave without completing activation means most won't come back.

What we changed: We redesigned onboarding to deliver value in the first session, even if setup was incomplete. We used sample data to show what results would look like, then let users finish proper setup once they were invested.

Result: Users who got value in Session 1 returned for Session 2 at 3x the rate of users who saw only setup screens.

Pattern 2: "I Didn't Understand Why This Mattered Until Week 3"

How often I heard this: 8 out of 10 late activators (users who activated after 7+ days)

These users eventually activated, but it took them a while to understand why they needed the product.

Representative quote: "When I first signed up, I thought it was just a nice-to-have. Then two weeks later, my boss asked for an analysis I couldn't produce with my current tools. I remembered your product and thought 'oh, this is exactly what I needed.' Wish I'd set it up earlier."

What we learned: Many users sign up before they have an urgent need. They're "future-proofing" or "exploring options." These users won't activate until they encounter the specific problem the product solves.

What we changed:

  1. In onboarding, we asked: "What problem are you trying to solve this week?" and focused setup on that specific use case
  2. For users who said "just exploring," we set up a trigger: send email when they're likely to encounter the problem (based on behavioral patterns we identified)
  3. We added a "common use cases" section showing specific scenarios where users would need the product

Result: Late activators decreased from 8% of activated users to 3%. We got users to their "oh, I need this now" moment faster.

Pattern 3: "I Thought It Would Be Like [Competitor], But It Wasn't"

How often I heard this: 11 out of 50 users (spanning all segments)

Users brought mental models from other tools and were confused when our product worked differently.

Representative quote: "I assumed the dashboard would work like [Competitor's]. When it didn't, I didn't know where to find things. I spent 20 minutes clicking around looking for the export button before I gave up."

What we learned: We'd designed our product to be "better" than competitors, but different UX patterns meant users couldn't transfer their existing knowledge. This created cognitive overhead.

What we changed:

  1. Added a "switching from [Competitor]?" guide to onboarding that mapped familiar workflows to our product
  2. Made certain UI patterns match industry standards even if we thought our way was "better"
  3. Added tooltips explaining why we did things differently when we intentionally diverged from norms

Result: Users switching from competitors activated 18% faster after these changes.

Pattern 4: "I Didn't Know You Could Do That"

How often I heard this: 17 out of 20 successfully activated users

This was the most surprising pattern. Even users who activated successfully didn't know about features they would have found valuable.

Representative quote: "Wait, you can schedule automated reports? I've been manually running the same report every Monday for 3 months. If I'd known that was an option I would have set it up day 1."

What we learned: Our onboarding showed users the basics to get activated, but didn't surface power features that would make the product more valuable. Users were succeeding, but getting only 30% of the value they could have.

What we changed:

  1. Added contextual prompts: "You've run this report 3 times. Want to schedule it to run automatically?"
  2. Created a "power features" email sequence sent at days 7, 14, and 21 after activation showing advanced capabilities
  3. Built an in-app "feature discovery" section showing capabilities relevant to their usage patterns

Result: Power feature adoption increased from 31% to 48% of activated users. These users had 2.4x higher retention.
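
Here's roughly what that first change, the usage-count nudge, can look like in code. This is a sketch, not our production implementation: the event shape, the threshold of 3, and the prompt copy are illustrative assumptions.

```typescript
// Sketch only: event shape, threshold, and copy are illustrative assumptions.
interface UsageEvent {
  userId: string;
  action: string;   // e.g. "run_report"
  objectId: string; // the specific report that was run
}

const MANUAL_RUN_THRESHOLD = 3;

// Count manual runs per report for one user; once any report crosses the
// threshold, return a prompt suggesting they schedule it instead.
function schedulingPrompt(
  events: UsageEvent[],
  userId: string
): { reportId: string; message: string } | null {
  const runsByReport = new Map<string, number>();

  for (const e of events) {
    if (e.userId === userId && e.action === "run_report") {
      runsByReport.set(e.objectId, (runsByReport.get(e.objectId) ?? 0) + 1);
    }
  }

  for (const [reportId, count] of runsByReport) {
    if (count >= MANUAL_RUN_THRESHOLD) {
      return {
        reportId,
        message: `You've run this report ${count} times. Want to schedule it to run automatically?`,
      };
    }
  }

  return null;
}
```

The useful property is that the prompt is driven by what the user is already doing, not by a fixed day-N email.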

Pattern 5: "I Almost Quit 3 Times Before It Clicked"

How often I heard this: 12 out of 20 successfully activated users

These were users who struggled through onboarding, almost gave up multiple times, but eventually figured it out.

Representative quote: "The first time I tried to upload my data, it failed. No error message, just nothing happened. I tried again, same thing. I was about to give up, but I wanted the product to work so I kept trying. Turned out my file was too large but the product never told me that."

What we learned: Even users who eventually activate often have near-death experiences with our product. They're succeeding despite our onboarding, not because of it.

What we changed:

  1. Added clear error messages with specific fixes: "File too large. Maximum size is 10MB. Here's how to reduce your file size..."
  2. Implemented session tracking to identify "struggle signals" (repeated failed attempts, long pauses, returning to same screen multiple times)
  3. Created intervention triggers: if user shows struggle signals, show contextual help or offer chat support

Result: Users who received interventions activated at 67% rate vs. 34% for similar users without intervention.
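
For readers who want to see what "struggle signal detection" can look like in practice, here's a rough sketch. The event types and thresholds (three failures, four revisits, a two-minute pause) are assumptions for illustration; the exact rules will differ by product.

```typescript
// Sketch only: event names and thresholds are illustrative assumptions.
interface SessionEvent {
  type: "action_failed" | "screen_view";
  screen: string;
  timestamp: number; // ms since epoch
}

type StruggleSignal = "repeated_failures" | "long_pause" | "screen_revisits";

function detectStruggleSignals(events: SessionEvent[]): StruggleSignal[] {
  const signals = new Set<StruggleSignal>();
  const failures = new Map<string, number>();
  const views = new Map<string, number>();

  for (let i = 0; i < events.length; i++) {
    const e = events[i];

    // Repeated failed attempts on the same screen (e.g. an upload that keeps failing)
    if (e.type === "action_failed") {
      const n = (failures.get(e.screen) ?? 0) + 1;
      failures.set(e.screen, n);
      if (n >= 3) signals.add("repeated_failures");
    }

    // Returning to the same screen again and again in one session
    if (e.type === "screen_view") {
      const n = (views.get(e.screen) ?? 0) + 1;
      views.set(e.screen, n);
      if (n >= 4) signals.add("screen_revisits");
    }

    // A long pause between consecutive events (stuck, or off reading docs)
    if (i > 0 && e.timestamp - events[i - 1].timestamp > 2 * 60 * 1000) {
      signals.add("long_pause");
    }
  }

  return Array.from(signals);
}

// Intervention trigger: if any signal fires, surface contextual help or offer chat.
function shouldOfferHelp(events: SessionEvent[]): boolean {
  return detectStruggleSignals(events).length > 0;
}
```

Keeping detection as a pure function over recent session events makes it easy to tune thresholds offline before wiring it up to live interventions like contextual help or a chat prompt.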

The Insights We Never Would Have Found in Analytics

Analytics told us WHERE users dropped off. Interviews told us WHY.

Analytics showed: 40% of users who started onboarding never completed the "connect data source" step.

Interviews revealed why:

  • 35% didn't have data ready to connect (they expected to use sample data first)
  • 25% couldn't find their API credentials (our instructions assumed they'd know where to find them)
  • 20% started the connection process but got interrupted and forgot to come back
  • 10% didn't trust us with their data yet (wanted to see the product work first)
  • 10% couldn't connect because their tool wasn't on our integrations list

Each of these required different solutions:

  • Didn't have data ready: Let them proceed with sample data, connect real data later
  • Couldn't find credentials: Added step-by-step guides with screenshots for each integration
  • Got interrupted: Save progress automatically, send reminder email if they don't finish within 24 hours
  • Trust issues: Show what the product does with sample data before asking for their data
  • Integration not supported: Added CSV upload as a fallback, collected requests to prioritize integration development

Without interviews, we would have just tried to "make the connection step clearer." That wouldn't have solved most users' actual problems.

The Questions That Uncovered the Best Insights

Here are the questions that generated the most valuable insights:

"Walk me through your first session with the product"

Not "what did you think?" but "what did you do?"

This got users to recreate their actual behavior instead of their perceived behavior.

What I learned: Users often got stuck on details they couldn't even remember later. They'd spend 10 minutes trying to understand a confusing label, then move on and forget it ever happened. But in the moment, that 10 minutes of confusion almost made them quit.

"Tell me about a moment where you almost stopped using the product"

Every user, even successful ones, had these moments.

What I learned: The path to activation is full of near-death experiences. Users don't quit because of one big problem—they quit because of accumulated small frustrations that cross a threshold.

Identifying these near-death moments showed me where to add friction removal and support.

"What were you expecting to happen that didn't?"

This revealed mismatches between user mental models and our product design.

What I learned: We'd designed features to work a certain way because it was "better," but users expected them to work like competitor products. Fighting user expectations created cognitive overhead.

Sometimes matching expectations (even if our way was "better") reduced friction more than trying to educate users on why our way was superior.

"If you were explaining this product to a colleague, what would you say?"

This revealed how users understood our value prop vs. how we messaged it.

What I learned: Users described our product completely differently than our marketing did.

Our messaging: "Analytics platform for data-driven decision making"
How users described it: "It's like [Competitor X] but faster and easier to set up"

We updated our positioning to match how users actually talked about the product. Conversion improved because prospects immediately understood what we were.

"What would have made your first week easier?"

This question generated tactical improvement ideas.

What I learned: Users had brilliant ideas we'd never thought of:

  • "A checklist showing me what I've completed and what's left"
  • "Example results from companies like mine so I could see what good looks like"
  • "Video showing someone actually using it, not just explaining features"
  • "Ability to invite a teammate immediately so we could figure it out together"

We implemented 8 suggestions from these interviews. Activation rate improved 12 percentage points.

The Uncomfortable Patterns in Successful Users

The interviews with successfully activated users revealed uncomfortable truths.

Many activated users succeeded despite our onboarding, not because of it:

"I figured it out by trial and error. The tutorials didn't help much."

"I just ignored all the tooltips and clicked around until I understood it."

"I asked ChatGPT how to use your product and followed those instructions instead of yours."

These users had high tolerance for friction and strong motivation. They wanted the product to work, so they pushed through bad onboarding.

But most users don't have that tolerance. The users who didn't activate weren't less capable—they just didn't have the time or motivation to fight through friction.

Key insight: Success stories from activated users were masking onboarding problems. The users who succeeded would have succeeded with almost any onboarding. The users who failed needed less friction.

We'd been optimizing onboarding for power users who didn't need help instead of for casual users who needed everything to be obvious.

What We Changed Based on Interviews

Here's what we implemented directly from interview insights:

1. First-session value delivery

Before: Onboarding required completing full setup before seeing value
After: Show value with sample data in first 5 minutes, then let users do proper setup

Impact: Users who completed first session increased from 64% to 82%

2. Progress saving and reminder system

Before: If users left during onboarding, they had to start over
After: Save progress automatically, send reminder if not finished within 24 hours

Impact: Return rate for incomplete onboarding: 12% → 38%
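
A minimal sketch of this save-and-remind loop is below. The in-memory store, the sendEmail hook, and the scheduled-job cadence are placeholders I've assumed for illustration; the 24-hour window is the only detail carried over from the change itself.

```typescript
// Sketch only: store, sendEmail hook, and cadence are placeholder assumptions.
interface OnboardingProgress {
  userId: string;
  lastCompletedStep: string;
  updatedAt: number; // ms since epoch
  completed: boolean;
  reminderSent: boolean;
}

const REMINDER_DELAY_MS = 24 * 60 * 60 * 1000;

// Called whenever a user finishes any onboarding step, so nothing is lost
// if they leave mid-flow.
function saveProgress(
  store: Map<string, OnboardingProgress>,
  userId: string,
  step: string
): void {
  const existing = store.get(userId);
  store.set(userId, {
    userId,
    lastCompletedStep: step,
    updatedAt: Date.now(),
    completed: false,
    reminderSent: existing?.reminderSent ?? false,
  });
}

// Run on a schedule (e.g. hourly): find users who stalled for more than 24 hours
// and queue a single reminder pointing back at their saved step.
function queueReminders(
  store: Map<string, OnboardingProgress>,
  sendEmail: (userId: string, step: string) => void
): void {
  const now = Date.now();
  for (const progress of store.values()) {
    if (!progress.completed && !progress.reminderSent && now - progress.updatedAt > REMINDER_DELAY_MS) {
      sendEmail(progress.userId, progress.lastCompletedStep);
      progress.reminderSent = true;
    }
  }
}
```

Marking reminderSent means each stalled user gets at most one nudge instead of a drip of nagging emails.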

3. Contextual error messages with solutions

Before: Generic errors like "Something went wrong"
After: Specific errors with fixes: "File too large (12MB). Maximum is 10MB. Here's how to reduce size..."

Impact: Support tickets during onboarding: -47%
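
And a tiny sketch of what a specific, actionable error can look like, using the file-size example above. The 10MB limit comes from that example; the function shape and the suggested fixes are illustrative assumptions.

```typescript
// Sketch only: the 10MB limit comes from the example above; the function shape
// and the suggested fixes are illustrative.
const MAX_UPLOAD_BYTES = 10 * 1024 * 1024; // 10MB

function validateUpload(
  fileName: string,
  sizeBytes: number
): { ok: true } | { ok: false; message: string } {
  if (sizeBytes > MAX_UPLOAD_BYTES) {
    const sizeMb = Math.round(sizeBytes / (1024 * 1024));
    return {
      ok: false,
      // Say what happened, what the limit is, and what to do next.
      message: `"${fileName}" is too large (${sizeMb}MB). Maximum size is 10MB. Try splitting the file or removing unused columns, then upload again.`,
    };
  }
  return { ok: true };
}
```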

4. Integration guides with screenshots

Before: "Connect your account" with no guidance After: Step-by-step guides for each integration with screenshots showing where to find credentials

Impact: Integration connection success rate: 58% → 79%

5. Struggle signal detection and intervention

Before: No help unless users asked
After: Proactive help when we detect struggle signals (repeated failures, long pauses, multiple returns to same screen)

Impact: Activation rate for users showing struggle signals: 34% → 67%

6. Use-case-specific onboarding

Before: Same onboarding for everyone
After: "What are you trying to accomplish?" branching to relevant setup path

Impact: Time-to-activation: 4.2 days → 2.3 days

Combined impact of all changes: Activation rate increased from 47% to 68% over 12 weeks.

How to Run Onboarding Interviews

Here's my process:

Recruit 50 users:

  • 20 who activated successfully (completed your activation trigger)
  • 20 who signed up but never activated
  • 10 who activated slowly (took 7+ days)

Offer incentive:

  • $50 gift card for 30-minute call
  • Or free month of product
  • Or early access to beta features

Schedule interviews:

  • 30 minutes each
  • Record with permission
  • Use video so you can see their screen when they demonstrate

Ask open-ended questions:

  • "Walk me through your first session"
  • "When did you almost quit?"
  • "What didn't work how you expected?"
  • "What would have made it easier?"

Listen for patterns:

  • After 10 interviews, patterns start emerging
  • After 20 interviews, you've heard most of the major themes
  • After 50 interviews, you have deep conviction about what to fix

Synthesize insights:

  • Group feedback into themes
  • Prioritize by frequency and impact
  • Share video clips with product team (way more impactful than written summaries)

Implement changes:

  • Start with quick wins (error message improvements, tooltip clarifications)
  • Then tackle bigger changes (onboarding flow redesigns, new features)
  • Measure impact on activation rate

Repeat quarterly:

  • Onboarding problems evolve as your product and ICP change
  • Regular interviews keep you connected to user reality

The Uncomfortable Truth About Onboarding

Most product teams build onboarding based on assumptions and analytics.

Assumptions: "Users probably want to learn about Feature X first" Analytics: "60% of users drop off at Step 3"

Neither tells you what users are actually experiencing.

Interviews reveal:

  • Why users drop off (analytics just shows that they did)
  • What users were thinking at the moment they almost quit
  • How users describe the product (vs. how you describe it)
  • Which problems users were trying to solve
  • When users realized the product could help them

This context is what drives breakthroughs.

We'd spent months trying to fix our 47% activation rate with analytics-driven changes. We improved messaging, simplified UI, added tutorials.

Nothing worked until we talked to users.

The problems users had weren't the problems we thought they had:

  • We thought onboarding was too complicated. Users thought it required too much upfront work.
  • We thought users didn't understand features. Users understood fine but didn't have their data ready.
  • We thought users needed more education. Users needed less friction.

50 interviews taught me more than 6 months of analytics review.

If your activation rate is stuck, stop looking at dashboards and start talking to users.

The insights you need are in the stories they tell you, not in the numbers.

Kris Carter

Founder, Segment8

Founder & CEO at Segment8. Former PMM leader at Procore (pre/post-IPO) and Featurespace. Spent 15+ years helping SaaS and fintech companies punch above their weight through sharp positioning and GTM strategy.

Ready to level up your GTM strategy?

See how Segment8 helps GTM teams build better go-to-market strategies, launch faster, and drive measurable impact.

Book a Demo