The VP of Product dropped a screenshot into Slack. Our activation rate: 23%. Below industry average. Below our competitors. Below acceptable.
"Why are 77% of signups never getting value from our product?" she asked.
Nobody had an answer. We'd spent months building onboarding tutorials, tooltips, email sequences, and in-app guides. Our onboarding flow looked beautiful. Modern. Professional. And it was failing three out of four users.
I volunteered to figure out why. Six weeks later, I'd doubled our activation rate to 48%. Not through better tutorials or prettier UI, but by fixing something nobody was looking at.
Here's what I learned about activation rates: The problem is never where you think it is.
The Activation Rate Theater
Before I dug into the data, I sat in on three product team meetings where they discussed "the activation problem."
Every conversation followed the same pattern:
Someone would say: "Users don't understand the product. We need better onboarding content."
Someone else would respond: "We have tons of onboarding content. Users aren't engaging with it. We need to make it more interactive."
Then: "Maybe we should add gamification? Badges? Progress bars?"
These weren't stupid suggestions. They were all reasonable hypotheses about why activation was low.
They were also completely wrong.
The team was treating activation like a content problem or an engagement problem. It wasn't. It was a definition problem.
Nobody could tell me what "activated" actually meant for our product.
I asked the VP of Product: "What does an activated user do?"
Her answer: "They complete the onboarding checklist."
I asked the CEO: "What makes someone activated?"
His answer: "When they invite their team."
I asked the Head of Customer Success: "How do you know someone is activated?"
Her answer: "When they create their first project."
Three different definitions of activation. Three different metrics being tracked. None of them correlated with actual retention.
You can't improve an activation rate when you don't know what activation means.
Defining Activation the Right Way
I spent the first week doing something nobody had done: correlating signup behavior with 90-day retention.
I pulled data on 2,000 recent signups and grouped them by what actions they took in their first 7 days. Then I looked at which groups had the highest retention rates after 90 days.
- Users who completed the onboarding checklist: 31% retained at 90 days
- Users who invited their team: 28% retained at 90 days
- Users who created their first project: 41% retained at 90 days
None of those were particularly good. But then I noticed something in the data.
Users who created a project AND added at least 5 data points to it: 73% retained at 90 days.
That was it. That was the real "aha moment."
It wasn't about completing onboarding. It wasn't about inviting teammates. It was about getting real data into the product and seeing it work.
Users who hit that threshold (a project with meaningful data in it) almost always stuck around. Users who fell short almost always churned.
Our activation metric was wrong. We'd been optimizing for the wrong behavior.
The product team had designed onboarding around completing a checklist. But completing a checklist doesn't make you see value. Getting results from real data does.
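None of this required special tooling. Here's a minimal sketch of the analysis in pandas, assuming the event stream has already been flattened to one row per signup with first-week behavior flags and a 90-day retention label (the file and column names are illustrative, not our actual schema):

```python
import pandas as pd

# One row per signup: first-week behavior flags, a count of data
# points added, and a 0/1 label for 90-day retention.
users = pd.read_csv("signups_first_week.csv")

# Candidate activation behaviors, including the compound one.
flags = {
    "completed_checklist":   users["completed_checklist"].astype(bool),
    "invited_team":          users["invited_team"].astype(bool),
    "created_project":       users["created_project"].astype(bool),
    "project_plus_5_points": users["created_project"].astype(bool)
                             & (users["data_points_added"] >= 5),
}

for name, flag in flags.items():
    rate = users.loc[flag, "retained_90d"].mean()
    print(f"{name:<22} {rate:.0%} retained at 90 days (n={flag.sum()})")
```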
Finding the Real Friction Points
Once I knew what activation actually meant (project + 5 data points), I could identify where users were failing to get there.
I filtered for users who signed up but never hit the activation threshold. Then I watched session recordings of their first interactions with the product.
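Before opening the recordings, a quick segmentation of the non-activated group (same hypothetical table as above) shows how far these users were getting:

```python
import pandas as pd

users = pd.read_csv("signups_first_week.csv")  # same illustrative table

activated = (users["created_project"].astype(bool)
             & (users["data_points_added"] >= 5))
failed = users[~activated]

# How far did non-activated users get before giving up? Rows with 5+
# points but no project fall outside the bins and are dropped.
buckets = pd.cut(failed["data_points_added"],
                 bins=[-1, 0, 2, 4], labels=["0", "1-2", "3-4"])
print(buckets.value_counts(normalize=True).sort_index())
```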
What I expected to find: Confusion about features. Difficulty understanding the UI. Users bouncing because the product was too complex.
What I actually found: Users understood the product fine. They knew what it did. They could navigate the UI. They just couldn't get their data into it easily enough.
Our product required users to manually input data one item at a time. For a user to hit the "5 data points" threshold that predicted retention, they had to spend 15-20 minutes on tedious data entry.
Most users tried to enter 1-2 items, realized how long it would take to get meaningful results, and quit.
They weren't confused. They were exhausted.
This was an insight the product team had completely missed because they'd been focused on tutorial clarity and feature education. The onboarding flow explained everything perfectly. Users understood it. They just didn't have the patience to manually enter enough data to see value.
The Fix That Doubled Activation
I brought this finding to the product team with a specific recommendation: Build a CSV import feature and make it the primary onboarding path.
The pushback was immediate.
"CSV import is an advanced feature. New users won't have CSVs ready."
"We can't make the product feel too technical during onboarding."
"What about users who don't have existing data to import?"
I showed them the session recordings. User after user attempting manual data entry, getting frustrated, and abandoning the product.
Then I showed them the retention data: 73% of users who got 5+ data points into the product stuck around. Only 12% of users were currently hitting that threshold because manual entry was too painful.
The business case was simple: If we could get more users to 5+ data points, we'd dramatically improve retention. CSV import was the fastest way to get users there.
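The arithmetic behind that case is worth spelling out. A back-of-the-envelope model: the 73% comes from the retention analysis above, while the 20% retention for non-activated users is an assumption I'm using for illustration, not a measured number.

```python
RETAINED_IF_ACTIVATED = 0.73  # from the retention analysis
RETAINED_IF_NOT = 0.20        # assumed for illustration only

def blended_retention(activation_rate: float) -> float:
    """Overall 90-day retention as a weighted average of the two groups."""
    return (activation_rate * RETAINED_IF_ACTIVATED
            + (1 - activation_rate) * RETAINED_IF_NOT)

for rate in (0.12, 0.30, 0.48):
    print(f"activation {rate:.0%} -> projected retention {blended_retention(rate):.0%}")
```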
The product team built it in two weeks. We redesigned the onboarding flow to ask users: "Want to import existing data or start fresh?"
Users who chose import could upload a CSV and have a fully populated project in 30 seconds instead of 20 minutes.
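The import itself didn't need to be clever to remove the friction. A minimal sketch of the core parsing step using Python's standard csv module; the two-column format and DataPoint shape are hypothetical stand-ins for the real schema:

```python
import csv
from dataclasses import dataclass

@dataclass
class DataPoint:
    name: str
    value: float

def import_data_points(path: str) -> list[DataPoint]:
    """Parse a two-column CSV (name, value) into data points,
    skipping malformed rows instead of failing the whole upload."""
    points: list[DataPoint] = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            try:
                points.append(DataPoint(row["name"].strip(), float(row["value"])))
            except (KeyError, ValueError, AttributeError):
                continue  # tolerate bad rows rather than aborting
    return points
```

A production import flow also needs column mapping and a preview step, but the core really is this small.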
Results after 4 weeks:
- Activation rate (users hitting 5+ data points): 23% → 48%
- 90-day retention: 32% → 54%
- Time to activation: Average 3.2 days → 0.8 days
- Support tickets about "getting started": Down 40%
Same product. Same core features. Just a different path to value.
What This Reveals About Activation
The activation rate problem taught me something uncomfortable about product onboarding: Most companies are optimizing the wrong metrics.
We'd been measuring "onboarding completion" because it was easy to track. But onboarding completion didn't predict retention. It was a vanity metric.
The real activation metric—the behavior that predicted long-term retention—was buried in usage data nobody was looking at.
Here's what I now know about finding your true activation metric:
Start with retention, work backwards. Don't define activation based on what you think matters. Pull retention data and find the early behaviors that correlate with users who stick around.
Look for the first value moment, not the first feature usage. Users don't get activated when they use a feature. They get activated when they get a result they care about.
Measure activation as a combination of actions, not a single event. Our real activation metric wasn't "created a project" (too easy on its own, and only weakly tied to retention) or "completed all onboarding steps" (too arbitrary). It was "created a project AND added meaningful data to it."
Friction isn't always a UI problem. Our onboarding tutorials were clear. Our UI was intuitive. But we still had massive friction because we made value too hard to reach.
The Uncomfortable Pattern I Keep Seeing
Since fixing our activation rate, I've consulted with a dozen other SaaS companies on their activation problems.
The pattern is always the same:
The company has low activation. They blame onboarding clarity. They build more tutorials, tooltips, videos, and guides. Activation stays low. They don't understand why.
Then I ask: "What behavior correlates most with retention?" and they don't know.
They're trying to improve activation without knowing what activation actually means.
Here's the harsh truth: If you can't tell me the specific user behaviors that predict 90-day retention, you can't fix your activation rate. You're just guessing.
Most product teams define activation based on what they built, not based on what predicts retention:
- "Activation is when users complete our onboarding flow" (because we built an onboarding flow)
- "Activation is when users try Feature X" (because we just launched Feature X)
- "Activation is when users invite teammates" (because we have viral growth goals)
None of those matter if they don't predict retention.
The only definition of activation that matters is: "The minimum set of behaviors that predict a user will get long-term value from this product."
Everything else is a vanity metric.
How to Find Your Real Activation Metric
Here's the process I now use with every product:
Pull 90-day retention data for recent signups. You need enough data to see patterns: 500-1,000 users who signed up 90+ days ago is a reasonable minimum.
Identify users who retained vs. churned. Retained = still using the product after 90 days. Churned = stopped using it.
Look at first-week behavior for both groups. What did retained users do in their first 7 days that churned users didn't?
Find the behavior with the strongest correlation. This is usually a combination of actions, not a single event. For us, it was "created project + added 5+ data points." For another product I worked with, it was "connected an integration + sent their first automated message."
Validate with cohort analysis. Check whether users who do this behavior in different time periods show the same retention lift. If it's consistent across cohorts, you've found your real activation metric.
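This validation step is the one teams most often skip, and it's cheap. A sketch of the check, reusing the hypothetical columns from earlier: bucket signups by month, then compare retention with and without the candidate behavior inside each cohort.

```python
import pandas as pd

users = pd.read_csv("signups_first_week.csv", parse_dates=["signup_date"])
users["cohort"] = users["signup_date"].dt.to_period("M")
users["candidate"] = (users["created_project"].astype(bool)
                      & (users["data_points_added"] >= 5))

# 90-day retention with vs. without the candidate behavior,
# per monthly signup cohort.
lift = users.pivot_table(index="cohort", columns="candidate",
                         values="retained_90d", aggfunc="mean")
lift["lift"] = lift[True] - lift[False]
print(lift)
```

If the lift column is large and roughly stable across cohorts, you've found a real signal rather than an artifact of one time period.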
Now optimize your onboarding to drive that behavior. Not "complete these steps." Not "watch this tutorial." Drive the specific behavior that predicts retention.
This process takes about a week if you have decent analytics infrastructure. It's the most valuable week you'll spend on product adoption.
What Changed After We Fixed Activation
Doubling our activation rate didn't just improve retention—it changed how the whole company thought about product development.
Product team: Stopped building onboarding features based on gut feel. Started every new feature discussion by asking: "Does this help users reach activation faster?"
Customer success: Shifted focus from "help users complete onboarding" to "help users get 5 data points into their first project." Support tickets became consultative: "What data sources do you have? Let's get those imported."
Marketing: Changed trial messaging from "Start your free trial" to "Import your data and see results in 60 seconds." Conversion rate on trial signups increased 18% because we were setting clearer expectations about what value looked like.
Sales: Used activation data to identify which trial users needed help. Anyone who signed up but didn't hit activation within 48 hours got a personalized outreach: "Want help importing your data?" (A simple version of that trigger is sketched below.)
Having a clear, retention-validated activation metric gave every team a shared definition of success.
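The sales trigger above is the kind of thing a validated metric makes nearly free to build. A minimal sketch, assuming each user record carries a signup timestamp and an activation flag; the 48-hour window is from the example above, and everything else is hypothetical:

```python
from datetime import datetime, timedelta, timezone

WINDOW = timedelta(hours=48)

def needs_outreach(user: dict, now: datetime) -> bool:
    """Signed up more than 48 hours ago and still hasn't activated."""
    return not user["activated"] and now - user["signed_up"] > WINDOW

# In production this would be a scheduled job over the user database;
# the inline record is illustrative.
users = [
    {"email": "new@example.com",
     "signed_up": datetime.now(timezone.utc) - timedelta(hours=60),
     "activated": False},
]
now = datetime.now(timezone.utc)
for user in (u for u in users if needs_outreach(u, now)):
    print(f"queue outreach to {user['email']}: 'Want help importing your data?'")
```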
The Mistake Most Teams Make With Activation
The biggest mistake I see teams make with activation isn't having a low rate—it's not knowing why the rate is low.
They look at the top-level number (23% activated) and start brainstorming solutions without understanding the problem.
"Let's improve our tooltips." "Let's add video tutorials." "Let's build an interactive product tour."
All of those might be good ideas. But if you don't know why users aren't activating, you're guessing.
Maybe users aren't activating because they don't understand the product. Maybe they understand it but can't get their data into it. Maybe they're outside your ICP (ideal customer profile) and will never get value no matter how good your onboarding is.
You won't know until you dig into the data.
The teams that fix activation rates:
- Define activation based on retention correlation, not feature usage
- Watch session recordings of users who failed to activate
- Identify the real friction point (it's rarely what you expect)
- Build solutions that remove that friction, not generic "better onboarding"
- Measure impact on the metric that actually matters (retention)
The teams that stay stuck:
- Define activation based on completing onboarding steps
- Optimize for engagement metrics (checklist completion, tutorial views)
- Build more content instead of removing friction
- Celebrate "improved onboarding completion" while activation and retention stay flat
I was on the second team until I forced myself to look at retention data. Now I won't touch an activation problem without starting there.
If your activation rate is below 40%, you probably have the wrong activation definition or you haven't identified the real friction point.
Pull your retention data. Find the behavior that predicts retention. Watch users who fail to get there. Fix the friction you find, not the friction you assume exists.
That's how you double activation rates. Everything else is theater.