Customer Interview Guide: Jobs-to-be-Done Without the BS

I spent two years asking customers what features they wanted. I'd interview them, they'd give me lists of capabilities they wished we had, and I'd dutifully bring those lists to product.

Product would build some of those features. Customers would use them for a week, then go back to their old workflows.

I couldn't figure out why. We were building exactly what customers asked for. Why wasn't it solving their problems?

Then I sat in on a customer interview run by a researcher who understood Jobs-to-be-Done. She didn't ask the customer what features they wanted. She asked them to describe the last time they tried to accomplish a specific task.

The customer spent twenty minutes walking through their workflow. They mentioned our product twice. They spent most of the time talking about spreadsheets, Slack messages, manual data entry, and workarounds they'd built.

At the end, the researcher asked: "If you could wave a magic wand and make one part of that process disappear, what would it be?"

The customer said: "The two hours I spend every Monday reconciling data from three different systems before I can even start my analysis."

That wasn't a feature request. That was a job: "Help me trust my data is accurate without spending hours manually checking it."

We had built features to help them analyze data faster. We'd never built anything to help them trust their data. That's why our features weren't getting adopted—we were solving the wrong job.

That interview changed how I do customer research. I stopped asking what customers wanted and started asking what job they were trying to do.

The Problem With Feature-Based Interviews

Most customer interviews follow the same pattern:

Interviewer: "What features would you like to see in the product?"

Customer: "It would be great if you had [specific capability]."

Interviewer: "What would that help you do?"

Customer: "It would make [task] easier."

This seems like useful research. You're gathering feature requests grounded in customer needs.

But here's what's actually happening: you're asking customers to design solutions before understanding the real problem. And customers are terrible at designing solutions.

They don't understand the constraints you're working within. They don't see your technical architecture. They don't know what's easy versus hard to build. They're just describing a better version of their current solution.

I interviewed a customer who asked for "automated report generation." We built it. It took three months. When we shipped it, they barely used it.

I followed up: "We built the automated reports you requested. Why aren't you using them?"

They said: "Oh, we figured out a workaround using our BI tool. It's good enough."

We'd wasted three months building something the customer didn't actually need because we asked the wrong question.

If I'd asked about the job they were trying to do—"Help me get insights to my stakeholders without manual work"—I would have discovered their BI tool could do that. We could have built an integration instead of a whole new feature.

The job was real. Their proposed solution wasn't the right one.

What Jobs-to-be-Done Actually Means

Jobs-to-be-Done gets treated like a complex framework with specific terminology and rigid methodology. That's not helpful.

Here's the simple version: people don't want products. They want to make progress in their lives. They "hire" products to help them make that progress.

When you understand what job they're hiring your product to do, you can build things that actually help them make progress instead of building features they think they want.

The classic example: people don't want a quarter-inch drill. They want a quarter-inch hole. Understanding the job (make a hole) helps you build better drills. Or maybe something that isn't a drill at all.

In B2B SaaS, this shows up constantly:

Customers say they want "better dashboards." The job they're actually trying to do: "Look smart in exec meetings by having answers to questions about my metrics."

Customers say they want "more integrations." The job: "Avoid having to log into five different systems to do my work."

Customers say they want "advanced analytics." The job: "Find the actionable insight in my data without needing a data science degree."

When you understand the job, you realize better dashboards aren't the answer—alerts when metrics change are. More integrations aren't the answer—a unified workflow view is. Advanced analytics aren't the answer—automated insight detection is.

The feature request points to the job, but it's not the solution to the job.

The Interview Format That Reveals Jobs

I don't use a rigid JTBD interview script. I use a loose framework that lets the conversation flow naturally while making sure I uncover the actual job.

Here's how it works:

Start With a Specific Instance

Never ask: "How do you generally use our product?"

Always ask: "Tell me about the last time you used our product. Walk me through exactly what you were trying to do."

Generic questions get generic answers. Specific questions trigger detailed memories.

When I ask about a specific instance, customers tell me stories: "Last Tuesday, I was preparing for the board meeting and realized our revenue data didn't match what finance was reporting. I spent three hours tracking down the discrepancy..."

That story reveals the job: "Make sure my numbers are defensible before I present to leadership."

If I'd asked generically, "How do you use our product for reporting?" I'd get: "I pull reports for board meetings."

That doesn't tell me anything about the job.

Map the Timeline Before and After

Once they've described the specific instance, I map the entire timeline—not just what happened when they used the product, but everything before and after.

"What were you doing right before you opened our product?"

This reveals the trigger—what created the need in the first place.

"After you used our product, what did you do with the output?"

This reveals the goal—what they were ultimately trying to accomplish.

Customers often use your product as one step in a longer workflow. Understanding the full workflow shows you the real job.

I interviewed a customer about how they used our analytics platform. They described pulling a report, exporting it to Excel, reformatting it, copying it into PowerPoint, and presenting it to their team.

The job wasn't "analyze data." The job was "communicate insights to my team in a format they'll actually look at."

We'd built a sophisticated analytics platform. What they actually needed was presentation-ready outputs. We added a "create slide" button that formatted data for PowerPoint automatically. Usage of that feature exceeded usage of our advanced analytics within two months.

Ask About the Struggle

The most revealing question in any JTBD interview is about the struggle—the parts that were hard, frustrating, or time-consuming.

"What was the hardest part of that process?"

"What took longer than you wanted?"

"What almost made you give up?"

The struggle reveals where the job is underserved. That's your opportunity.

I interviewed a customer about how they onboarded new team members to our product. They described creating custom training materials, scheduling multiple training sessions, and spending hours answering questions.

The struggle: "It takes me two weeks to get someone productive with the tool, and I'm doing this every time we hire."

The job: "Get new team members productive fast without creating work for me."

We built an interactive onboarding flow and a certification program. New user time-to-productivity dropped from two weeks to three days. The customer became a reference account.

If I'd asked "what features would help with onboarding?" they probably would have said "better documentation." That wouldn't have solved the real job.

Understand What They Replaced

One of the best ways to understand the job is to ask what they were doing before they started using your product.

"Before you used our product, how were you accomplishing this?"

"What made you decide the old way wasn't working anymore?"

This reveals the job with crystal clarity because it shows you what progress they were trying to make before you existed.

I interviewed a customer who'd switched from a competitor to our product. I asked what they were using before.

They said: "We were using [competitor], but it required our data team to set up every new report. That created a two-week backlog. We switched to you because our business users could create their own reports."

The job: "Get answers to business questions without waiting for engineering."

That insight changed our positioning. We stopped competing on features and started competing on "self-service analytics that doesn't require a data team."

Win rates against that competitor increased immediately.

The Questions I Actually Ask

Here's my loose script for JTBD interviews. I don't follow it rigidly, but I make sure to cover these themes:

Opening: "Tell me about the last time you used our product. Walk me through exactly what you were trying to accomplish."

Timeline before: "What were you doing right before that? What made you decide you needed to do this?"

The process: "Walk me through what you did, step by step. Where did you start? What happened next?"

The struggle: "What was the hardest part? What took longer than you expected? What was frustrating?"

Timeline after: "After you used our product, what did you do with the output? Who did you share it with? What happened next?"

The alternative: "Before you used our product, how were you doing this? What made you change?"

The ideal: "If you could wave a magic wand and make one part of that process disappear, what would it be?"

The context: "How often do you have to do this? Who else is involved? What makes it urgent or important?"

That's it. Eight questions, but I usually only ask five or six depending on where the conversation goes.

The point isn't to check boxes—it's to understand the job.

The Analysis That Actually Matters

After you've done 10-15 JTBD interviews, you need to synthesize what you learned. Most people make this too complicated.

Here's what I do:

List the Jobs

I go through my notes and extract every job statement. A job statement follows this format:

"Help me [accomplish something] so I can [make progress toward a goal]"

Examples from real interviews:

  • "Help me reconcile data from multiple sources so I can trust my analysis is accurate"
  • "Help me spot unusual patterns in my metrics so I can catch problems before they escalate"
  • "Help me explain complex data to non-technical stakeholders so I can get buy-in for decisions"
  • "Help me create reports faster so I can spend time analyzing instead of formatting"

I usually find 15-25 distinct jobs across 15 interviews.
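When I'm extracting these, it helps to capture each statement as a structured record rather than a loose sentence, so the frequency counts later are trivial. Here's a minimal sketch in Python; the field names and example entries are my own illustration, not official JTBD terminology:

```python
from dataclasses import dataclass

@dataclass
class JobStatement:
    """One job from one interview, in the
    'Help me [outcome] so I can [progress]' format."""
    outcome: str       # what the customer is trying to accomplish
    progress: str      # the progress that outcome unlocks
    interview_id: str  # which interview it came from
    struggle: int      # 1 (mild annoyance) to 5 (hours of workarounds)

jobs = [
    JobStatement(
        outcome="reconcile data from multiple sources",
        progress="trust my analysis is accurate",
        interview_id="interview-03",
        struggle=5,
    ),
    JobStatement(
        outcome="explain complex data to non-technical stakeholders",
        progress="get buy-in for decisions",
        interview_id="interview-07",
        struggle=3,
    ),
]
```

A spreadsheet works just as well. The point is separating the outcome from the progress it unlocks, because the grouping step that follows operates on outcomes while positioning work operates on progress.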

Group by Theme

Some jobs are variations on the same underlying need. I group related jobs together.

All of these are variations on the same job:

  • "Help me reconcile data from multiple sources"
  • "Help me verify my data is correct"
  • "Help me trust my analysis"

The underlying job: "Give me confidence my data is accurate."

After grouping, I usually have 5-8 job themes.

Prioritize by Frequency and Struggle

Not all jobs are equal. I prioritize based on:

How many customers mentioned this job? If 12 of 15 customers talked about data accuracy, that's important.

How much struggle is associated with this job? If customers are spending hours on workarounds, that's a bigger opportunity than something that's mildly annoying.

How well is this job currently served? If customers have good solutions already, there's less opportunity.

The highest-priority jobs are the ones that many customers mention, that involve significant struggle, and that current solutions serve poorly.
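If you want to make that prioritization explicit instead of eyeballing it, a rough scoring pass over your job themes works. This is a minimal sketch with made-up themes, counts, and weighting; it's one way to combine the three criteria, not a formula from the JTBD literature:

```python
# Rough prioritization: frequency x struggle, discounted by how well
# existing solutions already serve the job. All numbers below are
# illustrative, not from real interviews.

themes = [
    # (theme, customers_mentioning, avg_struggle_1to5, how_well_served_1to5)
    ("confidence my data is accurate", 12, 5, 2),
    ("communicate insights to my team", 8, 3, 3),
    ("get answers without waiting on engineering", 6, 4, 4),
]

def opportunity(mentions: int, struggle: int, served: int) -> int:
    """Higher score = bigger opportunity: common, painful, underserved."""
    return mentions * struggle * (6 - served)

for theme, mentions, struggle, served in sorted(
    themes, key=lambda t: opportunity(t[1], t[2], t[3]), reverse=True
):
    print(f"{opportunity(mentions, struggle, served):>4d}  {theme}")
```

The exact weighting matters far less than forcing yourself to score all three dimensions for every theme, so a loudly requested job that's already well served doesn't jump the queue.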

Map to Product Strategy

Finally, I map the high-priority jobs to our product. For each job:

What do we currently offer that serves this job? (Often nothing, or something partial)

What could we build to better serve this job?

What would change about our positioning if we led with this job?

This becomes the input to product roadmap and GTM strategy discussions.

The Difference Between JTBD and Feature Requests

Here's the practical difference:

Feature request thinking: "We need automated report scheduling because customers asked for it."

JTBD thinking: "Customers are trying to get insights to stakeholders regularly without manual work. Automated scheduling is one solution. We could also build Slack alerts, email digests, or dashboard embeds. Which best serves the job?"

JTBD expands your solution space. Feature requests constrain it.

I interviewed a customer who asked for "multi-user collaboration" in our product. That seemed straightforward—build real-time co-editing.

When I dug into the job, I discovered they weren't trying to work on documents simultaneously. They were trying to "make sure everyone on the team is looking at the same version of the truth."

The real problem: they were sharing screenshots and CSVs via email, and people were making decisions based on outdated data.

Building real-time co-editing would have taken six months. Instead, we built shareable links that always showed live data. Took two weeks. Completely solved the job.

If I'd just accepted the feature request, we'd have built the wrong thing.

Why This Is Harder Than It Looks

JTBD interviews sound simple: ask about specific instances, understand the job, build for the job instead of the feature request.

In practice, it's hard because:

Customers don't naturally talk about jobs. They talk about solutions. You have to redirect them constantly from "I want feature X" to "show me what you were trying to accomplish."

The job isn't always obvious. Sometimes you interview someone and come away still not understanding what job they're trying to do. That's normal. It means you need to ask better follow-up questions or interview more people.

Jobs conflict. One customer's job is "move fast without approval processes." Another customer's job is "make sure changes go through proper review." You can't serve both jobs with the same solution.

Jobs evolve. The job customers needed you for two years ago might not be the job they need you for today. You have to keep interviewing.

The first time I tried to run JTBD interviews, I did it wrong. I asked about jobs, but I still approached it like feature discovery. I was looking for validation of what we wanted to build instead of genuine understanding.

It took me probably 30 interviews before I stopped leading customers toward answers I wanted and started actually listening to what they were telling me.

The Uncomfortable Truth

The hardest part of JTBD research is accepting what it reveals.

Sometimes you discover customers are hiring your product for a job you didn't intend to serve. That means your positioning is wrong.

Sometimes you discover the job you thought you were serving isn't actually that important. That means your product strategy is wrong.

Sometimes you discover customers have found better ways to do the job without your product. That means you're not actually necessary.

All of these truths are uncomfortable. They require changing direction, admitting mistakes, or rethinking strategy.

But they're better than building features nobody needs because you asked the wrong questions.

I've wasted months building things customers requested that didn't serve real jobs. I've seen products fail because they were solving jobs that didn't matter.

The companies that win are the ones willing to genuinely understand what job customers are hiring them for, even when that job isn't what they expected.

That requires asking the right questions and actually listening to the answers.

Most companies aren't willing to do that. They'd rather ask what features customers want and build what they say.

That's easier. But it doesn't work.