I pitched a win/loss program to leadership three times. The first two times, I got the same response: "Great idea, but we don't have budget for a vendor right now. Let's revisit next quarter."
The third time, I stopped asking for permission and just started doing it.
I spent 30 days building a win/loss program that cost nothing except my time. I reached out to eight recently closed accounts and interviewed six of them. I presented the insights at the next QBR. The VP of Sales asked why we weren't doing this for every deal.
I said, "We are now."
That was four years ago. That program is still running. It's influenced pricing changes, product roadmap decisions, and sales playbook updates. It's generated more strategic value than any vendor program could have.
The reason most win/loss programs never launch isn't lack of budget or lack of executive support—it's that PMMs wait for perfect conditions instead of just starting. They think they need a vendor, a formal process, executive sponsorship, and dedicated headcount.
You don't. You need 30 days and a willingness to do eight interviews yourself.
Here's exactly what I did, and what you should do if you want a win/loss program running by the end of next month.
Why Most Win/Loss Programs Never Start
I've watched PMMs try to launch win/loss programs at five different companies. Most fail before they conduct a single interview. Not because win/loss analysis doesn't work—because they spend six months trying to get organizational buy-in instead of three weeks generating insights.
The typical failed approach looks like this:
Month 1: PMM pitches win/loss program to leadership. Leadership says yes in principle, asks for a proposal with budget requirements and ROI projections.
Months 2-3: PMM researches vendors, gets quotes, builds business case. Finance asks for more detail on expected outcomes. Sales asks how this won't create more work for them.
Month 4: PMM presents formal proposal. Leadership approves in concept but says budget isn't available until next fiscal year. PMM is told to revisit in Q4.
Months 5-12: Nothing happens.
I did this twice. Both times, the program died before it started.
The third time, I tried a different approach: I stopped asking permission and started generating results.
I picked eight closed deals from the last 60 days—four wins, four losses. I reached out to the decision-makers directly. I asked for 30 minutes to understand their buying process. Six of them said yes.
I did the interviews myself. I took notes. I looked for patterns. I found three insights that were immediately actionable:
- We were losing enterprise deals because our security documentation was outdated and hard to find. Prospects couldn't get answers to their security team's questions fast enough.
- We were winning mid-market deals when our sales engineer customized the demo with customer data. We were losing when we showed generic demos.
- Our pricing page confused prospects because we had three tiers but most customers couldn't tell which tier they needed.
I shared these insights in Slack with specific recommendations: update security docs, require custom demos for enterprise deals, add a tier selector to the pricing page.
Product updated the security docs in one week. Sales leadership made custom demos mandatory for enterprise opportunities. Marketing added a simple quiz to the pricing page.
Three changes. All happened within two weeks of sharing the insights. All driven by six conversations.
At the next QBR, the VP of Sales asked me to present the win/loss findings. I did. At the end, he asked: "How often are you doing these interviews?"
I said, "I can do 8-10 per month if we make it part of the standard deal close process."
He said, "Let's do that."
That's how the program became official. Not by asking permission upfront, but by proving value first.
The 30-Day Launch Plan That Actually Works
You don't need executive sponsorship to start. You need to complete eight interviews and share insights that drive action. Once you prove value, the organizational support materializes.
Here's exactly what I did in 30 days.
Week 1: Identify the Deals and Build the List
I didn't start by building a perfect process. I started by picking deals I could learn from.
I pulled a list of every closed opportunity from the last 60 days—won and lost. I filtered it down to deals that met three criteria:
Competitive: The deal included at least one competitor. These deals reveal how customers actually compare you to alternatives.
Significant: The deal was worth at least our average deal value. Small deals often close on price or convenience; bigger deals reveal strategic decision criteria.
Accessible: I could find the decision-maker's contact information. If I couldn't reach them, they couldn't be on the list.
This gave me 23 potential deals. I picked eight: four wins, four losses. I balanced them across segments (two enterprise, four mid-market, two startup) and competitors (three losses to Competitor A, one to Competitor B).
I wasn't trying to build a statistically significant sample. I was trying to find patterns in real decisions.
The whole thing took me three hours. Pull CRM data, review deal notes, identify decision-makers, build the list.
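If your closed-deal export lands in a spreadsheet, that filtering pass is easy to script. Here's a minimal sketch, assuming a CSV export named closed_deals.csv with columns like opportunity, stage, amount, competitor, and contact_email; the file name, column names, and thresholds are illustrative, not tied to any particular CRM:

```python
# Minimal sketch: narrow a CRM export down to interview candidates.
# Assumes closed_deals.csv with columns: opportunity, stage, amount, competitor, contact_email.
# Column names and the four-win / four-loss split are illustrative; adjust to your own export.
import pandas as pd

deals = pd.read_csv("closed_deals.csv")
avg_deal = deals["amount"].mean()

candidates = deals[
    deals["stage"].isin(["Closed Won", "Closed Lost"])  # closed in the window you pulled
    & deals["competitor"].notna()                        # competitive: a named competitor was in the deal
    & (deals["amount"] >= avg_deal)                      # significant: at least average deal value
    & deals["contact_email"].notna()                     # accessible: we can reach the decision-maker
]

# Balance the shortlist: four wins, four losses, biggest deals first.
shortlist = pd.concat([
    candidates[candidates["stage"] == "Closed Won"].nlargest(4, "amount"),
    candidates[candidates["stage"] == "Closed Lost"].nlargest(4, "amount"),
])
print(shortlist[["opportunity", "stage", "amount", "competitor"]])
```

Whether you script it or eyeball it in a spreadsheet, the selection step should take hours, not weeks.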
Week 2: Reach Out and Schedule
This is where most people get stuck. They overthink the outreach. They draft elaborate emails explaining the program and why participation matters.
I kept it simple. I sent this email to all eight decision-makers:
Subject: Quick question about your [product category] decision
Body: Hi [Name],
I saw you recently [chose us / decided to go with Competitor] for [use case]. I'm doing research on how companies evaluate solutions in this space, and I'd love to understand your decision process.
Would you be open to a 30-minute call to walk me through how you made the decision? I'm not trying to sell you anything or change your mind—I genuinely want to understand what mattered most in your evaluation.
I'm happy to share what we're learning from these conversations if it's helpful.
Let me know what works for your calendar.
Thanks, [Your name]
That's it. No corporate jargon. No explanation of "win/loss programs." Just a simple request to understand their decision.
Six of the eight responded within three days. Five said yes immediately. One asked for more details; I explained, and they said yes. Two never responded.
I sent calendar invites for the following week. All six scheduled.
The key insight: don't oversell it. Most people are happy to talk about their decision process if you're genuinely curious and not trying to sell them something.
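If hand-editing eight copies of that email feels error-prone, a few lines of script can fill in the blanks for you. A quick sketch, assuming a hypothetical contacts.csv with name, email, outcome, and use_case columns:

```python
# Minimal sketch: fill the outreach template for each contact so names and use cases
# never get crossed. Assumes contacts.csv with columns: name, email, outcome, use_case
# (the file name and columns are hypothetical).
import csv

TEMPLATE = """Subject: Quick question about your {category} decision

Hi {name},

I saw you recently {outcome} for {use_case}. I'm doing research on how companies
evaluate solutions in this space, and I'd love to understand your decision process.

Would you be open to a 30-minute call to walk me through how you made the decision?
I'm not trying to sell you anything or change your mind. I genuinely want to
understand what mattered most in your evaluation.

I'm happy to share what we're learning from these conversations if it's helpful.

Let me know what works for your calendar.

Thanks,
{sender}
"""

with open("contacts.csv", newline="") as f:
    for row in csv.DictReader(f):
        draft = TEMPLATE.format(
            category="data platform",   # replace with your product category
            name=row["name"],
            outcome=row["outcome"],     # e.g. "chose us" or "went with Competitor A"
            use_case=row["use_case"],
            sender="Your Name",
        )
        print(f"--- Draft for {row['email']} ---\n{draft}")
```

Paste the drafts into your email client rather than automating the send; these should read like a person wrote them.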
Week 3: Conduct the Interviews
I interviewed all six customers in one week. Each interview was 30 minutes. I recorded them (with permission) using Zoom.
I didn't use a rigid script. I had three questions I wanted to explore:
- What triggered the decision to evaluate solutions?
- How did they compare options?
- What made them choose their final vendor?
But I let the conversations flow naturally. I asked follow-up questions. I explored interesting threads. I listened more than I talked.
The hardest part wasn't the interviews—it was resisting the urge to take notes frantically. I learned this the hard way in my first few win/loss interviews. When you're typing constantly, you're not listening fully.
Instead, I just recorded the conversations and wrote down only the key themes as they emerged. After each interview, I spent 15 minutes listening to the recording and documenting the main insights.
By Friday, I had completed all six interviews and had pages of raw notes.
Week 4: Synthesize and Share
This is where most win/loss programs fail. PMMs spend weeks analyzing data, building comprehensive reports, and preparing formal presentations.
I spent three hours looking for patterns and one hour creating a summary.
I reviewed all my notes and looked for themes that appeared in multiple interviews. I didn't do statistical analysis. I just tallied how often each theme came up, looking for anything I could honestly describe as "this came up in four of six conversations."
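If you tag each interview's notes with a short list of themes as you go, that tally takes a minute. A sketch, with made-up interview labels and themes:

```python
# Minimal sketch: count how often each theme shows up across interviews.
# Interview labels and themes below are invented for illustration.
from collections import Counter

interview_themes = {
    "win_1":  ["custom demo", "pricing tier confusion"],
    "win_2":  ["custom demo", "onboarding speed"],
    "win_3":  ["custom demo"],
    "loss_1": ["security docs hard to find", "pricing tier confusion"],
    "loss_2": ["security docs hard to find", "missing integration"],
    "loss_3": ["security docs hard to find", "pricing tier confusion"],
}

counts = Counter(theme for themes in interview_themes.values() for theme in themes)
total = len(interview_themes)

for theme, n in counts.most_common():
    print(f"{theme}: {n} of {total} interviews")
```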
Three patterns jumped out:
Pattern 1: Customers who chose competitors mentioned our security documentation was hard to navigate. Appeared in three of four losses.
Pattern 2: Customers who chose us mentioned our sales engineer's custom demo. Appeared in three of four wins.
Pattern 3: Customers across both wins and losses mentioned confusion about pricing tiers. Appeared in five of six total interviews.
I turned these into a one-page summary with three sections:
What we learned: [Pattern + quote from customer]
Why it matters: [Business impact]
What we should do: [Specific recommendation]
I didn't make it pretty. I just made it clear.
I shared it in three places:
Sales team: Posted in the #sales Slack channel with a note: "Interviewed 6 customers from last month's closed deals. Three quick insights that could help with competitive deals."
Product: Sent the security documentation insight directly to the product manager responsible for onboarding and offered to share the full interview notes.
Sales leadership: Emailed the VP of Sales with the summary and asked if he wanted to discuss it at the next leadership meeting.
All three groups responded within 24 hours. Product asked for the full interview notes. Sales leadership asked me to present at the next QBR. The sales team started referencing the insights in deal reviews.
That's when I knew the program would stick.
The Stakeholder Resistance You'll Face
Even when you prove value fast, you'll hit resistance. Here's what I encountered and how I handled it.
"We don't have time to facilitate customer intros"
Sales ops initially pushed back on the idea of systemizing win/loss interviews. They said sales reps were already overwhelmed and wouldn't have time to introduce me to customers.
I didn't argue. I just kept doing the outreach myself. After three months of running the program independently, a sales manager asked me to interview a customer they'd just lost to understand what went wrong.
I did the interview. The insights revealed the competitor had a specific feature we didn't know they'd launched. We lost because we didn't address it in our pitch.
The sales manager asked, "How do we make sure you're talking to all our lost customers?"
I said, "Build it into your close process. When a deal closes—win or loss—intro me to the decision-maker."
They agreed. Within two months, sales reps were proactively introducing me to customers because they wanted the insights.
You can't mandate this from the top. You have to prove value first, then sales will pull you into deals.
"This should be done by a third party for objectivity"
Product leadership initially questioned whether customers would be honest with me since I work for the company. They suggested we needed a third-party vendor to get unbiased insights.
I addressed this by showing them the feedback I was getting. I'd interviewed a customer who chose a competitor and they'd been brutally honest about what we did wrong. I shared the recording.
The product VP listened to 10 minutes of the customer explaining our weak points and said, "Okay, they're definitely being honest."
The objectivity concern disappears when you show stakeholders what customers are actually saying.
"How do we know this is statistically significant?"
Finance asked this when I first presented insights. They wanted to know if six interviews was a large enough sample size to make product decisions.
I explained that win/loss isn't academic research—it's qualitative insight generation. We're not trying to prove causation. We're trying to understand decision factors we didn't know existed.
Then I pointed to the specific changes we'd already made based on the insights and asked: "Which of those three changes do you think we shouldn't have made?"
They couldn't argue with that.
What Makes This Approach Work When Others Fail
I've seen PMMs try to launch win/loss programs with vendor support, formal processes, and executive sponsorship. Most of those programs die after six months because they're too complex and don't generate insights fast enough.
The 30-day approach works because:
You prove value before asking for resources. Instead of pitching a program and waiting for approval, you generate insights and share them. Once stakeholders see the value, they'll ask you to formalize it.
You stay lightweight. No vendors, no tools, no formal process. Just interviews, notes, and insights. You can start tomorrow without budget approval.
You focus on actionable insights, not comprehensive analysis. You're not trying to understand every factor in every decision. You're trying to find patterns that drive specific actions.
You share insights immediately. Don't wait for the perfect report. Share what you learn as you learn it. Speed matters more than polish.
I've launched this exact program at three companies. It's worked every time.
At the first company, I was a solo PMM with no budget and skeptical sales leadership. Six months later, we had a formal win/loss process with dedicated headcount.
At the second company, I inherited a failed vendor-run win/loss program that nobody used. I scrapped it and started over with the 30-day approach. Within 90 days, we had more actionable insights than the vendor program had generated in two years.
At the third company, I was brought in specifically to fix their win/loss program. I used the exact approach outlined here. It became the highest-value program the PMM team ran.
Just Start
The biggest mistake PMMs make with win/loss programs is waiting for perfect conditions. They wait for budget, for executive buy-in, for formal process approval, for the right vendor.
Perfect conditions never arrive.
The best win/loss programs start scrappy. One PMM, eight interviews, 30 days. No budget, no fancy tools, no formal process. Just conversations and insights.
Once you prove it works, everything else follows. Sales wants more insights. Product wants to be involved. Executives ask you to present. Finance approves budget for tools.
But none of that happens until you conduct the first interview.
You can start today. Pull last month's closed deals from your CRM. Pick eight. Email the decision-makers. See who responds.
Thirty days from now, you'll have insights that change how your company sells, prices, and builds product.
Or you can wait for budget approval and still be waiting six months from now.