The CEO pulled me aside after our quarterly all-hands. "I need to understand PMM's impact on pipeline. Can you show me what you're contributing?"
This wasn't a casual question. We were in budget planning season. Every department was fighting for headcount. Marketing wanted to hire two demand gen people. Sales wanted four more AEs. I wanted to hire a second PMM.
The CEO was trying to decide where to invest. And he was asking me to prove PMM deserved that investment.
I had three weeks before budget decisions were finalized.
The problem: I didn't have a clean answer. I knew PMM influenced pipeline—sales told me all the time that our content helped close deals, that our battle cards were used in competitive situations, that our product launches generated interest. But "sales told me" wasn't going to convince a CEO looking at spreadsheets.
I needed data. Real numbers showing PMM's connection to pipeline creation and progression. Numbers that would hold up when the CFO asked questions.
I had three weeks to build a case I'd never built before, using Salesforce data I barely understood, for an executive audience that would scrutinize every assumption.
Here's how I proved PMM's pipeline influence without inflating numbers or losing credibility.
The Influenced vs. Sourced Problem
I started by pulling every opportunity record where any PMM content had been used. Downloads of our whitepapers, views of our case studies, attendance at product webinars, usage of competitive battle cards—I created a massive spreadsheet of every pipeline touchpoint.
Then I added up the pipeline where PMM had "influenced" the opportunity.
The number was huge: $47M in influenced pipeline.
I showed this to our VP of Revenue Operations before presenting to the CEO. He looked at my methodology and said, "You're going to get destroyed."
"Why? The data is accurate. PMM content was used in all these deals."
"That's not influence. That's presence. Influence means you changed the outcome. Presence just means you were there."
He pulled up a few examples from my list:
"This $2M enterprise deal—you're claiming PMM influence because the prospect downloaded a whitepaper eight months ago when they were in early research. But the deal happened because our AE had a pre-existing relationship with their VP. The whitepaper didn't create this opportunity."
"This competitive deal—you're claiming influence because sales opened a battle card. But the prospect never mentioned the competitor again after the first call. The battle card didn't change anything."
"This expansion deal—you're claiming influence because the customer attended a product webinar. But they were already a customer. They were already going to expand. The webinar was just education, not influence."
He was right. I'd counted every instance where PMM assets existed in a deal, regardless of whether those assets actually mattered.
My $47M "influenced pipeline" number would fall apart under questioning.
"So what's the right way to measure influence?" I asked.
"Separate influenced from sourced, and only claim influence where you can show PMM changed the outcome."
Building the Framework: Three Types of Pipeline Contribution
I rebuilt my analysis around three distinct types of PMM pipeline contribution:
PMM-Sourced Pipeline: We Created This Opportunity
This is pipeline that wouldn't exist without PMM's work.
The clearest example: product launch campaigns.
When we launched a new product, PMM drove every launch activity: messaging, positioning, content, campaigns, and webinars. Every opportunity created for that new product within 90 days of launch was legitimately PMM-sourced.
I filtered Salesforce for:
- Opportunities created within 90 days of our Q4 product launch
- Primary product = the newly launched product
- Opportunity source = webinar, content download, campaign response, or website inquiry
Result: $8.2M in pipeline that directly traced back to PMM-led launch activities.
This was defensible. A new product launched, PMM ran the launch campaigns, people responded, opportunities were created. Clear causation.
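For anyone rebuilding this filter outside a Salesforce report, here's a minimal sketch in pandas against an opportunity export. The file name and columns (CreatedDate, PrimaryProduct, LeadSource, Amount) are hypothetical stand-ins for whatever your export actually contains:

```python
# Minimal sketch of the sourced-pipeline filter, run against a Salesforce
# opportunity export. File name, column names, and values are hypothetical.
import pandas as pd

LAUNCH_DATE = pd.Timestamp("2023-10-01")  # hypothetical Q4 launch date
PMM_SOURCES = {"Webinar", "Content Download", "Campaign Response", "Website Inquiry"}

opps = pd.read_csv("opportunities.csv", parse_dates=["CreatedDate"])

sourced = opps[
    opps["CreatedDate"].between(LAUNCH_DATE, LAUNCH_DATE + pd.Timedelta(days=90))
    & (opps["PrimaryProduct"] == "New Product X")  # the newly launched product
    & (opps["LeadSource"].isin(PMM_SOURCES))
]

print(f"PMM-sourced pipeline: ${sourced['Amount'].sum():,.0f}")
```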
I applied the same logic to other PMM-led initiatives:
- Competitive comparison content campaign: $1.8M in sourced pipeline
- Industry-specific positioning campaign: $2.4M in sourced pipeline
Total PMM-sourced pipeline: $12.4M.
Much smaller than my original $47M, but completely defensible.
PMM-Influenced Pipeline: We Accelerated or Expanded This Deal
This is pipeline where PMM didn't create the opportunity but measurably changed the outcome.
The key word: measurably. I needed proof that PMM's involvement mattered.
Example 1: Competitive Displacement
I worked with sales ops to identify competitive deals where:
- The competitor was explicitly named in the opportunity record
- Sales marked "used competitive content" in the deal
- Sales won the deal
Then I compared win rates:
- Competitive deals where sales used PMM battle cards: 58% win rate
- Competitive deals where sales didn't use battle cards: 31% win rate
The 27-percentage-point difference was PMM's measurable influence.
I calculated: "In competitive deals worth $18.6M where sales used battle cards, our win rate was 27 points higher than competitive deals without battle cards. PMM measurably influenced these outcomes."
I didn't claim we created these opportunities. Sales sourced them. But we influenced the win rate.
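The win-rate split itself is a small computation. Here's a sketch, assuming a closed-deal export with hypothetical UsedBattleCard, Stage, and Amount columns:

```python
# Sketch of the win-rate comparison by battle-card usage.
# The file and column names are hypothetical.
import pandas as pd

deals = pd.read_csv("competitive_deals.csv")  # one row per closed competitive deal

for used, group in deals.groupby("UsedBattleCard"):
    win_rate = (group["Stage"] == "Closed Won").mean()
    label = "with battle cards" if used else "without battle cards"
    print(f"{label}: {win_rate:.0%} win rate across {len(group)} deals, "
          f"${group['Amount'].sum():,.0f} in pipeline")
```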
Example 2: Deal Expansion
I analyzed opportunities where:
- Initial opportunity size was recorded at creation
- Final opportunity size at close was larger
- Sales had marked "used PMM content" during the sales cycle
I found 23 deals where opportunity size had expanded significantly after PMM content was introduced. Average expansion: 34%.
I couldn't prove PMM caused all of that expansion. But I could show correlation: "23 deals worth $9.4M expanded by an average of 34% after PMM content was introduced to the sales process."
Again, not claiming we sourced these deals. Claiming we influenced their size.
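The expansion analysis is similar. A sketch, assuming hypothetical InitialAmount, FinalAmount, and UsedPMMContent columns on a closed-opportunity export:

```python
# Sketch of the expansion analysis. InitialAmount, FinalAmount, and
# UsedPMMContent are hypothetical export columns.
import pandas as pd

opps = pd.read_csv("closed_opportunities.csv")

expanded = opps[opps["UsedPMMContent"] & (opps["FinalAmount"] > opps["InitialAmount"])]
expansion_pct = expanded["FinalAmount"] / expanded["InitialAmount"] - 1

print(f"{len(expanded)} expanded deals worth ${expanded['FinalAmount'].sum():,.0f}, "
      f"average expansion {expansion_pct.mean():.0%}")
```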
Example 3: Pipeline Acceleration
I compared sales cycle length across two cohorts:
- Deals where sales used PMM demo scripts and battle cards from the first demo forward
- Deals where sales introduced PMM content late in the cycle, or not at all
Deals where PMM content was used early: 67-day average sales cycle. Deals where PMM content was used late or not at all: 89-day average sales cycle.
That's a 22-day difference: early PMM content usage correlated with roughly 25% faster deal velocity (22 days off an 89-day baseline).
I presented this as: "Deals worth $14.2M where sales used PMM content from first demo forward closed 25% faster than deals where PMM content was introduced late."
Not claiming we created the pipeline. Claiming we accelerated it.
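And the velocity comparison, sketched the same way, with hypothetical CreatedDate, CloseDate, and EarlyPMMContent columns on a closed-won export:

```python
# Sketch of the velocity comparison. The file name and columns
# (CreatedDate, CloseDate, EarlyPMMContent) are hypothetical.
import pandas as pd

deals = pd.read_csv("closed_won_deals.csv", parse_dates=["CreatedDate", "CloseDate"])
deals["CycleDays"] = (deals["CloseDate"] - deals["CreatedDate"]).dt.days

avg_cycle = deals.groupby("EarlyPMMContent")["CycleDays"].mean()
early, late = avg_cycle.loc[True], avg_cycle.loc[False]
print(f"Early PMM content: {early:.0f}-day cycle; late or none: {late:.0f}-day cycle "
      f"({(late - early) / late:.0%} faster)")
```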
Total PMM-influenced pipeline: $42.2M (competitive, expansion, and acceleration combined).
PMM-Assisted Pipeline: We Supported, But Can't Prove Impact
This is pipeline where PMM assets were used, but I couldn't show measurable difference in outcomes.
I stopped trying to calculate precise influence. Instead, I just reported facts:
"PMM content was used in 184 additional deals worth $31.8M. We assisted these deals but cannot demonstrate measurable impact on creation, size, or close rate."
This was honest. Sales used our stuff. But I couldn't prove it changed anything.
The CEO would appreciate honesty more than inflated claims.
The Presentation That Convinced the CEO
I built a simple presentation showing PMM's pipeline contribution in three clear buckets:
Slide 1: PMM-Sourced Pipeline: $12.4M
- Product launches: $8.2M
- Competitive content campaigns: $1.8M
- Industry positioning campaigns: $2.4M
Methodology: Opportunities created within 90 days of PMM campaigns, directly traceable to PMM activities.
Slide 2: PMM-Influenced Pipeline: $42.2M
- Competitive displacement: $18.6M (27-point win rate improvement in deals with battle card usage)
- Deal expansion: $9.4M (34% average expansion correlated with PMM content introduction)
- Pipeline acceleration: $14.2M (25% faster close in deals with early PMM content usage)
Methodology: Measurable differences in win rate, deal size, or velocity correlated with PMM content usage.
Slide 3: PMM-Assisted Pipeline: $31.8M
- 184 deals where sales used PMM content
- No measurable outcome difference
- Assisted, not influenced
Methodology: PMM assets present in deal, no proven impact on outcomes.
Total Pipeline Contribution Across All Three Buckets: $86.4M
I presented this to the CEO and CFO. The CFO immediately started probing methodology.
"The $12.4M sourced number—walk me through how you're confident these opportunities wouldn't have happened without PMM."
I explained the product launch logic: new product, launch campaign, opportunities created for that specific product within 90 days. Clean causation.
He nodded. "That's defensible."
"The influenced number—you're showing correlation, not causation. How do you know battle card usage caused the higher win rate and wasn't just used in easier deals?"
"Fair question. I looked at that. We used battle cards in deals against our toughest competitors—Competitor A and Competitor B—not just easy wins. The win rate lift held even when controlling for competitor difficulty."
He nodded again. "Okay."
"The assisted number—why include it if you can't show impact?"
"Because I wanted to be honest. Sales used our content in $31.8M worth of deals. That's real usage. But I can't prove it mattered, so I'm not claiming influence. I'm just reporting the facts."
The CEO jumped in: "I appreciate that. Most people would've claimed they influenced all of it."
The CFO asked a few more questions about data sources and sample sizes. Every answer held up because I'd worked with RevOps to validate the methodology.
At the end of the meeting, the CEO said, "This is the clearest pipeline attribution analysis I've seen from any marketing function. Approved—hire your second PMM."
Budget meeting three days later: I got my headcount.
What Made This Analysis Credible
Looking back, several decisions made this pipeline analysis survive executive scrutiny:
I separated sourced from influenced from assisted.
I didn't lump everything into one "PMM-influenced pipeline" bucket. I was honest about different levels of contribution.
I showed methodology for every number.
Executives don't trust conclusions. They trust data and logic. I showed exactly how I calculated each number and what assumptions I made.
I admitted limitations.
When I couldn't prove causation, I said so. That honesty made the numbers I could prove more believable.
I worked with RevOps to validate the data.
I didn't present this analysis until RevOps had reviewed the Salesforce queries and confirmed the data was clean. Their validation mattered.
I focused on outcomes, not activities.
I didn't show how many battle cards I created or how many webinars I ran. I showed pipeline created, win rates improved, deals expanded, sales cycles shortened. Outcomes, not outputs.
The Uncomfortable Questions This Raised
Proving PMM's pipeline influence created an unexpected problem: now I was accountable for it.
The quarter after my CEO presentation, our product launch underperformed. We only generated $4.1M in pipeline instead of the $8M+ I'd shown as typical.
The CEO asked, "Why is launch pipeline half of what you showed me as normal?"
Now I had to explain what went wrong. The methodology that proved PMM's value when things went well also exposed PMM's shortfalls when things went poorly.
This is the double-edged sword of attribution. Once you prove you contribute to pipeline, you own that contribution. You can't claim credit for wins and dodge responsibility for losses.
But I'd rather have that accountability than have executives question whether PMM matters at all.
The Real Value: Getting a Seat at Revenue Planning
The pipeline analysis did something more valuable than justify headcount.
It got PMM into revenue planning conversations.
After the CEO presentation, I started getting invited to:
- Monthly pipeline review meetings (previously sales and marketing only)
- Quarterly revenue forecasting (previously finance, sales, and marketing only)
- Territory planning discussions (previously just sales ops)
Why? Because I'd proven PMM understood pipeline data and could speak the language of revenue contribution.
In those meetings, I could influence decisions before they were made:
- "If we're expanding into healthcare, we should allocate budget for industry-specific positioning, not generic content. Our data shows industry campaigns generate 3x the pipeline of generic campaigns."
- "If we're forecasting based on Q4 close rates, factor in that we're launching new competitive content in November. Historical data shows competitive win rates improve 15-25% within 60 days of new battle card deployment."
- "If we're behind on enterprise pipeline, launching an enterprise-tier product feature could generate $6-8M in qualified pipeline based on past launch performance."
PMM went from "creates marketing content" to "contributes to revenue strategy."
All because I'd proven PMM's connection to pipeline with data executives trusted.
What I'd Do Differently
If I were building this analysis today, I'd make one change: I'd track it continuously instead of building it once.
I created this pipeline attribution analysis in three weeks for a CEO presentation. It was a sprint.
What I should have done: built the reporting infrastructure to track these metrics every month, so pipeline contribution was always visible, not something I had to prove during budget season.
Now I maintain a monthly dashboard showing:
- PMM-sourced pipeline (updated after every launch or campaign)
- PMM-influenced pipeline (win rate analysis, deal expansion, velocity improvements)
- PMM-assisted pipeline (usage metrics without outcome claims)
It takes 30 minutes a month to update. And I never have to scramble to prove PMM's value again.
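If it helps to see the shape of that monthly update, here's a sketch that recomputes the three buckets and appends a row to a running dashboard file. Every name is a hypothetical stand-in for the queries shown earlier:

```python
# Sketch of a monthly refresh: recompute the three buckets and append a row
# to a running dashboard CSV. File names and the Bucket/Amount columns are
# hypothetical stand-ins for the tagging rules described above.
import pandas as pd
from datetime import date

def monthly_snapshot(opps: pd.DataFrame) -> dict:
    # Sum pipeline by bucket; deals are pre-tagged sourced/influenced/assisted.
    totals = opps.groupby("Bucket")["Amount"].sum()
    return {
        "month": date.today().strftime("%Y-%m"),
        "sourced": totals.get("sourced", 0),
        "influenced": totals.get("influenced", 0),
        "assisted": totals.get("assisted", 0),
    }

opps = pd.read_csv("tagged_opportunities.csv")
row = pd.DataFrame([monthly_snapshot(opps)])
row.to_csv("pmm_pipeline_dashboard.csv", mode="a", header=False, index=False)
```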
If you're building pipeline attribution for the first time, don't treat it as a one-time project. Build the repeatable reporting structure so you can track PMM's contribution continuously.
Your CEO will ask again. Be ready.