I wanted a speaking slot at SaaStr Annual. It's the biggest conference in B2B SaaS. 13,000 attendees. All our target customers would be there.
Getting a speaking slot would be massive for our brand. It would position us as thought leaders. It would generate pipeline.
I submitted a proposal: "5 Product Marketing Best Practices for B2B SaaS Companies"
Rejected.
I tried again: "How We Built Our Product Marketing Function From Scratch"
Rejected.
Third attempt: "The Future of Product Marketing: Trends for 2024"
Rejected.
I submitted five proposals over two years. Every single one was rejected.
Meanwhile, our competitors were getting accepted. I'd watch their sessions and think: "Our content is just as good. Why aren't we getting picked?"
Then I attended a panel at a different conference where the moderator was a conference organizer. Someone in the audience asked: "What makes a good conference proposal?"
Her answer changed everything: "We reject 95% of proposals because they're either product pitches disguised as education, or they're generic best practices we've heard a hundred times. We're looking for opinionated, tactical sessions that teach attendees something they can actually use."
That's when I realized what I was doing wrong.
My proposals were safe. They were broad. They promised "best practices" without being specific about what attendees would learn or why our perspective was different from the 50 other similar sessions.
Conference organizers don't want best practices. They want controversial opinions, tactical frameworks, and lessons learned from failure.
I completely rewrote my approach to conference proposals. The next one got accepted.
Here's what changed.
The Proposal Mistake That Got Me Rejected Five Times
My rejected proposals all had the same problem: they were about me and my company, not about solving attendees' problems.
Rejected proposal example:
Title: "Building a Product Marketing Function at a High-Growth SaaS Company"
Description: "Learn how we built our product marketing team from 1 to 6 people and established PMM as a strategic function. I'll share our journey, our wins, our mistakes, and the frameworks we use today."
Why it was rejected:
Problem 1: It's a case study about my company, not a transferable framework.
Conference organizers know that attendees don't care about my company's journey. They care about frameworks they can apply to their own companies.
Problem 2: It's not specific about what attendees will learn.
"Frameworks we use" is vague. What frameworks? What will attendees be able to do differently after this session?
Problem 3: It's not differentiated.
Fifty other PMMs submitted proposals about "how we built our PMM function." Why should the conference pick mine over theirs?
Problem 4: No clear takeaway.
If someone attends this session, what's the one thing they'll walk away with? I didn't answer that question.
Accepted proposal example:
Title: "Why Your Competitive Intel Program is Failing (And The 3 Frameworks That Fix It)"
Description: "Most competitive intelligence programs fail because PMMs treat them like research projects instead of revenue drivers. Sales ignores your battle cards. Your competitive newsletter has 12% open rates. Leadership questions the ROI.
In this session, I'll show you the three frameworks that transformed our competitive program from 'nice to have' to 'directly influenced 35% of deals':
- The Real-Time Intel Feed (how to deliver competitive insights when reps need them, not in quarterly reports)
- The Objection Response Script (moving from 'here's what to say' to 'here's how to win')
- The Competitive Scorecard (proving ROI to leadership with metrics they care about)
You'll leave with templates you can implement immediately and a plan to rebuild your competitive program around what actually drives revenue."
Why it was accepted:
Strength 1: It leads with a problem, not a story about my company.
"Your competitive intel program is failing" immediately resonates with the 60% of attendees who are struggling with this exact problem.
Strength 2: It's specific about what attendees will learn.
Three frameworks. Clear names. Clear outcomes. Attendees know exactly what they're getting.
Strength 3: It promises tactical takeaways.
Templates they can use immediately. Not abstract best practices, but actual tools.
Strength 4: It's opinionated.
"Most competitive intel programs fail because..." is a strong POV. It creates tension and curiosity.
Strength 5: It solves a real, urgent problem.
Conference organizers know that "how to get sales to use competitive intel" is a pain point for PMMs. This session directly addresses that pain.
The shift from "here's what we did" to "here's how to solve this problem you're facing" was the difference between rejection and acceptance.
The Thought Leadership Positioning That Gets You Noticed
Getting a speaking slot isn't just about writing a good proposal. It's about being known for something before you submit.
Conference organizers get thousands of proposals. When they review submissions, they're asking: "Do I know this person? Have I seen their name before? Are they an expert on this topic?"
If the answer is "no," your proposal is competing with hundreds of others from people they've never heard of.
If the answer is "yes," you're starting with credibility.
How I built thought leadership before submitting proposals:
I wrote about the topic publicly for 6 months before submitting.
I wanted to speak about competitive intelligence, so I:
- Wrote 6 LinkedIn posts about competitive intel challenges and frameworks
- Published 2 blog posts on our company blog
- Commented on competitive intel discussions in PMM Slack communities
- Shared templates and frameworks openly (battle card template, competitive scorecard)
By the time I submitted my proposal, conference organizers could Google me and find:
- Consistent content on competitive intelligence
- Engagement from other PMMs (comments, shares, discussion)
- Evidence that I had a POV and frameworks to share
I connected with conference organizers months before the CFP opened.
I didn't wait until proposal season to reach out. I:
- Followed conference organizers on LinkedIn
- Commented on their posts about conference planning
- Shared what topics I'd like to see covered (including topics I could speak on)
- Attended their previous conferences and mentioned sessions I loved
When they reviewed my proposal, my name wasn't completely unknown. They'd seen me in their network.
I spoke at smaller conferences first.
Before applying to SaaStr Annual (13,000 attendees), I spoke at:
- Regional SaaS meetups (50-100 attendees)
- Virtual PMM conferences (300-500 attendees)
- Podcast appearances
By the time I applied to the big conference, I could include in my bio:
- "Speaker at [Conference A] and [Conference B]"
- "Host of [Podcast] with 5,000+ downloads"
The insight: Conference organizers are more likely to accept speakers who've proven they can deliver good sessions elsewhere.
I had advocates who could vouch for me.
I asked three people with connections to the conference organizers to recommend me:
- A previous speaker who'd had a well-attended session
- A sponsor of the conference who knew the organizers
- Someone on the conference advisory board
They didn't formally endorse me, but they mentioned to organizers: "You should check out [my name]'s proposal on competitive intelligence."
Advocacy inside the organizers' network matters.
The uncomfortable truth: Conference speaking is 50% proposal quality, 50% who you know and whether you're known for something.
Build your thought leadership before submitting. Make it easy for conference organizers to say "yes" because they've seen your name, read your content, and trust you can deliver value.
The Session Design That Actually Delivers Value
Getting the speaking slot was step one. Delivering a session that attendees loved was step two.
My first speaking engagement was rough. I prepared a 40-slide deck covering everything I knew about competitive intelligence.
I talked at the audience for 40 minutes. I rushed through frameworks because I ran out of time. I got through the Q&A and thought: "That went okay."
The feedback surveys disagreed. Average rating: 3.2 out of 5.
Comments:
- "Too much content, not enough depth"
- "Wished there was more time for Q&A"
- "Would have liked actual examples and templates"
I'd made the classic mistake: I tried to cover everything instead of going deep on the most valuable thing.
What I changed for my next session:
Ruthlessly narrow the scope.
Instead of "everything you need to know about competitive intel," I focused on one specific problem:
"Your sales team ignores your battle cards. Here's exactly how to fix it."
I cut 70% of my planned content. I went deep on one framework with multiple real examples.
Average rating went from 3.2 to 4.6.
Make it immediately actionable.
Instead of sharing frameworks conceptually, I gave attendees:
- Templates they could download during the session
- A 7-day implementation plan
- Scripts they could copy-paste
People want to leave with something they can use Monday morning, not vague advice.
Show, don't tell.
Instead of describing how our competitive scorecard works, I showed:
- Screenshots of the actual scorecard
- Real data from our program
- Before/after examples of battle cards
Specific examples are more valuable than abstract explanations.
Build in Q&A time.
I used to save Q&A for the last 5 minutes and always ran out of time.
Now I build Q&A into the session:
- 25 minutes presentation
- 10 minutes Q&A
- 10 minutes deeper dive on the most-asked question
This structure gives space for the conversation attendees actually want.
Share the failures, not just the wins.
Early on, I only shared success stories. "We did X, and it worked great."
Now I share failures:
- "Here's what we tried first that completely failed"
- "This approach seemed smart but didn't work because..."
- "If I could redo this, I'd skip X entirely"
Attendees trust failure stories more than success stories. They learn more from them too.
The result: Session ratings went from 3.2 to 4.6. Attendees left with templates, plans, and specific actions. Post-session LinkedIn messages went from 2-3 to 30+.
The Pipeline Impact That Made Speaking Worth It
The CMO's question when I first proposed speaking at conferences: "Is this worth your time? What's the ROI?"
Fair question. Speaking at a conference costs 20-40 hours of prep, plus travel time and the opportunity cost of everything else that isn't getting done.
After my first accepted speaking slot, here's what we tracked:
Direct leads from the session:
- 340 people attended the session
- 180 scanned a QR code to download our template
- 78 filled out a "contact me" form for follow-up
- 24 became qualified opportunities
- 6 closed deals ($340K in revenue)
ROI on direct leads: Positive, but not amazing. The travel and prep cost was ~$8K (time + expenses). $340K in revenue was good, but not a game-changer.
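If you want to run the same math on your own session, it's simple enough to script. Here's a minimal sketch using the numbers above to compute stage-to-stage conversion and the direct-revenue multiple; the structure and labels are illustrative, not pulled from any particular analytics tool.

```python
# Illustrative funnel math using the figures reported above.
funnel = [
    ("Attended the session", 340),
    ("Downloaded the template (QR scan)", 180),
    ("Filled out the contact form", 78),
    ("Became qualified opportunities", 24),
    ("Closed deals", 6),
]

revenue = 340_000  # closed revenue attributed to the session, in dollars
cost = 8_000       # rough prep, travel, and expense cost, in dollars

print("Stage-to-stage conversion:")
for (prev_stage, prev_count), (stage, count) in zip(funnel, funnel[1:]):
    print(f"  {prev_stage} -> {stage}: {count / prev_count:.0%}")

print(f"\nDirect ROI multiple: {revenue / cost:.1f}x on ~${cost:,} invested")
```

Even a rough script like this makes the conversation with your CMO concrete: you can see where the funnel leaks (attendee to download, download to contact) and what the direct return was before counting any brand impact.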
Indirect brand impact:
- 40+ LinkedIn connection requests from attendees
- 15 podcast/webinar invitations
- Conference organizers invited me to speak at their other events
- Prospects replied to sales outreach with "I saw you speak at SaaStr" (instant social proof)
Long-term thought leadership:
- Session recording has 4,200 views on YouTube
- Template has been downloaded 2,100+ times
- Positioned our company as experts in competitive intelligence
Total influenced pipeline: $1.2M within 6 months of the session.
The insight: Speaking's value isn't just the immediate leads. It's the long-term brand positioning and credibility.
When prospects Google our company, they find:
- Our team speaking at major conferences
- High-value content and templates
- Recognition as thought leaders
This builds trust faster than any sales pitch could.
Conference speaking is also a recruiting tool:
After speaking at conferences, I got:
- 12 job applications mentioning the session
- 3 amazing hires who said they applied because they saw us speak
- Passive candidates engaging with our content
Conference speaking isn't just demand gen. It's brand-building, recruiting, and long-term credibility.
What Actually Works for Landing Conference Speaking Slots
After going from 5 rejections to consistent acceptances, here's what works:
Lead with a specific problem, not your company's story. "Why [problem] is failing and how to fix it" beats "How we built [thing]" every time.
Promise tactical takeaways. Templates, frameworks, scripts. Not best practices or trends.
Be opinionated. "Most companies are doing this wrong, and here's why" creates tension and interest.
Build thought leadership before submitting. Write publicly, speak at smaller events, connect with conference organizers.
Start with smaller conferences. Build your speaking resume before applying to the big stages.
Design sessions for attendees, not for you. Go deep on one thing. Make it actionable. Share failures.
Measure both short-term leads and long-term brand impact. Speaking's ROI isn't just immediate pipeline—it's 6-12 months of credibility building.
I went from five rejected proposals to speaking at three major conferences this year. Same content expertise. Different proposal strategy.
Conference organizers don't want product pitches or generic best practices.
They want controversial opinions, tactical frameworks, and sessions that solve real problems.
Give them that, and you'll get the speaking slot.
Then deliver a session attendees actually want to attend, and you'll get invited back.