Building a PMM knowledge base (lessons from failed attempts)

Kris Carter · 9 min read

Our first three attempts at a PMM knowledge base failed completely. Nobody used them. The fourth attempt finally worked. Here's what we learned.

New PMM: "Where can I find our competitive analysis on Competitor X?"

Me: "Check the Google Drive folder... or maybe it's in Notion... actually, Sarah might have the latest version in her personal docs."

New PMM: "Which Google Drive folder?"

Me: "Uh... I think it's in 'Competitive Intel' or maybe 'Market Research'... just search for it."

Ten minutes later, they'd found three different versions of the competitive analysis, none of them current. Nobody knew which was the source of truth.

This happened constantly. Critical PMM knowledge lived in:

  • Someone's head (undocumented)
  • Random Google Docs (scattered)
  • Old Slack threads (unsearchable)
  • Notion pages (inconsistent structure)

Finding information took longer than creating it. New hires spent weeks asking "where is...?" questions. We redid work because we couldn't find existing research.

I tried building a knowledge base three times. All three failed.

Attempt 1: Comprehensive wiki in Confluence. Too complex, nobody maintained it.

Attempt 2: Organized folder structure in Google Drive. Became messy within weeks.

Attempt 3: Everything in Notion. People didn't know what to put where.

The fourth attempt finally worked. Not because the tool was better—because we changed our approach.

Here's what actually works for PMM knowledge bases, and why most attempts fail.

Why Attempt 1 Failed: The Comprehensive Wiki

The idea: Build a complete reference encyclopedia of all PMM knowledge in Confluence.

The structure:

PMM Wiki
├── Competitive Intelligence
│   ├── Competitor Profiles
│   ├── Market Analysis
│   └── Battle Cards
├── Customer Research
│   ├── Persona Development
│   ├── Interview Findings
│   └── Survey Results
├── Product Positioning
│   ├── Messaging Framework
│   ├── Value Propositions
│   └── Use Cases
├── Launch Management
│   ├── Launch Process
│   ├── Past Launches
│   └── Templates
└── [15 more top-level categories]

Why I thought it would work: Comprehensive. Organized. Everything in one place.

Why it failed:

Problem 1: Too much structure

Nearly 20 top-level categories. More than 100 subcategories. People didn't know where things belonged.

Should customer interview findings go under "Customer Research → Interview Findings" or "Product Positioning → Research"? Both made sense.

Analysis paralysis. People would give up and just save stuff in Google Drive instead.

Problem 2: High friction to contribute

Adding content to the wiki required:

  1. Navigate to right category
  2. Create new page
  3. Apply proper template
  4. Tag and categorize
  5. Link to related pages

Too many steps. When PMMs finished a project, they'd skip documentation because it felt like extra work.

Problem 3: Nobody maintained it

Week 1: Wiki was pristine and organized.

Month 3: Half the content was outdated. Links were broken. Nobody owned maintenance.

Month 6: People stopped using it because they didn't trust the content was current.

The lesson: Comprehensiveness doesn't matter if nobody contributes or maintains it.

Why Attempt 2 Failed: The Organized Folder Structure

The idea: Simplify. Just use Google Drive with clear folder structure.

The structure:

PMM Team Drive
├── 01_Competitive_Intelligence
├── 02_Customer_Research
├── 03_Product_Positioning
├── 04_Launches
├── 05_Sales_Enablement
├── 06_Templates
├── 07_Team_Docs
└── 08_Archive

Why I thought it would work: Everyone knows how to use Google Drive. Low friction. No training needed.

Why it failed:

Problem 1: Search doesn't work well

Google Drive search finds every document with the keyword, not the right document.

Search "competitor analysis" → 47 results.

Which one is current? Which one is the official version? No idea.

Problem 2: Folders became dumping grounds

The "Competitive Intelligence" folder had 200+ files within three months.

Files named:

  • "Competitor X Analysis v2"
  • "Competitor X Analysis FINAL"
  • "Competitor X Analysis FINAL v3"
  • "Competitor X Analysis Sept 2024"

Nobody knew which was current.

Problem 3: No version control or status indication

Is this research from last year still accurate? Who knows.

Is this battle card official or draft? No indication.

Has this been reviewed or is it someone's rough notes? Can't tell.

The lesson: Low friction to contribute creates chaos without governance.

Why Attempt 3 Failed: Everything in Notion (Wrong Approach)

The idea: Migrate everything to Notion. Better organization than folders. Better search than Google Drive.

What we built:

Central "PMM Hub" in Notion with databases for:

  • Competitive intel
  • Customer research
  • Product positioning
  • Launch tracking
  • Templates

Why I thought it would work: Notion is flexible, searchable, and team members already used it.

Why it failed:

Problem 1: No clear mental model

People didn't understand when to "Add to database," when to "Create new page," and when to "Add to existing page."

They'd create content in the wrong place, or duplicate pages, or abandon Notion and revert to Google Docs.

Problem 2: Information sprawl

Some content in databases. Some in standalone pages. Some embedded in other pages. Some linked from external docs.

Instead of one source of truth, we had organized chaos.

Problem 3: Lack of ownership

Nobody owned "making sure Notion stays organized." It became a dumping ground like Google Drive, just in a prettier interface.

The lesson: The tool doesn't matter if you don't have clear processes and ownership.

What Actually Worked: The Living Knowledge Base (Attempt 4)

The shift in thinking:

Stop trying to capture everything. Start with what people actually need.

The new principle: a knowledge base should answer the questions people ask most, not document everything we know.

Step 1: Identify the top 20 questions

I surveyed the team: "What information do you search for most often?"

Top recurring questions:

  1. Where is our latest battle card for [Competitor]?
  2. What's our positioning for [Product]?
  3. How do we describe [Feature] to customers?
  4. What customer research exists on [Topic]?
  5. What were the results of [Past Launch]?
  6. Where is the template for [Deliverable]?
  7. What's our process for [Workflow]?
  8. Who owns [Area]?
  9. When was [Content] last updated?
  10. What competitive intelligence do we have on [Competitor]?

Step 2: Build structure around those questions

Instead of comprehensive categories, I built answer pages for each common question.

Example: "Where is our latest battle card for Competitor X?"

Answer page structure:

Title: Competitor X Battle Card
Last Updated: [Date]
Owner: [Name]
Status: Current / Under Review / Archived

Quick Access:

  • [Link to current battle card]
  • [Link to competitive intel database for Competitor X]
  • [Link to win/loss data vs. Competitor X]

Recent Updates:

  • [Log of changes with dates]

This answered the question in 5 seconds instead of requiring navigation through folders.

Step 3: Use databases only for structured data

Not everything needs a database. Use them strategically:

Databases:

  • Competitive intel (structured comparison across competitors)
  • Customer research repository (filterable by persona, topic, date)
  • Launch tracker (status of all launches)

Simple pages:

  • Messaging and positioning (versioned documents)
  • Process documentation (workflows and templates)
  • Team information (roles, responsibilities, calendars)
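
To make the database side concrete: "structured" just means real fields you can filter on, not a free-form doc. Here's a minimal sketch of creating the Competitive Intelligence database through Notion's API using the notion-client Python SDK; the auth token and parent page ID are placeholders, and the property names are illustrative rather than our exact setup.

# Minimal sketch: create a Competitive Intelligence database with structured
# fields via the Notion API. Assumes the notion-client SDK; the auth token,
# parent page ID, and property names below are placeholders/illustrative.
from notion_client import Client

notion = Client(auth="secret_xxx")  # integration token (placeholder)

notion.databases.create(
    parent={"type": "page_id", "page_id": "PARENT_PAGE_ID"},  # placeholder
    title=[{"type": "text", "text": {"content": "Competitive Intelligence"}}],
    properties={
        "Name": {"title": {}},  # document name, e.g. "Competitor X Battle Card"
        "Competitor": {"select": {"options": [{"name": "Acme Corporation"}]}},
        "Content Type": {"select": {"options": [
            {"name": "Battle Card"},
            {"name": "Market Analysis"},
        ]}},
        "Status": {"select": {"options": [
            {"name": "Current"},
            {"name": "Under Review"},
            {"name": "Archived"},
        ]}},
        "Owner": {"people": {}},
        "Last Updated": {"date": {}},
    },
)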

Step 4: Build in maintenance rituals

Monthly audit:

  • Review top 20 answer pages
  • Update "Last Reviewed" dates
  • Flag outdated content
  • Archive what's no longer relevant

Ownership:

  • Each major section has an owner
  • Owner responsible for keeping content current
  • Quarterly rotation to prevent knowledge silos

Update triggers:

  • Product launch → Update positioning pages
  • Competitive move → Update battle cards
  • Research complete → Add to research repository

Step 5: Make it the default, not optional

The rule: If it's not in the knowledge base, it doesn't exist officially.

New competitive analysis? Doesn't count until it's in the knowledge base.

Launch retrospective insights? Not official until documented.

Process change? Not implemented until knowledge base reflects it.

This created a forcing function for contribution.

The Structure That Actually Works

Our final Notion knowledge base:

PMM Knowledge Base
├── Quick Answers (Top 20 questions with direct links)
├── Competitive Intelligence Database
├── Customer Research Repository
├── Product Messaging Hub
├── Launch Tracker and Retrospectives
├── Process Documentation
├── Templates Library
└── Team Resources

Quick Answers = Most-used section. 80% of searches end here.

Databases = Structured information that benefits from filtering and views.

Hubs = Living documents that get updated frequently (messaging, positioning).

Archives = Old content that's searchable but not actively maintained.

The Contribution Workflow That Prevented Chaos

Before: People created content wherever they wanted.

After: Clear workflow for adding knowledge.

Adding competitive intelligence:

  1. Complete competitive research
  2. Add entry to Competitive Intelligence Database (structured fields)
  3. Link from Quick Answers page for that competitor
  4. Update "Last Updated" field
  5. Post in #competitive-intel Slack channel with link

Adding customer research:

  1. Complete interviews/research
  2. Create research summary document
  3. Add to Customer Research Repository with tags
  4. Link key insights to relevant positioning pages
  5. Post in #pmm-team with summary and link

Adding launch learnings:

  1. Complete launch retrospective
  2. Document key learnings in Launch Tracker entry
  3. Update relevant process docs if workflow changes
  4. Archive launch materials in appropriate folder
  5. Share insights in next team meeting

The key: Every contribution follows a pattern. Reduces cognitive load. Creates consistency.
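
The Slack step is the one people skip when they're busy, so it's worth automating. A minimal sketch using a standard Slack incoming webhook; the webhook URL, message format, and page link are placeholders, not our exact setup.

# Minimal sketch: post a consistent announcement whenever a knowledge base
# entry is added or updated. Assumes the requests library and a standard
# Slack incoming webhook; the URL below is a placeholder.
import requests

SLACK_WEBHOOK_URL = "https://hooks.slack.com/services/XXX/YYY/ZZZ"  # placeholder

def announce_update(title: str, notion_url: str) -> None:
    """Send the same short message format for every contribution."""
    requests.post(
        SLACK_WEBHOOK_URL,
        json={"text": f"Knowledge base updated: {title} {notion_url}"},
        timeout=10,
    )

announce_update("Competitor X Battle Card", "https://www.notion.so/PAGE_ID")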

The Search Strategy That Made Knowledge Findable

Search in Notion is powerful but requires setup:

Tagging strategy:

  • Product names (consistent spelling)
  • Competitor names (full company names, not shortcuts)
  • Content type (research, battle card, positioning, process)
  • Status (current, draft, archived)
  • Owner (who's responsible)

Example:

Battle card tagged:

  • Competitor: Acme Corporation
  • Content Type: Battle Card
  • Status: Current
  • Owner: Sarah
  • Last Updated: 2024-12-01

Now searchable by any of those dimensions.
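
Consistent tags are also what let you script the lookup instead of scrolling through 47 results. A minimal sketch, assuming the notion-client Python SDK and a database whose property names match the tags above; the database ID and token are placeholders.

# Minimal sketch: find the single current battle card for a competitor by
# filtering on the tag fields. Assumes notion-client; IDs are placeholders.
from notion_client import Client

notion = Client(auth="secret_xxx")  # integration token (placeholder)

results = notion.databases.query(
    database_id="COMPETITIVE_INTEL_DB_ID",  # placeholder
    filter={
        "and": [
            {"property": "Competitor", "select": {"equals": "Acme Corporation"}},
            {"property": "Content Type", "select": {"equals": "Battle Card"}},
            {"property": "Status", "select": {"equals": "Current"}},
        ]
    },
)["results"]

for page in results:
    print(page["url"])  # direct link to the current battle card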

The Quick Answers index:

Every answer page includes:

  • Primary question it answers
  • Related questions
  • Links to detailed information
  • Last updated date

This creates multiple entry points. If you don't find it searching one way, you'll find it another.

The Governance That Kept It Clean

Quarterly knowledge base audit:

Week 1: Review all "Current" content

  • Is it actually current or outdated?
  • Move outdated content to "Archived" status
  • Delete content that's no longer relevant

Week 2: Review ownership assignments

  • Does every major section have an owner?
  • Are owners actually maintaining their areas?
  • Reassign orphaned sections

Week 3: Review usage analytics

  • Which pages get viewed most?
  • Which never get viewed?
  • Focus on high-value content; archive low-value pages

Week 4: Implement improvements

  • Restructure confusing sections
  • Add missing answer pages for new common questions
  • Update templates based on team feedback

Continuous maintenance:

  • Every Monday: Check "Needs Update" flagged pages
  • Every month: Review top 10 most-accessed pages
  • After each launch: Update launch tracker and learnings
  • After competitive moves: Update battle cards

For Teams Managing Growing PMM Knowledge

As PMM teams scale, knowledge management becomes critical to operational efficiency. Some teams find adoption improves when they consolidate the knowledge base with their operational workflows, so competitive intelligence, launch documentation, and messaging updates live in the same system the team already uses for daily work rather than in a separate wiki. Platforms like Segment8 take this integrated approach, reducing the friction between creating knowledge and storing it and making documentation a natural byproduct of PMM workflows rather than extra work.

What Good Knowledge Management Actually Delivered

Before knowledge base:

Time to find information: 15-30 minutes (often unsuccessful)

Redundant work: Redoing research because we couldn't find existing work (20% of projects)

New hire ramp time: 8-10 weeks (lots of time asking where things are)

Version confusion: Multiple versions, unclear which is official (constant problem)

After knowledge base:

Time to find information: 2-5 minutes (successful 90%+ of time)

Redundant work: Rare (<5% of projects)

New hire ramp time: 5-6 weeks (self-service onboarding via knowledge base)

Version confusion: Single source of truth, clear "Last Updated" dates

Impact:

Time savings: ~20 hours/month across the team (finding information faster)

Quality improvement: Less redundant work, building on existing knowledge

Onboarding efficiency: 30-40% faster ramp for new hires

The Uncomfortable Truth About Knowledge Bases

Most PMM knowledge bases fail not because of the tool but because teams don't commit to maintenance.

The failure pattern:

Month 1: Build comprehensive knowledge base. Everyone excited.

Month 3: Content getting added but organization degrading.

Month 6: Half the content outdated. Nobody maintains it.

Month 12: Everyone reverts to asking people instead of searching.

The successful pattern:

Month 1: Build minimal knowledge base answering top 20 questions.

Month 3: Refine based on usage. Add new answer pages for common questions.

Month 6: Quarterly audit keeps content fresh. Team trusts it.

Month 12: Knowledge base is the default source of truth. Self-reinforcing.

The teams that make knowledge bases work:

  • Start small (top questions, not comprehensive encyclopedia)
  • Clear contribution workflows (how to add content)
  • Assigned ownership (each section has a maintainer)
  • Regular maintenance (quarterly audits, monthly reviews)
  • Usage enforcement (if it's not in the KB, it's not official)

The teams where knowledge bases fail:

  • Build comprehensive wikis nobody maintains
  • No clear processes for adding content
  • No ownership or accountability
  • One-time setup, no ongoing maintenance
  • Knowledge base is optional, not required

Our fourth attempt succeeded because:

  • We focused on answering questions, not documenting everything
  • We built clear contribution workflows
  • We assigned owners to maintain sections
  • We audited quarterly to keep content fresh
  • We made it the single source of truth

Start small. Answer the top 20 questions your team asks. Build contribution workflows. Assign owners. Maintain regularly.

That's a knowledge base that works.

Not a comprehensive wiki that nobody uses.

Not a folder dumping ground nobody can search.

Not a tool that starts organized and becomes chaos.

A living knowledge base that's actually used, maintained, and valuable.

Build the answer pages. Create the workflows. Assign the owners. Do the maintenance.

Your team will stop wasting hours searching for information. Your new hires will ramp faster. Your knowledge will compound instead of disappearing.

Build a knowledge base that works.

Kris Carter

Founder, Segment8

Founder & CEO at Segment8. Former PMM leader at Procore (pre/post-IPO) and Featurespace. Spent 15+ years helping SaaS and fintech companies punch above their weight through sharp positioning and GTM strategy.
