Part 2 of 7 in Specification Factory
The Spec Gap: What Engineering Sees That Product Doesn't
A Simple Story
Let's say you're a PM at an insurance company. You write this user story:
As a policyholder, I want to submit a claim online, so that I can get reimbursed faster.
You add some acceptance criteria:
- User fills out claim form
- System validates the claim
- User receives confirmation
This goes into grooming. Engineering asks good questions. You clarify a few things. Story gets estimated. Sprint starts.
Two weeks later, you're in sprint review and discover the team made 47 decisions you didn't know needed to be made.
The 47 Hidden Decisions
Let me show you what I mean. Here are just some of the decisions engineering had to make:
Authentication & Authorization
- Can users submit claims while unauthenticated, or must they log in first?
- Can a user submit a claim for someone else's policy?
- What if the policy is expired but the incident happened while it was active?
- Can corporate admins submit claims on behalf of policyholders?
Form Validation
- Which fields are required vs optional?
- What formats are acceptable for dates? (MM/DD/YYYY? ISO 8601? Natural language?)
- What's the maximum claim amount we'll accept?
- Do we validate claim amounts against policy limits, or just accept anything?
- Should we prevent duplicate submissions if a user clicks twice?
- What happens if a user starts a claim but doesn't finish it?
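To see how quickly those questions turn into code, here's a minimal validator sketch in Python. Every constant in it (the required fields, the $50,000 cap, the ISO 8601 date format) is a hypothetical answer to one of the questions above, invented for illustration, not something the user story specified:

```python
from datetime import datetime

# Each constant encodes an answer to a question the user story never asked.
REQUIRED_FIELDS = {"policy_number", "incident_date", "amount"}  # required vs optional?
MAX_CLAIM_AMOUNT = 50_000          # maximum amount we'll accept?
DATE_FORMAT = "%Y-%m-%d"           # ISO 8601? MM/DD/YYYY? Natural language?

def validate_claim(form: dict) -> list[str]:
    """Return a list of validation errors (empty means the claim is acceptable)."""
    errors = [f"missing field: {f}" for f in sorted(REQUIRED_FIELDS - form.keys())]
    if "incident_date" in form:
        try:
            datetime.strptime(form["incident_date"], DATE_FORMAT)
        except ValueError:
            errors.append("incident_date must be YYYY-MM-DD")
    if "amount" in form and form["amount"] > MAX_CLAIM_AMOUNT:
        errors.append(f"amount exceeds maximum of {MAX_CLAIM_AMOUNT}")
    return errors
```

A developer writes something like this in an afternoon. The point is that every line is a product decision, whether or not anyone made it deliberately.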
File Uploads
- What file types can be attached? (Images? PDFs? Videos?)
- What's the maximum file size?
- How many attachments can be included?
- What if a user's photo is too large—do we resize it or reject it?
- Do we scan uploads for viruses?
Data Relationships
- Which policy should this claim reference if a user has multiple policies?
- What if the policy number entered doesn't exist in our system?
- Should we look up the policy details automatically or require manual entry?
- Do we need to lock the claim record if multiple adjusters try to review it simultaneously?
Business Rules
- Can claims be submitted retroactively? How far back?
- Can users edit a claim after submission?
- What statuses can a claim have? (Draft, Submitted, Under Review, Approved, Denied, Paid?)
- Who gets notified when a claim is submitted?
- Do we generate a claim number immediately or after manual review?
Error Handling
- What error message do we show if validation fails?
- What if our payment processing service is down?
- What if the database write fails partway through—do we retry or rollback?
- What if the user's session expires while they're filling out the form?
- Do we log failed submission attempts? For how long?
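Two of those questions, duplicate submissions and retry-after-failure, are often answered with the same pattern: an idempotency key. Here's a sketch using an in-memory store as a stand-in for the claims database; the class and field names are invented for illustration, and a real service would enforce this with a database constraint:

```python
import uuid

class ClaimStore:
    """In-memory stand-in for the claims database."""
    def __init__(self):
        self._by_key = {}

    def submit(self, idempotency_key: str, claim: dict) -> dict:
        # A double-click, or a client retrying after a timeout, reuses the
        # same key, so the second call returns the first result instead of
        # creating a duplicate claim.
        if idempotency_key in self._by_key:
            return self._by_key[idempotency_key]
        record = {"claim_id": str(uuid.uuid4()), "status": "Submitted", **claim}
        self._by_key[idempotency_key] = record
        return record
```

Whether to adopt this pattern at all is itself one of the 47 decisions, and it quietly changes the API contract the frontend team has to build against.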
Performance & SLOs
- What's an acceptable response time for claim submission?
- Should we process this synchronously or queue it?
- Do we need to support concurrent submissions from the same user?
- What if 1,000 users try to submit claims at the same time?
Telemetry & Analytics
- What events should we track? (Page views? Form starts? Field completions? Errors?)
- Do we track how long users spend on each form section?
- Should we send analytics to a third-party service?
- What PII can be included in telemetry?
Compliance
- Does HIPAA apply to this data?
- Do we need audit logs of who accessed or modified claims?
- What data retention policies apply?
- Can we store claim data in the cloud or must it be on-premises?
- Do we need consent checkboxes?
UI/UX Details
- Should form fields auto-save as the user types?
- Do we show a progress indicator?
- Should the confirmation page include a printable summary?
- Do we email a copy of the confirmation to the user?
- Can users submit claims via mobile app, or web only?
What Actually Happened
In a typical sprint, here's how those 47 decisions got made:
- 12 decisions were made in grooming after asking clarifying questions (but you were in another meeting for 3 of them)
- 19 decisions were made by the tech lead based on "what we did last time"
- 8 decisions were made by the developer implementing the feature, based on gut feel
- 5 decisions were escalated back to you mid-sprint via Slack, interrupting strategic work
- 3 decisions were made by QA when they found edge cases during testing
And nobody documented which decisions were made or why.
The Real Cost
This isn't just inefficient. It's expensive:
Rework
Six months later, a different team builds "Submit Medical Records" and makes conflicting decisions about file uploads. Now you have two different upload UX patterns in the same product.
Bugs
The developer assumed "System validates the claim" meant "check if required fields are filled." They didn't realize claims over $10,000 require additional documentation per company policy. Claims get submitted, approved, and paid incorrectly.
Compliance Issues
The team didn't know HIPAA applied. The implementation logs full claim details including medical information. You discover this during a security audit.
Technical Debt
Because decisions were made ad-hoc instead of systematically, the codebase is full of inconsistent patterns. Each feature is implemented slightly differently. Onboarding new developers takes weeks longer than it should.
Opportunity Cost
You spent 2 hours in grooming, 3 hours answering mid-sprint questions, and 1 hour in a war room debugging why claims over $100,000 were failing. That's 6 hours you could have spent talking to customers or analyzing market trends.
The Traditional "Solution"
The standard advice is: "Write better specs."
Make your acceptance criteria more detailed. Document edge cases. Create flow diagrams. Write API contracts. Add performance requirements.
But this just pushes PMs further into heads-down work. You're no longer a strategist—you're a technical writer.
And even with exhaustive specs, you can't anticipate everything. Requirements evolve. Business rules change. The spec you wrote in January is outdated by March.
What If There Was a Better Way?
What if, instead of manually filling the spec gap, you could:
- Define strategic intent - "We need to let policyholders submit claims online"
- Let a Specification Factory formalize it - AI generates complete specifications with all 47 decisions documented
- Review, don't write - You spend 20 minutes reviewing and refining instead of 8 hours writing from scratch
- Let validation catch the rest - Semantic validation ensures completeness, catches ambiguity, enforces consistency with existing services
This isn't about removing human judgment. It's about amplifying it. You exercise judgment on the hard decisions (business logic, user experience, compliance) while automation handles the mechanical work (documenting edge cases, validating consistency, generating artifacts).
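To make "decisions documented" concrete, a generated specification could carry a machine-readable record per decision. This is a sketch only; the schema and field names below are invented for illustration, not a real Specification Factory format:

```python
# A hypothetical decision record emitted alongside a generated spec.
# Each of the 47 decisions gets an explicit, reviewable answer with a
# rationale, instead of living in a tech lead's head or a Slack thread.
decision = {
    "id": "CLAIM-SUBMIT-017",
    "question": "Should we prevent duplicate submissions if a user clicks twice?",
    "answer": "Yes: the client sends an idempotency key; the server deduplicates.",
    "rationale": "Duplicate claims risk duplicate payouts.",
    "owner": "product",        # who exercises judgment on this decision
    "status": "needs_review",  # the PM reviews instead of writing from scratch
}
```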
Next up: The Specification Factory - We'll show you exactly how this works and what the workflow looks like.
This is Part 2 of the "Look Up" series exploring how AI is finally freeing product managers to do their best work.