SKILL.md

Technical Specification Review

Overview

Review technical specifications through systematic analysis to surface issues before implementation. This is Stage 1 validation that must happen before planning (Stage 2). Output identifies what needs fixing, not how to build it.

Review Process

Execute these steps in order:

1. Establish Context

Read the specification completely. Understand the problem domain, stated requirements, and success criteria.

Identify the spec type:

  • API specification (endpoints, contracts, auth)
  • Product specification (features, user flows, acceptance criteria)
  • Architecture specification (components, integration points, data flows)
  • System specification (infrastructure, deployment, scaling)

Ask clarifying questions:

  • What problem is this solving?
  • Who are the end users or consumers?
  • What are the constraints (technical, business, regulatory)?
  • What is explicitly out of scope?

2. Requirements Decomposition

Break the spec into discrete, testable requirements.

Check each requirement for:

  • Specificity: Is it concrete enough to build and test?
  • Measurability: Can you verify when it's done?
  • Feasibility: Is it technically possible with stated constraints?

Flag vague requirements:

  • "Should be fast" (how fast?)
  • "User-friendly interface" (what makes it user-friendly?)
  • "Scalable architecture" (to what load, what metrics?)
  • "Secure" (what threat model, what standards?)

Expose conflated requirements: Requirements that bundle outcomes with implementation details. Example: "Use Redis for session management to improve performance" should become "Session management must support X concurrent users with <100ms latency" (the outcome), with the choice of Redis deferred as a separate implementation decision.
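A crude mechanical pass can catch the most common vague terms before the manual review. This is a minimal sketch; the term list and follow-up questions are illustrative assumptions, not a complete taxonomy:

```python
import re

# Illustrative hedge words that usually signal an untestable requirement.
# Both the patterns and the follow-up questions are assumptions for this sketch.
VAGUE_TERMS = {
    r"\bfast\b": "how fast? (latency/throughput target)",
    r"\buser-friendly\b": "what makes it user-friendly? (measurable criteria)",
    r"\bscalable\b": "to what load, measured how?",
    r"\bsecure\b": "against what threat model, to what standard?",
}

def flag_vague_requirements(requirements):
    """Return (requirement, pattern, follow-up question) for each vague match."""
    findings = []
    for req in requirements:
        for pattern, question in VAGUE_TERMS.items():
            if re.search(pattern, req, re.IGNORECASE):
                findings.append((req, pattern, question))
    return findings

findings = flag_vague_requirements([
    "Search should be fast",
    "Results must return within 200ms at p95",  # concrete: not flagged
])
```

A scan like this only surfaces candidates for the clarifying questions above; it cannot judge whether a quantified requirement is the right one.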

3. Identify Unstated Assumptions

Surface hidden assumptions that affect implementation.

Look for implicit dependencies:

  • "Users can authenticate" (assumes auth system exists)
  • "Data will be validated" (what validation rules?)
  • "Third-party API is available" (what SLA, rate limits, auth method?)

Question architectural assumptions:

  • Assumed data formats, schemas, or protocols
  • Assumed infrastructure or deployment environment
  • Assumed team capabilities or existing systems
  • Assumed performance characteristics

Validate business logic assumptions:

  • Edge cases not addressed (empty states, max limits, failures)
  • User behavior assumptions (will users actually do X?)
  • Workflow assumptions (can steps be reordered, skipped, failed?)

See references/assumption-categories.md for systematic coverage.

4. Check Completeness

Identify missing information required for implementation.

Common gaps:

  • Error handling: What happens when things fail?
  • State management: How is state tracked, persisted, recovered?
  • Data lifecycle: Creation, updates, deletion, archival
  • Integration points: How does this connect to existing systems?
  • Security model: Authentication, authorization, data protection
  • Observability: Logging, metrics, tracing, alerting
  • Performance requirements: Latency, throughput, resource limits
  • Backwards compatibility: Migration path, deprecation strategy

Domain-specific completeness checks: See references/completeness-checklist.md for category-specific checks.
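The common gaps above can be triaged with a rough keyword scan before the detailed read. A sketch, assuming a hypothetical keyword list per category; real completeness judgment still requires reading the spec:

```python
# The categories mirror the "Common gaps" list above; the keyword lists are
# rough assumptions for this sketch, not a validated taxonomy.
COMPLETENESS_CATEGORIES = {
    "error handling": ["error", "failure", "retry", "timeout"],
    "security model": ["auth", "permission", "encrypt"],
    "observability": ["log", "metric", "trace", "alert"],
}

def completeness_gaps(spec_text):
    """Return categories with no matching keyword anywhere in the spec."""
    text = spec_text.lower()
    return [cat for cat, keywords in COMPLETENESS_CATEGORIES.items()
            if not any(k in text for k in keywords)]

gaps = completeness_gaps(
    "On failure the service retries twice, then logs and alerts on-call."
)
# "error handling" and "observability" are covered; "security model" is not.
```

A keyword hit only means the topic is mentioned, not that it is specified adequately, so treat the output as a reading order, not a verdict.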

5. Detect Contradictions

Find conflicting requirements or constraints.

Types of contradictions:

  • Internal: Two requirements that cannot both be true
    • Example: "Must complete in <100ms" AND "Must make 5 sequential API calls" (if each call takes 50ms)
  • Constraint conflicts: Requirements that violate stated constraints
    • Example: "Process 10GB files" in "512MB memory environment"
  • Scope conflicts: Features that contradict "out of scope" statements
  • Temporal conflicts: Ordering requirements that cannot be satisfied
    • Example: "User sees result before computation completes"
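The internal-contradiction example above reduces to arithmetic: if the best-case total already exceeds the budget, the two requirements cannot both hold. Using the numbers from the example:

```python
def sequential_latency_conflict(budget_ms, call_count, per_call_ms):
    """Best-case total for sequential calls, and whether it busts the budget."""
    best_case = call_count * per_call_ms
    return best_case, best_case > budget_ms

best_case, conflict = sequential_latency_conflict(
    budget_ms=100, call_count=5, per_call_ms=50
)
# Best case is 250ms against a 100ms budget: the requirements contradict,
# before accounting for network overhead or processing time.
```

When a contradiction is quantitative, show the arithmetic in the finding; it turns an argument into a fact.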

6. Validate Technical Feasibility

Assess whether the spec is actually buildable.

Technical red flags:

  • Performance requirements that exceed known limitations
  • Integration with systems that don't exist or lack required APIs
  • Data transformations that lose required information
  • Security requirements without implementation mechanism
  • Scale requirements without infrastructure plan

Challenge the "should"s: When the spec says "X should work", determine whether it is mandatory or optional, and whether the hedging hides a missing requirement.
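One way to make the "should"s visible is to extract every hedged sentence for explicit triage. A minimal sketch (the sentence-splitting regex is an approximation and will miss abbreviations and unusual punctuation):

```python
import re

def extract_shoulds(spec_text):
    """Pull out sentences hedged with 'should' so each can be triaged as
    mandatory, optional, or a hidden missing requirement."""
    sentences = re.split(r"(?<=[.!?])\s+", spec_text)
    return [s for s in sentences if re.search(r"\bshould\b", s, re.IGNORECASE)]

shoulds = extract_shoulds(
    "Exports must complete within 5 minutes. "
    "The importer should handle malformed rows. "
    "Admins can disable exports."
)
```

Each extracted sentence then gets the same two questions: mandatory or optional, and what requirement is the hedge concealing?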

7. Assess Problem-Solution Fit

Question whether the spec solves the right problem.

Critical questions:

  • Does this spec address the actual user pain point?
  • Are there simpler solutions not considered?
  • Does this create new problems while solving the stated one?
  • Are there unstated goals this is trying to achieve?
  • What assumptions about users or workflows might be wrong?

This step requires judgment. Don't bikeshed, but do surface when the spec seems misaligned with stated goals.

Output Format

Structure findings to enable spec revision:

# Specification Review: [Spec Name]

## Executive Summary
[2-3 sentences: overall assessment, critical issues, recommendation]

## Critical Issues (Must Fix)
Issues that make the spec unbuildable or wrong:

### [Issue Category]
**Finding**: [What's wrong]
**Impact**: [Why this matters]
**Required**: [What needs to be added/changed]

## Significant Gaps (Should Fix)
Issues that will surface during implementation:

### [Issue Category]  
**Finding**: [What's missing]
**Impact**: [What happens if not addressed]
**Recommendation**: [What to add]

## Questions for Clarification
Ambiguities that need resolution:

1. [Question about requirement/assumption/scope]
2. [Question about technical feasibility/constraint]

## Recommendations
Structural or approach improvements:
- [Recommendation with brief rationale]

## Assessment
Ready for Planning: [Yes/No with brief reason]
Estimated Revision Effort: [Low/Medium/High]

Reference Materials

references/assumption-categories.md

Systematic taxonomy of assumption types to check: technical, business, user, integration, environmental, operational.

references/completeness-checklist.md

Domain-specific completeness checks for API specs, product specs, architecture specs, and system specs.

Notes

This is Stage 1 (Review), not Stage 2 (Planning).
Do not include implementation suggestions, task breakdowns, timelines, or technical design decisions. Those belong in planning after the spec is validated.

Be direct about problems.
Specs exist to be challenged. Surface issues clearly without sugar-coating.

Focus on prevention.
Every issue found here prevents wasted implementation effort or mid-project scope changes.