Design Patterns
Workflow · Intermediate

PM-Developer AI Collaboration

A structured approach for Product Managers and Developers to collaborate effectively when using AI-assisted development, with clear handoffs and defined boundaries.

Overview

PM-Developer AI Collaboration establishes a framework for effective teamwork between Product Managers and Developers in AI-assisted development environments. By creating structured handoff documents with clear AI implementation boundaries, using Gherkin acceptance criteria for automated test generation, and defining explicit Human In The Loop review checkpoints, teams can maximize AI coding efficiency while maintaining product quality and alignment.

Problem

Traditional PM-Developer handoffs are not optimized for AI-assisted development, leading to common challenges:

  • Requirements documents lack the structure needed for AI to understand implementation scope
  • Unclear boundaries between what AI should implement versus what requires human judgment
  • Communication gaps between PM requirements and AI capabilities cause rework
  • No clear checkpoints for validating AI-generated code against business intent
  • Acceptance criteria formats don't support automated test generation by AI
  • PMs don't know how to structure requirements for optimal AI consumption

Solution

Implement a structured collaboration workflow designed specifically for AI-assisted development:

For Product Managers (as Context Architect):

  • Create AI-ready requirements using standardized templates — these function as Live Specs in the handbook framework
  • Mark explicit AI Implementation Boundaries in requirements (the Hybrid Engineering pillar's task routing)
  • Write acceptance criteria in Gherkin format for automated test generation via the Eval Harness
  • Define Human In The Loop review checkpoints in the workflow
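The PM responsibilities above amount to producing a structured handoff artifact. A minimal sketch of what that artifact could look like as data, assuming illustrative field names (the handbook does not prescribe a schema):

```python
from dataclasses import dataclass, field

# Hypothetical schema for an AI-ready requirements handoff.
# Field and class names here are assumptions for illustration.

@dataclass
class Scenario:
    title: str
    boundary: str   # "AI-SUITABLE" or "HUMAN-REQUIRED"
    gherkin: str    # Given/When/Then text used for test generation

@dataclass
class HandoffDoc:
    feature: str
    intent: str                                   # business intent, in the PM's words
    scenarios: list = field(default_factory=list)
    review_checkpoints: list = field(default_factory=list)  # Human In The Loop gates

doc = HandoffDoc(
    feature="User Registration",
    intent="Let new users create accounts to access personalized features",
    scenarios=[
        Scenario("Successful registration with valid email", "AI-SUITABLE",
                 "Given I am on the registration page ..."),
        Scenario("Registration with company SSO", "HUMAN-REQUIRED",
                 "Given my company has SSO configured ..."),
    ],
    review_checkpoints=["After AI test generation", "Before merge"],
)

# Completeness check before handoff: every scenario carries a boundary marker.
assert all(s.boundary in {"AI-SUITABLE", "HUMAN-REQUIRED"} for s in doc.scenarios)
```

Making the handoff structured rather than free-form is what lets downstream tooling (and the AI itself) consume it reliably.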

For Developers (as Agent Operator):

  • Review requirements for AI suitability before implementation
  • Identify components that require human judgment versus AI generation — applying the Hybrid Engineering pillar
  • Set up iterative review loops with AI as part of the Continuous Development Loop
  • Document AI-generated code for future maintainability
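The suitability review in the second bullet can start from a simple heuristic. A sketch of boundary routing, where the signal keywords are assumptions a team would tune for itself, not an official handbook list:

```python
# Illustrative heuristic for the Hybrid Engineering routing decision.
# Teams would replace HUMAN_JUDGMENT_SIGNALS with their own domain terms.

HUMAN_JUDGMENT_SIGNALS = {
    "sso", "security", "compliance", "payments", "migration", "auth",
}

def route_component(description: str) -> str:
    """Return 'HUMAN-REQUIRED' if the description touches areas that
    typically need human judgment, else 'AI-SUITABLE'."""
    words = set(description.lower().split())
    if words & HUMAN_JUDGMENT_SIGNALS:
        return "HUMAN-REQUIRED"
    return "AI-SUITABLE"

route_component("Render the registration form")    # → AI-SUITABLE
route_component("Integrate company SSO sign-in")   # → HUMAN-REQUIRED
```

A keyword heuristic is deliberately crude; its value is forcing an explicit, recorded routing decision per component rather than leaving the boundary implicit.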

For the Team:

  • Establish shared vocabulary for AI boundaries
  • Create feedback loops between PM intent and AI output
  • Build a library of successful AI implementation patterns
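The pattern library in the last bullet need not be elaborate. A minimal sketch, assuming a simple record structure (the fields are illustrative):

```python
# Minimal team pattern library: records which boundary decisions worked,
# so future handoffs can reuse them. The record shape is an assumption.

patterns = []

def record_pattern(feature: str, boundary: str, outcome: str, notes: str = ""):
    patterns.append({
        "feature": feature,
        "boundary": boundary,
        "outcome": outcome,
        "notes": notes,
    })

def successful_patterns():
    return [p for p in patterns if p["outcome"] == "success"]

record_pattern("CRUD form validation", "AI-SUITABLE", "success",
               "Gherkin criteria mapped cleanly to generated tests")
record_pattern("SSO integration", "HUMAN-REQUIRED", "success",
               "AI drafted boilerplate; developer owned the auth flow")
```

Even a flat list like this gives the team a shared, queryable memory of what AI handled well, which is the raw material for adjusting boundaries over time.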


Code Examples

Gherkin Acceptance Criteria Example
Feature: User Registration
  As a new user
  I want to create an account
  So that I can access personalized features

  # AI Implementation Boundary: AI-SUITABLE
  Scenario: Successful registration with valid email
    Given I am on the registration page
    When I enter a valid email "user@example.com"
    And I enter a password "SecurePass123!"
    And I click the "Register" button
    Then I should see a confirmation message
    And I should receive a verification email

  # AI Implementation Boundary: HUMAN-REQUIRED
  Scenario: Registration with company SSO
    Given my company has SSO configured
    When I click "Sign in with SSO"
    Then I should be redirected to my company's identity provider

Gherkin format provides clear acceptance criteria that AI can use to generate tests. Boundary markers help developers understand which scenarios AI can fully implement.
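Because the boundary markers are machine-readable comments, tooling can act on them. A sketch of a parser that groups scenarios by boundary, so a test-generation step (for example, in the Eval Harness) could skip HUMAN-REQUIRED scenarios; the marker format matches the example above, but this parser itself is an assumption, not part of the pattern:

```python
import re

# Group Gherkin scenarios by their "# AI Implementation Boundary:" marker.

FEATURE = """\
  # AI Implementation Boundary: AI-SUITABLE
  Scenario: Successful registration with valid email

  # AI Implementation Boundary: HUMAN-REQUIRED
  Scenario: Registration with company SSO
"""

def scenarios_by_boundary(text: str) -> dict:
    result, boundary = {}, None
    for line in text.splitlines():
        line = line.strip()
        m = re.match(r"# AI Implementation Boundary:\s*(\S+)", line)
        if m:
            boundary = m.group(1)
        elif line.startswith("Scenario:") and boundary:
            result.setdefault(boundary, []).append(
                line[len("Scenario:"):].strip())
            boundary = None  # each marker applies to the next scenario only
    return result

scenarios_by_boundary(FEATURE)
# → {'AI-SUITABLE': ['Successful registration with valid email'],
#    'HUMAN-REQUIRED': ['Registration with company SSO']}
```

Keeping the markers in the Gherkin source (rather than a separate document) means the boundary decision travels with the acceptance criteria and stays in sync as scenarios change.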

Considerations

Benefits
  • Faster development cycles with clear AI boundaries
  • Reduced rework from misaligned expectations
  • Better test coverage through Gherkin-based acceptance criteria
  • Improved PM-Developer communication with shared vocabulary
  • More predictable AI implementation outcomes
  • Documentation of successful patterns for team learning
  • Clearer accountability for human vs AI decisions
Challenges
  • Initial overhead of learning structured templates
  • Requires PM education on AI capabilities and limitations
  • AI boundaries may need adjustment as AI tools evolve
  • Some features don't fit neatly into boundary categories
  • Team resistance to changing established handoff processes
  • Overhead of checkpoint reviews may slow initial velocity
  • Maintaining boundary documentation as requirements change