Advanced Developer Guide

Stop treating requirements as text. Treat them as data. This guide covers how to use ProdMoh MCP to automate architecture decisions.

Under the Hood: Architecture

Unlike standard REST APIs, where your IDE polls for data, ProdMoh MCP uses Server-Sent Events (SSE) for a persistent, unidirectional server-to-client stream; requests from the IDE travel back over regular HTTP POSTs (see the connection lifecycle below).

Fundamentals & AI-Native Concepts

To get the most out of ProdMoh MCP, it helps to understand the AI concepts driving this technology. It is more than just an API connection; it is a new way to manage LLM Context.

1. What is MCP? (The Protocol)

The Model Context Protocol (MCP) is an open standard backed by Anthropic. It solves the "Data Silo" problem for AI.

The Old Way (Copy-Paste)

You manually copy text from a website and paste it into ChatGPT.

❌ Stale Data
❌ Limited by Clipboard
❌ No Security Boundary

The MCP Way (Context Server)

The AI "asks" the server for data only when needed.

✓ Real-time Sync
✓ Unlimited Knowledge Base
✓ Tokenized Access Control

2. AI-Native Concept: "Just-in-Time" RAG

You may know RAG (Retrieval-Augmented Generation) as "vector databases." ProdMoh MCP introduces Dynamic RAG.

  • Context Window Management: LLMs (like Claude 3.5 or GPT-4o) have a limit on how much text they can read (the context window). If you paste an entire 50-page PRD, you fill the window with noise. MCP allows the AI to fetch only the specific user story relevant to the code you are currently writing.
  • Grounding: By forcing the AI to reference a specific Resource URI (e.g., prodmoh://req/102), we "ground" the model. This significantly reduces hallucinations because the AI is restricted to facts provided by the server.
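
For example, grounding a request on a single requirement is just a resources/read call that names the URI. A minimal sketch (the TypeScript object form of the JSON-RPC message; the URI follows the example above):

// Sketch: a JSON-RPC "resources/read" request that grounds the model
// on one specific requirement instead of a whole 50-page PRD.
const readRequirement = {
  jsonrpc: "2.0",
  id: 2,
  method: "resources/read",
  params: {
    uri: "prodmoh://req/102", // only this document is pulled into context
  },
};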

3. Strategic Advantages for Teams

Model Agnostic
Because MCP is a standard, your ProdMoh integration works seamlessly whether your team uses Claude, OpenAI, or open-source models via Ollama. You don't need to build separate integrations for each AI tool.
The "Source of Truth" Pipeline
In traditional development, the "truth" gets lost in Slack threads and verbal agreements. With MCP, the code generation pipeline is wired directly to the signed-off requirement document: if the document changes, the generated code changes with it.

The MCP Primitives

To understand how ProdMoh interacts with your IDE, we must define the three atomic primitives of the Model Context Protocol. In Computer Science terms, these represent State (Resources), Execution (Tools), and Instruction (Prompts).

1. Resources (Passive Context)

Definition: Data that can be read by the client (IDE). Think of these like GET endpoints or file handles.

In ProdMoh: Every User Story, PRD, and Research Document is exposed as a Resource.

URI Scheme: prodmoh://stories/{id}/read
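
As a rough sketch, the resource descriptors returned by the server (see the JSON-RPC response later in this guide) can be modeled with a type like this; the field names mirror that response:

// Shape of a ProdMoh resource descriptor as it appears in resources/list.
interface ProdMohResource {
  uri: string;          // e.g. "prodmoh://stories/42/read"
  name: string;         // human-readable title shown in the IDE picker
  mimeType: string;     // "text/markdown" for PRDs and stories
  description?: string; // short summary, optional
}
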
2. Tools (Executable Functions)

Definition: Functions the AI can call to perform an action or retrieve dynamic calculations. This is often called "Tool Use" or "Function Calling" in LLM literature.

In ProdMoh: We expose tools to search your knowledge base dynamically.

Function: search_requirements(query: string)
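
On the wire, invoking this tool is a standard JSON-RPC tools/call message. A minimal sketch (the query argument matches the signature above; the id is arbitrary):

// Sketch: calling the search_requirements tool over JSON-RPC.
const searchCall = {
  jsonrpc: "2.0",
  id: 3,
  method: "tools/call",
  params: {
    name: "search_requirements",
    arguments: { query: "checkout flow" },
  },
};
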
3. Prompts (Templated Context)

Definition: Pre-defined instructions that orchestrate complex interactions between the user, the context, and the model.

In ProdMoh: We provide "System Prompts" that automatically format your PRDs into code-ready instructions (e.g., "Scaffold Component from PRD").
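
Prompts are retrieved with the prompts/get method. The sketch below uses a hypothetical prompt name and argument; ask the server for the real identifiers via prompts/list:

// Sketch: requesting the "Scaffold Component from PRD" prompt.
// The name and argument below are illustrative, not guaranteed identifiers.
const scaffoldPrompt = {
  jsonrpc: "2.0",
  id: 4,
  method: "prompts/get",
  params: {
    name: "scaffold-component-from-prd", // hypothetical
    arguments: { prdId: "102" },          // hypothetical
  },
};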

Protocol Anatomy: JSON-RPC

At the transport layer, MCP uses JSON-RPC 2.0. This stateless, lightweight protocol allows the IDE and the ProdMoh Server to exchange messages over the SSE stream.

When you ask your IDE to read a requirement, the following handshake occurs on the wire:

1. The Request (Client → Server)

The IDE requests a list of available resources to populate its context menu.

{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "resources/list",
  "params": {}
}

2. The Response (Server → Client)

ProdMoh responds with a structured list of your documents.

{
  "jsonrpc": "2.0",
  "id": 1,
  "result": {
    "resources": [
      {
        "uri": "prodmoh://prd/102/checkout-flow",
        "name": "PRD: Checkout Flow V2",
        "mimeType": "text/markdown",
        "description": "Updated requirements for stripe integration"
      }
    ]
  }
}
Why this matters: Because we expose a standard mimeType (text/markdown), the AI model immediately understands how to parse the structure (headers, bullet points) of your document without any custom parsers on your machine.
The Connection Lifecycle
  1. Handshake: Your IDE initiates a connection to /mcp/sse via GET.
  2. Transport: ProdMoh upgrades the connection to an EventStream.
  3. Query: When you type @ProdMoh, the IDE sends a JSON-RPC payload via POST to /mcp/messages.
  4. Context Injection: The server parses the specific PRD section relevant to your query and injects it directly into the LLM's system prompt context window.
Why SSE? This allows us to push updates instantly. If a PM changes an Acceptance Criterion while you are coding, your next prompt will immediately reflect that change without re-fetching.
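
Your IDE performs all of these steps automatically. For illustration only, the sketch below walks the same lifecycle by hand, assuming the https://prodmoh.com/sse URL from the Configuration section for the stream and the /mcp/messages path from step 3 for requests (Node 18+, run as an ES module):

const TOKEN = process.env.PRODMOH_TOKEN ?? "YOUR_TOKEN_HERE";

// Steps 1 and 2: handshake + transport. The server keeps this response
// open and pushes updates as SSE events.
const stream = await fetch("https://prodmoh.com/sse", {
  headers: { Accept: "text/event-stream", "x-prodmoh-token": TOKEN },
});
console.log("SSE stream open:", stream.ok);

// Step 3: queries are JSON-RPC payloads POSTed to the messages endpoint.
const reply = await fetch("https://prodmoh.com/mcp/messages", {
  method: "POST",
  headers: { "Content-Type": "application/json", "x-prodmoh-token": TOKEN },
  body: JSON.stringify({ jsonrpc: "2.0", id: 1, method: "resources/list", params: {} }),
});

// Step 4: the result is what the IDE injects into the LLM's context.
console.log(await reply.json());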

Configuration

Add this configuration to your global or project-level settings file.

File Path: ~/.cursor/mcp.json or ~/.vscode/mcp.json

{
  "mcpServers": {
    "prodmoh": {
      "type": "sse",
      "url": "https://prodmoh.com/sse",
      "headers": {
        "x-prodmoh-token": "YOUR_TOKEN_HERE"
      }
    }
  }
}

ChatGPT Custom GPT Integration

Connect ProdMoh to your Custom GPT in the ChatGPT GPT Store. This allows ChatGPT to access your PRDs, user stories, and product context during conversations.

✨ Option 1: OAuth (Recommended)

Users authenticate seamlessly via ProdMoh login. No manual API key copying needed.

For GPT Developers

In the GPT Builder → Configure → Actions → Authentication:

  1. Select OAuth
  2. Client ID: chatgpt-prodmoh
  3. Authorization URL: https://prodmoh.com/oauth/authorize
  4. Token URL: https://prodmoh.com/oauth/token
  5. Scope: read
User Experience: When users install your GPT, they'll see a "Login with ProdMoh" popup, authenticate, and automatically be connected.
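
Under the hood this is a standard OAuth 2.0 authorization-code exchange against the URLs above. ChatGPT performs it for you; the sketch below only shows the shape of the token request, and the code / redirect_uri values (plus any client secret your setup requires) are placeholders based on generic OAuth, not documented ProdMoh values:

// Sketch: the token exchange ChatGPT performs after the user logs in.
const tokenResponse = await fetch("https://prodmoh.com/oauth/token", {
  method: "POST",
  headers: { "Content-Type": "application/x-www-form-urlencoded" },
  body: new URLSearchParams({
    grant_type: "authorization_code",
    client_id: "chatgpt-prodmoh",
    code: "AUTH_CODE_FROM_REDIRECT",               // placeholder
    redirect_uri: "CALLBACK_URL_FROM_GPT_BUILDER", // placeholder
  }),
});
const { access_token } = await tokenResponse.json();
console.log("Received access token:", Boolean(access_token));
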
Option 2: API Key (Manual)

Users manually copy their API key from the ProdMoh dashboard.

Step 1 — Generate Your API Key

  1. Open your ProdMoh Dashboard
  2. Click IDE Assistant in the sidebar
  3. Click Generate Access Token
  4. Copy the ChatGPT API Key shown at the top

Step 2 — Configure ChatGPT Actions

In the GPT Builder:

  1. Go to Configure → Actions
  2. Click Authentication
  3. Select API Key
  4. Set Auth Type to Bearer
  5. Paste your API key from ProdMoh

Step 3 — Add the OpenAPI Schema

In the Actions section, add this server URL:

https://prodmoh.com
OAuth vs API Key
OAuth is recommended for GPTs you'll publish — users get a seamless login experience.
API Key is faster for personal testing — just copy and paste your key.

Available Actions for ChatGPT

Once connected, your Custom GPT can:

  • Search your requirements and user stories by keyword
  • Read full PRDs, user stories, and research documents
  • Pull the specific product context it needs to answer questions during a conversation

Step-by-Step: Connect ProdMoh MCP to Your IDE

This guide walks you through installing, configuring, and verifying your MCP connection so your IDE can fetch PRDs, user stories, and acceptance criteria in real-time.

Step 1 — Get Your MCP Token

1. Open your ProdMoh Dashboard
2. Go to Share with Developers
3. Click Enable IDE Assistant
4. Copy the token (keep it private)

Step 2 — Create Your MCP Config File

Create or update the following file:

~/.cursor/mcp.json
# or
~/.vscode/mcp.json

Add this content:

{
  "mcpServers": {
    "prodmoh": {
      "type": "sse",
      "url": "https://prodmoh.com/sse",
      "headers": {
        "x-prodmoh-token": "YOUR_TOKEN_HERE"
      }
    }
  }
}
Step 3 — Restart Your IDE

MCP servers are loaded on startup.
After editing mcp.json:

  • VS Code → Developer: Reload Window
  • Cursor → CMD + R
  • Zed, Windsurf, etc. → Restart
Step 4 — Test the Connection

Inside your IDE, open the Command Palette:

  • @ProdMoh list resources
  • @ProdMoh search "checkout flow"
  • @ProdMoh read prd 102

You should now see:

Connected to ProdMoh MCP (SSE)
Resources Loaded:  ✓ PRDs, ✓ User Stories, ✓ Research Docs
Step 5 — Use It Inside Your Code

Start coding with PRD-aware guidance. Examples:

// Generate a Zod schema
@ProdMoh Read the Signup PRD. Build a Zod schema for POST /register.

// Generate a DB model
@ProdMoh Analyze "Data Requirements" section. Create a Prisma model for Order.

// Improve tests
@ProdMoh Audit my test file payment.test.ts using PRD 145.

ProdMoh will fetch only the relevant doc fragments and inject them as grounded context into your LLM.

That's it! Your IDE is now fully connected to the ProdMoh MCP server using SSE, enabling real-time PRD-aware development and automated system design.

Advanced: Generating DB Schemas

One of the highest-leverage uses of MCP is translating business data requirements directly into database models. This reduces the risk of field mismatch.

The Workflow

1. Ensure your PRD contains a section on "Data Requirements" or "User Attributes".

2. Use the following prompt pattern:

User: @ProdMoh Reference the "User Profile" PRD. Generate the Prisma Schema (schema.prisma) for the User model. Ensure all field constraints (unique, optional, defaults) match the document exactly.

The Output (Example):

model User {
  id        String   @id @default(uuid())
  email     String   @unique // Derived from PRD Req #2.1
  role      Role     @default(USER)
  // PRD Req #2.4: "Users must have a bio, max 500 chars"
  bio       String?  @db.VarChar(500) 
  createdAt DateTime @default(now())
}

Advanced: API Contracts

Prevent frontend-backend desync by generating Zod schemas or TypeScript interfaces directly from the acceptance criteria.

Prompt Engineering for Zod

User: @ProdMoh Read the "Signup Flow" requirements. Create a Zod validation schema for the POST /register endpoint. Pay attention to password complexity rules defined in the "Security" section.

The MCP server will retrieve the specific regex rules for passwords defined by your PM and enforce them in the generated code:

import { z } from "zod";

const RegisterSchema = z.object({
  email: z.string().email(),
  // Rules pulled from ProdMoh Doc #102
  password: z.string()
    .min(12, "Password must be 12 chars per Sec-Req-01")
    .regex(/[A-Z]/, "Must contain uppercase per Sec-Req-02"),
});
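
A quick usage sketch for the generated schema (a plain function; any framework's route handler would call it the same way):

// Sketch: validating an incoming request body with the generated schema.
export function handleRegister(body: unknown) {
  const parsed = RegisterSchema.safeParse(body);
  if (!parsed.success) {
    // Each issue carries the PRD-derived message, e.g. Sec-Req-01.
    return { status: 400, errors: parsed.error.issues.map((i) => i.message) };
  }
  return { status: 201, email: parsed.data.email };
}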

Advanced TDD & Mocking

Moving beyond simple unit tests, ProdMoh MCP can orchestrate complex testing scenarios by understanding the intent behind your requirements.

1. The "QA Critic" Workflow

Before you commit, ask the MCP agent to audit your test suite against the PRD. It acts as an automated QA engineer.

User: @ProdMoh Read "Payment Flow Requirements". Now look at my current file payment.test.ts. List 3 edge cases defined in the PRD that I have NOT covered in these tests.

Common catch: "You tested for successful payments, but the PRD Section 4.2 specifies that a '402 Payment Required' error must occur if the card balance is below $10. Your test suite misses this."
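
Turning that catch into a real test might look like the sketch below (Vitest syntax; processPayment and its return shape are hypothetical stand-ins for your own payment module):

// Sketch: covering the edge case from PRD Section 4.2.
import { describe, expect, it } from "vitest";
import { processPayment } from "./payment"; // hypothetical module under test

describe("payment edge cases (PRD 4.2)", () => {
  it("returns 402 Payment Required when the card balance is below $10", async () => {
    const result = await processPayment({ amount: 25, cardBalance: 9.5 });
    expect(result.status).toBe(402);
  });
});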

2. Generating Realistic Mock Data

Hardcoding test data is brittle. Use MCP to generate Mock Data Factories that strictly adhere to your data validation rules.

// Prompt: "@ProdMoh Generate a Faker.js factory for the 'Order' object based on PRD constraints."

import { faker } from "@faker-js/faker";

export const createMockOrder = () => ({
  id: faker.string.uuid(),
  // PRD Req: "Order total cannot be negative"
  total: faker.number.float({ min: 0.01, max: 9999 }), 
  // PRD Req: "Status must be PENDING, SHIPPED, or DELIVERED"
  status: faker.helpers.arrayElement(['PENDING', 'SHIPPED', 'DELIVERED']),
  items: []
});

3. Integration Test Scenarios

For end-to-end (E2E) testing with tools like Playwright or Cypress, you can generate the entire user journey script.

Prompt Pattern

@ProdMoh Read the "New User Onboarding" User Story. Generate a Cypress E2E test file that steps through the entire flow described in the "Happy Path" section.
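
The generated spec depends on your PRD, but the shape is roughly the sketch below (routes, selectors, and copy are placeholders, not real ProdMoh output):

// Sketch: Cypress spec shape for the "New User Onboarding" happy path.
describe("New User Onboarding - happy path", () => {
  it("walks a new user from signup to the dashboard", () => {
    cy.visit("/signup"); // placeholder route
    cy.get("input[name=email]").type("new.user@example.com");
    cy.get("input[name=password]").type("CorrectHorse!12");
    cy.contains("button", "Create account").click();
    cy.url().should("include", "/onboarding");
    cy.contains("Welcome").should("be.visible");
  });
});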

Security & Token Management

Your ProdMoh MCP token is a sensitive credential. It grants read access to your organization's internal product documentation. Treat it like an API Key.

Best Practices

  • Keep the token out of version control: don't commit an mcp.json that contains a real token to a shared repository.
  • Share it only through a secrets manager or password vault, never over chat or email.
  • Regenerate it periodically and immediately after any suspected leak (see below).

Revoking Access

If a token is compromised or a developer leaves the team, you can secure your data instantly:

Action: Regenerate Token
Go to the ProdMoh Dashboard > "Share with Developers" > Click Regenerate. This immediately invalidates the old token and severs all active IDE connections.

Troubleshooting

Issue: Connection Refused

Cause: Invalid Token or Firewall.
Fix: Regenerate token in dashboard. Ensure port 443 is open.
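
Before regenerating, you can check whether the server is reachable and the token is accepted from a terminal with a short script (a sketch that only assumes the /sse endpoint and header from the Configuration section):

// Sketch: quick reachability and token check (Node 18+, ES module).
const res = await fetch("https://prodmoh.com/sse", {
  headers: {
    Accept: "text/event-stream",
    "x-prodmoh-token": process.env.PRODMOH_TOKEN ?? "YOUR_TOKEN_HERE",
  },
});
console.log(res.ok ? "Reachable, token accepted." : `HTTP ${res.status}: check token and firewall.`);
await res.body?.cancel(); // close the stream so the script exits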

Issue: "Context Empty"

Cause: PRD is empty or lacks text content.
Fix: Ensure the document has saved content in the ProdMoh editor.

Restart Required: If you update your mcp.json config file, you typically need to restart the IDE (or reload window) for the changes to take effect.