ADR-006: MCP as Integration Protocol

Status: Accepted
Date: 2026-05-09

Context

Embercore’s engine needs to be invocable by external AI assistants, IDEs, and automation tools. We considered three integration approaches:

  1. REST API — Standard HTTP endpoints. Well-understood, but requires a running server, port management, and HTTP client boilerplate.
  2. gRPC — High-performance binary protocol. Excellent for service-to-service, but heavy tooling (protobuf codegen), poor browser support, and unfamiliar to most AI tool ecosystems.
  3. MCP (Model Context Protocol) — Emerging standard for AI tool integration. Stdio-based transport, JSON-RPC messaging, tool-oriented interface. Designed specifically for LLM ↔ tool communication.

Decision

We use MCP (Model Context Protocol) as the primary integration protocol, implemented via github.com/mark3labs/mcp-go v0.51.0.

Transport: stdio (default) + HTTP (optional)

The engine binary supports two modes (main.go):

// Default: stdio MCP server (for AI assistant integration)
// With -http flag: HTTP server on -addr (default :8080)

Tool-based interface

The engine registers 8 MCP tools in buildServer():

| Tool             | Purpose                               | Handler                   |
|------------------|---------------------------------------|---------------------------|
| run_workflow     | Execute a full plan-first workflow    | tools/run_workflow.go     |
| pm_plan          | Generate a marketing plan via Athena  | tools/pm_plan.go          |
| run_research     | Execute a research task               | tools/run_research.go     |
| run_brand        | Execute brand/positioning work        | tools/run_brand.go        |
| run_ux           | Execute UX-related tasks              | tools/run_ux.go           |
| run_gtm          | Execute go-to-market tasks            | tools/run_gtm.go          |
| request_approval | Checkpoint approval gate              | tools/request_approval.go |
| assemble_plan    | Assemble outputs into final artifacts | tools/assemble_plan.go    |

Each tool is a self-contained handler that receives structured input, invokes the appropriate agent(s), and returns structured output — all over JSON-RPC.

Protocol conventions

Consequences

Benefits:

Trade-offs:

Related decisions: