
ADR-003: BYOK Provider Pattern

Status: Accepted
Date: 2026-05-09

Context

Embercore orchestrates LLM calls across multiple agents (Athena, Apollo, Hermes, Hephaestus). We needed to decide how users provide LLM access:

  1. Bundled API key — Ship a shared key. Simple UX, but creates billing liability, rate-limit contention, and security risk.
  2. BYOK (Bring Your Own Key) — Users supply their own API keys via environment variables. More setup, but zero liability and full user control.
  3. Self-hosted only — Require local models. Limits capability and audience.

Decision

We implement a BYOK provider pattern with a unified interface, env-based factory, and custom base URL support.

Provider interface (internal/provider/provider.go)

type Provider interface {
    Complete(ctx context.Context, messages []Message, opts Options) (*Response, error)
    Name() string
}

All agents accept a Provider via the WithProvider functional option (see ADR-008). Agents are provider-agnostic — they call Complete() without knowing which LLM backend is in use.

Factory pattern (internal/provider/factory.go)

NewFromEnv() constructs the appropriate provider from environment variables:

Variable            Purpose                                      Default
EMBERCORE_PROVIDER  Select backend (anthropic or openai)         anthropic
ANTHROPIC_API_KEY   Anthropic API authentication                 (required for anthropic)
OPENAI_API_KEY      OpenAI API authentication                    (required for openai)
EMBERCORE_MODEL     Override the default model                   Provider-specific
OPENAI_BASE_URL     Custom endpoint for OpenAI-compatible APIs   OpenAI default

Custom base URL support

The OpenAI provider accepts WithBaseURL(url) (internal/provider/openai.go), enabling OpenAI-compatible endpoints such as proxies and self-hosted inference servers.

The Anthropic provider also supports EMBERCORE_ANTHROPIC_BASE_URL for proxied or self-hosted Anthropic-compatible endpoints.
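A base URL override fits naturally into the functional-option style this ADR already uses for WithProvider. The sketch below is an assumption about how internal/provider/openai.go might look; the OpenAIProvider struct and NewOpenAI constructor are hypothetical names.

```go
package main

import "fmt"

// OpenAIProvider is a hypothetical stand-in for the real provider type.
type OpenAIProvider struct {
	apiKey  string
	baseURL string
}

// OpenAIOption mutates the provider during construction.
type OpenAIOption func(*OpenAIProvider)

// WithBaseURL points the client at an OpenAI-compatible endpoint,
// e.g. a local inference server or a corporate proxy.
func WithBaseURL(url string) OpenAIOption {
	return func(p *OpenAIProvider) { p.baseURL = url }
}

// NewOpenAI applies options over the OpenAI default endpoint.
func NewOpenAI(apiKey string, opts ...OpenAIOption) *OpenAIProvider {
	p := &OpenAIProvider{
		apiKey:  apiKey,
		baseURL: "https://api.openai.com/v1", // default when no override
	}
	for _, opt := range opts {
		opt(p)
	}
	return p
}

func main() {
	p := NewOpenAI("sk-test", WithBaseURL("http://localhost:8080/v1"))
	fmt.Println(p.baseURL)
}
```

Keeping the override an option (rather than a second constructor) means the default path stays zero-config.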

Default models

Provider   Default Model
Anthropic  claude-sonnet-4-20250514
OpenAI     Determined by EMBERCORE_MODEL (no built-in default)

Consequences

Benefits:

  - No bundled-key billing liability, rate-limit contention, or shared-secret exposure.
  - Users keep full control over their keys, spend, and model choice.
  - Agents stay provider-agnostic behind the Provider interface, so backends can be swapped without touching agent code.
  - Custom base URLs allow proxied, self-hosted, and OpenAI-compatible deployments.

Trade-offs:

  - More setup friction: users must obtain and configure their own API keys before anything works.
  - Every supported backend is a provider implementation we must maintain.

Related decisions:

  - ADR-008: functional options (the WithProvider mechanism agents use to accept a Provider).