PromptHelm Docs

Welcome to PromptHelm

An overview of the PromptHelm prompt management platform — what it is, who it's for, and how to get started.

PromptHelm is the prompt management platform built for LLM engineering teams. It treats prompts as versioned code artifacts with main / staging / dev branches, routes traffic across every major provider through a single gateway, and surfaces cost, latency, and cache-hit ratio as first-class metrics.
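The "prompts as versioned code artifacts" idea can be sketched as a tiny data model. This is an illustrative sketch only, not PromptHelm's actual SDK: the names `Prompt`, `PromptVersion`, `commit`, and `promote` are assumptions made up for this example.

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class PromptVersion:
    version: int
    template: str

@dataclass
class Prompt:
    name: str
    # One pointer per environment, mirroring the main / staging / dev branches.
    branches: dict = field(default_factory=lambda: {"main": None, "staging": None, "dev": None})
    history: list = field(default_factory=list)

    def commit(self, template: str, branch: str = "dev") -> PromptVersion:
        """Record a new immutable version and point the branch at it."""
        v = PromptVersion(version=len(self.history) + 1, template=template)
        self.history.append(v)
        self.branches[branch] = v
        return v

    def promote(self, src: str, dst: str) -> None:
        """Move a branch pointer forward, e.g. staging -> main."""
        self.branches[dst] = self.branches[src]

p = Prompt(name="support-triage")
p.commit("Classify this ticket: {ticket}")  # new versions land on dev first
p.promote("dev", "staging")
p.promote("staging", "main")
```

The point is that environments are just pointers into an append-only version history, so "deploying" a prompt is a pointer move rather than a text edit.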

Who PromptHelm is for

  • Engineering teams shipping LLM features to real users.
  • Platform teams who need to consolidate provider keys, audit trails, and cost reporting in one place.
  • Builders who want a prompt history more rigorous than "git blame app/prompts.ts".


Architecture at a glance

PromptHelm is a multi-tenant SaaS made of three independent surfaces:

  1. Control plane — manages prompts, versions, environments, provider keys, and analytics. Authenticated via short-lived JWTs.
  2. Gateway — a streaming proxy that dispatches completions to OpenAI, Anthropic, Gemini, or DeepSeek. Authenticated with API tokens.
  3. Dashboards & SDKs — the web app at app.prompthelm.app and the official Node and Python SDKs that wrap the control plane and gateway.
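The gateway's job in step 2 can be pictured as a routing table from model names to upstream providers. A minimal sketch, assuming prefix-based routing; the model prefixes and base URLs here are illustrative assumptions, not PromptHelm's real gateway configuration.

```python
# Hypothetical routing table: model-name prefix -> upstream base URL.
PROVIDERS = {
    "gpt": "https://api.openai.com/v1",
    "claude": "https://api.anthropic.com/v1",
    "gemini": "https://generativelanguage.googleapis.com/v1beta",
    "deepseek": "https://api.deepseek.com/v1",
}

def route(model: str) -> str:
    """Pick the upstream provider from the requested model's name prefix."""
    for prefix, base_url in PROVIDERS.items():
        if model.startswith(prefix):
            return base_url
    raise ValueError(f"no provider for model {model!r}")
```

In practice the gateway also attaches the tenant's stored provider key and streams the response back, but the dispatch decision itself is no more than a lookup like this.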

Private beta

PromptHelm is currently invite-only. Join the waitlist for early access — we're rolling invites out weekly.
