# Quickstart with Node.js
Get your first PromptHelm response in under 60 seconds with the official Node.js SDK.
This quickstart walks you through minting an API token, installing
@prompt-helm/sdk, and making your first call from a Node.js process. By
the end you will have a working completion against a published prompt,
plus a streaming example and a short production checklist.
## Prerequisites
- Node.js 18 or newer (we test against 18, 20, and 22).
- A PromptHelm account. If you do not have one, join the waitlist.
## Create an API token

Sign in to the dashboard and open Settings → API tokens. Click New token, give it a memorable name (for example, `local-dev-quickstart`), and copy the value immediately: tokens are revealed exactly once.

> **One reveal only.** PromptHelm never stores plaintext tokens. If you lose the value, revoke the token and mint a new one.
Store the token in an environment variable so it never lands in source control:
```
# .env
PROMPTHELM_API_KEY=ph_live_your_token_here
```

## Install the SDK

Add `@prompt-helm/sdk` to your project. We officially support npm, pnpm, and yarn:

```shell
npm install @prompt-helm/sdk
```

The SDK ships ESM-first with full TypeScript types. CommonJS works via the auto-generated dual-export entrypoint.
## Create a prompt

In the dashboard, navigate to Prompts → New prompt. Give it a slug (for example, `support-triage`), pick a default model, and write the prompt body. Use `{{ variable_name }}` syntax for runtime variables.

When you click Save, PromptHelm publishes a `v1` on the `main` environment. The slug + environment combination is the contract your SDK calls will reference.

> **Learn more.** See Concepts → Prompts for the full data model: versions, environments, and promotion flows.
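To make the `{{ variable_name }}` syntax concrete, here is a minimal sketch of how placeholder interpolation behaves. The `renderTemplate` helper is purely illustrative and not part of the SDK; the real substitution happens server-side when you execute a prompt.

```typescript
// Illustrative only: a minimal renderer for {{ variable }} placeholders.
// The actual substitution is performed by PromptHelm when the prompt runs.
function renderTemplate(
  body: string,
  variables: Record<string, string>
): string {
  return body.replace(/\{\{\s*(\w+)\s*\}\}/g, (_match: string, name: string) => {
    if (!(name in variables)) {
      throw new Error(`Missing template variable: ${name}`);
    }
    return variables[name];
  });
}

const prompt = "Classify this support ticket: {{ ticket }}";
console.log(renderTemplate(prompt, { ticket: "Password reset email never arrived." }));
```

Unknown variable names throw rather than render an empty string, which mirrors the fail-fast behavior you generally want from a templating layer.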
## Make your first call

Create a small script and import the SDK. The client picks up `PROMPTHELM_API_KEY` from `process.env` automatically.

```typescript
// src/quickstart.ts
import { PromptHelm } from "@prompt-helm/sdk";

const client = new PromptHelm({
  // apiKey defaults to process.env.PROMPTHELM_API_KEY.
  // Specify it explicitly only when running outside Node (e.g. edge runtimes).
});

async function main() {
  const result = await client.execute({
    promptSlug: "support-triage",
    environment: "main",
    variables: {
      ticket: "Password reset email never arrived.",
    },
  });

  console.log("Response:", result.output);
  console.log("Cost (USD):", result.usage.costUsd);
  console.log("Latency (ms):", result.metrics.latencyMs);
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});
```

Run it:

```shell
node --env-file=.env src/quickstart.ts
```

Note that `--env-file` requires Node 20.6 or newer, and Node only runs `.ts` files directly from 22.6 onward (via type stripping). On older versions, run the script with a loader such as `tsx`, or compile it first.

You should see the model output, the per-call cost, and the round-trip latency. The same call is recorded in the dashboard's Logs view with the full request/response payload.
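The fields the script reads suggest a result shape roughly like the following. Treat this as an inferred sketch for orientation, not the SDK's published types:

```typescript
// Inferred from the fields the quickstart reads; not an official SDK type.
interface ExecuteResult {
  output: string;                  // the model's completion text
  usage: { costUsd: number };      // per-call cost in US dollars
  metrics: { latencyMs: number };  // round-trip latency
}

const example: ExecuteResult = {
  output: "Category: account-access",
  usage: { costUsd: 0.0004 },
  metrics: { latencyMs: 812 },
};
console.log(example.output);
```

Import the real types from `@prompt-helm/sdk` in application code so they stay in sync with the API.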
## Stream responses

For interactive UIs, switch to the streaming API. The SDK exposes an `AsyncIterable`, so you can `for await` chunks straight to the client.

```typescript
// src/stream.ts
import { PromptHelm } from "@prompt-helm/sdk";

const client = new PromptHelm();

const stream = await client.stream({
  promptSlug: "support-triage",
  environment: "main",
  variables: {
    ticket: "How do I rotate my API key?",
  },
});

for await (const chunk of stream) {
  if (chunk.type === "delta") {
    process.stdout.write(chunk.text);
  }
  if (chunk.type === "done") {
    process.stdout.write("\n");
    console.log("Cost (USD):", chunk.usage.costUsd);
  }
}
```

Streaming uses Server-Sent Events under the hood and works in every modern JavaScript runtime, including Workers and Edge Functions.
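If you also need the complete text after streaming finishes (for example, to cache it or log it), you can fold the chunks back together. The `Chunk` type and `collectStream` helper below are our own sketch mirroring the chunk shape used in the streaming example; they are not SDK exports.

```typescript
// Sketch: fold streamed delta chunks back into a single string.
// The Chunk shape mirrors the streaming example; it is not an SDK export.
type Chunk =
  | { type: "delta"; text: string }
  | { type: "done"; usage: { costUsd: number } };

async function collectStream(stream: AsyncIterable<Chunk>): Promise<string> {
  let full = "";
  for await (const chunk of stream) {
    if (chunk.type === "delta") full += chunk.text;
  }
  return full;
}

// Demo with a mock generator standing in for client.stream(...):
async function* mockStream(): AsyncGenerator<Chunk> {
  yield { type: "delta", text: "Rotate keys from " };
  yield { type: "delta", text: "Settings → API tokens." };
  yield { type: "done", usage: { costUsd: 0.0004 } };
}

collectStream(mockStream()).then((text) => console.log(text));
```

Because the helper only depends on `AsyncIterable`, the same code works whether the chunks come from the SDK or from a test double, as in the demo.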
## Production checklist

Before you point real traffic at PromptHelm, run through this checklist:

- Store the API token in your secrets manager. Never commit it.
- Wrap calls with error handling. Every SDK method throws a typed `PromptHelmError` with a stable `code`. Map known codes to retries or user-facing messages.
- Set timeouts. Pass a `signal: AbortSignal.timeout(15_000)` (or the SDK's built-in `timeoutMs` option) so a slow provider does not pin a Node worker.
- Pin an environment. Default `main` to production traffic and promote new prompt versions through `staging` and `dev` first.
- Watch the cost dashboard. PromptHelm tags every request with `tenantId`, `promptSlug`, and `environment` for slice-and-dice reporting.
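As a starting point for the retry and timeout items above, here is a generic wrapper sketch. The retryable `code` values are assumptions you should replace with the stable codes your own error logs show, and the backoff policy is deliberately simple:

```typescript
// Sketch: retry transient failures with exponential backoff.
// The retryable codes below are assumptions; map them to the stable
// `code` values that PromptHelmError actually reports in your logs.
const RETRYABLE = new Set(["rate_limited", "provider_timeout", "server_error"]);

const sleep = (ms: number) => new Promise((resolve) => setTimeout(resolve, ms));

async function withRetry<T>(
  fn: () => Promise<T>,
  maxAttempts = 3,
  baseDelayMs = 200
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      const code = (err as { code?: string }).code;
      if (!code || !RETRYABLE.has(code)) throw err; // not transient: fail fast
      await sleep(baseDelayMs * 2 ** attempt);      // 200ms, 400ms, 800ms, ...
    }
  }
  throw lastError;
}
```

You would wrap the quickstart call as `withRetry(() => client.execute({ ... }))`, pairing it with `signal: AbortSignal.timeout(15_000)` inside the call so each attempt is individually bounded.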