Quickstart with .NET (C#)
Call your PromptHelm prompts from any .NET application.
This quickstart walks you through minting an API token, installing the
PromptHelm .NET SDK, and making your first call from C#. By the end
you will have a working completion, an IAsyncEnumerable streaming
example, and a checklist for shipping to ASP.NET Core or any other
.NET host.
SDK status
The .NET SDK is in pre-release. The PromptHelm.Sdk NuGet
package is being staged on nuget.org; track
Runivox/prompt-helm-sdk-dotnet
for the release announcement. The public API below is stable.
Prerequisites
- .NET 8 SDK or newer (the package multi-targets `net8.0` and `netstandard2.0`).
- A PromptHelm account. Join the waitlist if you need an invite.
Create an API token
Sign in to the dashboard and open Settings → API tokens. Click New token, name it (e.g.
dotnet-service-dev), and copy the value immediately — tokens are revealed exactly once.

Store the token in user-secrets for local development, and in your secrets manager (Azure Key Vault, AWS Secrets Manager, HashiCorp Vault) for staging and production:
```
dotnet user-secrets init
dotnet user-secrets set "PromptHelm:ApiKey" "ph_live_your_token_here"
```

Install the SDK
Add the package via the .NET CLI, the Visual Studio NuGet UI, or by editing your `.csproj`.

```
dotnet add package PromptHelm.Sdk
```

Create a prompt
In the dashboard, navigate to Prompts → New prompt. Give it a slug (for example,
welcome), pick a default model, and use `{{ variable_name }}` syntax for runtime variables. Saving publishes `v1` on the `main` environment.

Learn more
See Concepts → Prompts for versions, environments, and promotion semantics.
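As an illustration, a prompt body for the `welcome` slug might use the variable syntax like this (the template wording is made up for this example; only the `{{ name }}` placeholder matters):

```
Write a short, friendly welcome message for a new user named {{ name }}.
```

At execution time, the SDK substitutes each `{{ variable_name }}` placeholder with the matching entry from the `Variables` dictionary you pass in the request.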
Make your first call
Build the client with a configuration object and `await` the first response. The example below reads the API key from the process environment for local runs; in production, swap to your DI-managed configuration provider.

Program.cs

```csharp
using PromptHelm.Sdk;

var ph = new PromptHelmClient(new PromptHelmConfig
{
    ApiKey = Environment.GetEnvironmentVariable("PROMPTHELM_API_KEY")!,
});

var response = await ph.ExecuteAsync(new ExecuteRequest
{
    PromptSlug = "welcome",
    Variables = new Dictionary<string, string> { ["name"] = "World" },
});

Console.WriteLine(response.Output);
```

The call appears in the dashboard's Logs view with the full request/response payload, the per-call cost, and the round-trip latency.
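If you stored the key with `dotnet user-secrets` as described above, a local run can resolve it through the standard .NET configuration stack instead of a raw environment variable. This is a sketch, assuming the `Microsoft.Extensions.Configuration.UserSecrets` package is installed and reusing the `PromptHelmClient` types from the example above:

```csharp
using Microsoft.Extensions.Configuration;
using PromptHelm.Sdk;

// Layer user-secrets over environment variables so the same
// "PromptHelm:ApiKey" key resolves both locally and in containers
// (where it maps from the PromptHelm__ApiKey environment variable).
var config = new ConfigurationBuilder()
    .AddEnvironmentVariables()
    .AddUserSecrets<Program>()
    .Build();

var ph = new PromptHelmClient(new PromptHelmConfig
{
    ApiKey = config["PromptHelm:ApiKey"]!,
});
```

`AddUserSecrets<Program>()` works because `dotnet user-secrets init` writes a `UserSecretsId` into the project file; in production you would instead bind the same configuration key from your secrets manager.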
Stream responses
For chat-style endpoints, the SDK exposes an `IAsyncEnumerable<StreamEvent>` that integrates with `await foreach`, ASP.NET Core minimal APIs, and SignalR hubs out of the box.

Stream.cs

```csharp
await foreach (var ev in ph.StreamAsync(new ExecuteRequest
{
    PromptSlug = "welcome",
    Variables = new Dictionary<string, string> { ["name"] = "World" },
}))
{
    switch (ev)
    {
        case ChunkEvent chunk:
            Console.Write(chunk.Content);
            break;
        case DoneEvent done:
            Console.WriteLine($"\n{done.TotalTokens} tokens, ${done.Cost}");
            break;
        case ErrorEvent err:
            Console.Error.WriteLine($"Error {err.ErrorCode}: {err.Message}");
            break;
    }
}
```

Pass a `CancellationToken` (for example `HttpContext.RequestAborted`) into `StreamAsync` so the SSE connection closes when the caller disconnects.

Production checklist
Before you point real traffic at PromptHelm, run through this checklist:
- Register the client through DI. In ASP.NET Core, prefer the built-in extension over `new PromptHelmClient(...)`:

  ```csharp
  builder.Services.AddPromptHelm(opts =>
      opts.ApiKey = builder.Configuration["PromptHelm:ApiKey"]!);
  ```

  This wires up a singleton with a pooled `HttpClient` and graceful shutdown.
- Propagate `CancellationToken` everywhere. Pass `HttpContext.RequestAborted` (or the worker's stopping token) to every `ExecuteAsync`/`StreamAsync` call so cancellation reaches the SDK.
- Catch typed errors. Every call throws `PromptHelmException` with a stable `ErrorCode`. Map known codes to retries or user-facing responses; let unknown codes bubble to your error reporter.
- Source the API key from configuration. Use user-secrets in development, environment variables in containers, and a managed secrets store in production. Never check the key into source control.
- Pin an environment. Default `main` to production traffic and promote new prompt versions through `staging` and `dev` first.
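Put together, the DI, cancellation, and error-handling points above might look like the following minimal API endpoint. This is a sketch, not a definitive implementation: the `/welcome/{name}` route and the `"rate_limited"` error code are made up for illustration, and it assumes `AddPromptHelm` registers `PromptHelmClient` for injection and that `ExecuteAsync` accepts a `CancellationToken`:

```csharp
using PromptHelm.Sdk;

var builder = WebApplication.CreateBuilder(args);

// DI registration from the checklist: singleton client, pooled HttpClient.
builder.Services.AddPromptHelm(opts =>
    opts.ApiKey = builder.Configuration["PromptHelm:ApiKey"]!);

var app = builder.Build();

// The route is illustrative; RequestAborted flows cancellation into the
// SDK when the caller disconnects.
app.MapGet("/welcome/{name}", async (string name, PromptHelmClient ph, HttpContext ctx) =>
{
    try
    {
        var response = await ph.ExecuteAsync(new ExecuteRequest
        {
            PromptSlug = "welcome",
            Variables = new Dictionary<string, string> { ["name"] = name },
        }, ctx.RequestAborted);

        return Results.Ok(new { message = response.Output });
    }
    catch (PromptHelmException ex) when (ex.ErrorCode == "rate_limited") // hypothetical code
    {
        // Known code: surface a retryable status; unknown codes bubble
        // up to your error reporter via the default exception handler.
        return Results.StatusCode(StatusCodes.Status429TooManyRequests);
    }
});

app.Run();
```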