PromptStream.AI
0.1.5
.NET CLI: dotnet add package PromptStream.AI --version 0.1.5
Package Manager: NuGet\Install-Package PromptStream.AI -Version 0.1.5
PackageReference: <PackageReference Include="PromptStream.AI" Version="0.1.5" />
Central Package Management: <PackageVersion Include="PromptStream.AI" Version="0.1.5" /> with <PackageReference Include="PromptStream.AI" />
Paket CLI: paket add PromptStream.AI --version 0.1.5
Script & Interactive: #r "nuget: PromptStream.AI, 0.1.5"
File-based apps: #:package PromptStream.AI@0.1.5
Cake Addin: #addin nuget:?package=PromptStream.AI&version=0.1.5
Cake Tool: #tool nuget:?package=PromptStream.AI&version=0.1.5
<p align="center"> <img src="https://github.com/AndrewClements84/PromptStream.AI/blob/master/assets/logo.png?raw=true" alt="PromptStream.AI" width="500"/> </p>
PromptStream.AI
Description
PromptStream.AI: token-aware prompt composition, validation, and conversational context toolkit for .NET.
Built atop Flow.AI.Core and TokenFlow.AI, PromptStream.AI enables developers to compose, validate, generate, and manage multi-turn AI prompts with token budgeting, interpolation, and contextual memory.
Key Features
- Token-aware prompt builder with variable interpolation ({{variable}} syntax)
- Validation engine for token limits, structure, and completeness
- Integrated token counting via ITokenFlowProvider
- Context manager with replay, merge, summarization, and JSON persistence
- Assistant reply tracking and multi-turn chat history
- Persistent context storage (ToJson / LoadFromJson)
- Token budgeting tools (EstimateTokenUsage, TrimToTokenBudget)
- CLI utility (PromptStream.AI.CLI) for building, validating, and generating prompts
- Seamless integration with TokenFlow.AI for model-aware tokenization
Installation
dotnet add package PromptStream.AI
Requires:
- .NET 8.0 or higher
- Flow.AI.Core
- (optional) TokenFlow.AI for model-specific token counting
Quickstart Example
using System;
using System.Collections.Generic;
using TokenFlow.AI.Integration;
using PromptStream.AI.Models;
using PromptStream.AI.Services;
// Create a token provider (TokenFlow.AI adapter)
var tokenProvider = new TokenFlowProvider("gpt-4o-mini");
// Initialize PromptStream service (with context-aware memory)
var service = new PromptStreamService(tokenProvider);
// Define a prompt template
var template = new PromptTemplate
{
    Id = "summarize-v1",
    Template = "Summarize the following:\n\n{{input}}\n\nBe concise.",
    RequiredVariables = new() { "input" }
};

// Build and validate
var variables = new Dictionary<string, string>
{
    ["input"] = "Flow.AI enables composable AI workflows for .NET developers."
};

var (instance, validation) = service.BuildAndValidate(template, variables);

if (validation.IsValid)
{
    Console.WriteLine($"Valid prompt ({validation.TokenCount} tokens)");
    Console.WriteLine(instance.RenderedText);
}
else
{
    Console.WriteLine($"Invalid: {string.Join(", ", validation.Errors)}");
}
// Add an assistant reply for multi-turn context
service.AddAssistantReply("Sure! Here's a short summary...");
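The Key Features list above also mentions token budgeting (EstimateTokenUsage, TrimToTokenBudget) and JSON persistence (ToJson / LoadFromJson). The snippet below is a minimal sketch of how those helpers might be used as a continuation of the quickstart; it assumes they are reachable from the service through a context object (the Context property here is hypothetical), so the exact types and signatures may differ from the released API.
// Sketch only: the member names come from the feature list above, but the
// Context accessor used to reach them is an assumption.
var context = service.Context;

// Estimate how many tokens the accumulated conversation currently uses.
int usedTokens = context.EstimateTokenUsage();

// Trim the oldest messages until the conversation fits a 2,000-token budget.
context.TrimToTokenBudget(2000);

// Persist the conversation to disk and restore it later.
System.IO.File.WriteAllText("context.json", context.ToJson());
context.LoadFromJson(System.IO.File.ReadAllText("context.json"));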
CLI Usage (PromptStream.AI.CLI)
PromptStream.AI now includes a lightweight command-line interface for developers to build, validate, and generate prompts directly from the terminal.
Build a prompt
dotnet run --project src/PromptStream.AI.CLI/PromptStream.AI.CLI.csproj -- build --template "Hello {{name}}" --var name=Andrew
Validate a prompt
dotnet run --project src/PromptStream.AI.CLI/PromptStream.AI.CLI.csproj -- validate --template "Summarize {{topic}}" --var topic="AI in .NET"
Generate a model response
dotnet run --project src/PromptStream.AI.CLI/PromptStream.AI.CLI.csproj -- generate --template "Explain {{concept}}" --var concept="tokenization" --save context.json
Manage conversation context
dotnet run --project src/PromptStream.AI.CLI/PromptStream.AI.CLI.csproj -- context --load context.json --summarize
Available commands:
| Command | Description |
|----------|--------------|
| build | Render a prompt with variable substitution |
| validate | Validate prompt completeness and token limits |
| generate | Build, validate, and produce a model-like response |
| context | Load, save, summarize, or clear conversation context |
Namespace Overview
| Namespace | Description |
|---|---|
| PromptStream.AI.Models | Core prompt, validation, and message models |
| PromptStream.AI.Builders | Responsible for rendering prompt templates |
| PromptStream.AI.Validation | Handles validation, token limits, and structure |
| PromptStream.AI.Services | High-level orchestrator combining build, validate, and context |
| PromptStream.AI.Context | Manages conversational memory and context tracking |
| PromptStream.AI.Extensions | Utility helpers for interpolation and cleanup |
| PromptStream.AI.CLI | Developer command-line interface for building and testing prompts |
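Because the package depends on Microsoft.Extensions.DependencyInjection.Abstractions, the PromptStream.AI.Services orchestrator can be wired into the standard .NET container. The snippet below is a minimal sketch assuming a plain factory registration and that the host application references the full Microsoft.Extensions.DependencyInjection package for ServiceCollection; the library may also provide its own registration extension, which is not shown here.
using Microsoft.Extensions.DependencyInjection;
using PromptStream.AI.Services;
using TokenFlow.AI.Integration;

var services = new ServiceCollection();

// Sketch only: a factory registration avoids guessing the exact constructor
// parameter types; swap in whatever token provider your application uses.
services.AddSingleton<PromptStreamService>(
    _ => new PromptStreamService(new TokenFlowProvider("gpt-4o-mini")));

var provider = services.BuildServiceProvider();
var promptService = provider.GetRequiredService<PromptStreamService>();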
Testing & Coverage
dotnet test --collect:"XPlat Code Coverage"
All core logic is tested using xUnit and coverlet, with full integration into Codecov for continuous coverage tracking.
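As a rough illustration of what those tests look like, here is a hedged xUnit sketch that uses only the API surface shown in the quickstart; the expectation that a missing required variable fails validation is an assumption inferred from the RequiredVariables property rather than taken from the actual test suite.
using System.Collections.Generic;
using PromptStream.AI.Models;
using PromptStream.AI.Services;
using TokenFlow.AI.Integration;
using Xunit;

public class PromptValidationTests
{
    [Fact]
    public void BuildAndValidate_FlagsMissingRequiredVariable()
    {
        var service = new PromptStreamService(new TokenFlowProvider("gpt-4o-mini"));
        var template = new PromptTemplate
        {
            Id = "greet-v1",
            Template = "Hello {{name}}",
            RequiredVariables = new() { "name" }
        };

        // No value for "name" is supplied, so validation is expected to fail.
        var (_, validation) = service.BuildAndValidate(template, new Dictionary<string, string>());

        Assert.False(validation.IsValid);
        Assert.NotEmpty(validation.Errors);
    }
}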
Roadmap
| Feature | Status |
|---|---|
| Core Models (PromptTemplate, PromptInstance, PromptValidationResult) | Done |
| PromptBuilder & Validator | Done |
| Context Manager & Message Tracking | Done |
| Context Enhancements (merge, summarize, token budgeting, JSON persistence) | Done |
| PromptStreamService orchestration layer | Done |
| Assistant reply integration | Done |
| CLI tool for build, validate, generate, context | Done |
| Token cost estimator (TokenFlow.AI integration) | Planned |
| OpenAI-style function call validation | Planned |
| Visual prompt editor in Flow.AI Studio | Planned |
Contributing
Contributions are welcome!
If youโd like to extend functionality or improve coverage, please fork the repo and submit a pull request.
See CONTRIBUTING.md for guidelines.
License
This project is licensed under the MIT License; see the LICENSE file for details.
Part of the Flow.AI Ecosystem
ยฉ 2025 Andrew Clements
Compatible and additional computed target framework versions:
| Product | Versions |
|---|---|
| .NET | net8.0 is compatible. net8.0-android was computed. net8.0-browser was computed. net8.0-ios was computed. net8.0-maccatalyst was computed. net8.0-macos was computed. net8.0-tvos was computed. net8.0-windows was computed. net9.0 was computed. net9.0-android was computed. net9.0-browser was computed. net9.0-ios was computed. net9.0-maccatalyst was computed. net9.0-macos was computed. net9.0-tvos was computed. net9.0-windows was computed. net10.0 was computed. net10.0-android was computed. net10.0-browser was computed. net10.0-ios was computed. net10.0-maccatalyst was computed. net10.0-macos was computed. net10.0-tvos was computed. net10.0-windows was computed. |
Dependencies (net8.0)
- Flow.AI.Core (>= 0.1.0)
- Microsoft.Extensions.DependencyInjection.Abstractions (>= 9.0.9)
- TokenFlow.AI (>= 0.7.8)
NuGet packages (1)
The following NuGet package depends on PromptStream.AI:
- PromptStream.AI.Integration.TokenFlow: Integration adapter connecting PromptStream.AI with TokenFlow.AI for model-aware tokenization.
GitHub repositories
This package is not used by any popular GitHub repositories.