Soenneker.SemanticKernel.Pool
A high-performance, thread-safe pool implementation for Microsoft Semantic Kernel instances with built-in rate limiting capabilities.
Features
- Kernel Pooling: Efficiently manages and reuses Semantic Kernel instances
- Rate Limiting: Built-in support for request rate limiting at multiple time windows (see the configuration sketch after this list):
  - Per-second rate limiting
  - Per-minute rate limiting
  - Per-day rate limiting
  - Token-based rate limiting
- Thread Safety: Fully thread-safe implementation using concurrent collections
- Async Support: Modern async/await patterns throughout the codebase
- Flexible Configuration: Configurable rate limits and pool settings
- Resource Management: Automatic cleanup of expired rate limit windows
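
All four windows are configured per kernel registration through SemanticKernelOptions, using the same properties shown in the Usage section below. A minimal sketch with illustrative values:

```csharp
var options = new SemanticKernelOptions
{
    // Request windows: a kernel that has exhausted any window is not handed
    // out by the pool until that window rolls over
    RequestsPerSecond = 10,
    RequestsPerMinute = 100,
    RequestsPerDay = 1000,

    // Token budget per day for this registration
    TokensPerDay = 10000
};
```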
Installation
dotnet add package Soenneker.SemanticKernel.Pool
Then register the pool with your DI container:
services.AddSemanticKernelPoolAsSingleton();
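
The same singleton registration works outside ASP.NET Core via the generic host. A minimal sketch, where only AddSemanticKernelPoolAsSingleton comes from this package (the rest is standard Microsoft.Extensions.Hosting, and the using for the package's registration namespace is omitted here):

```csharp
using Microsoft.Extensions.Hosting;

HostApplicationBuilder builder = Host.CreateApplicationBuilder(args);

// Registers ISemanticKernelPool as a singleton
builder.Services.AddSemanticKernelPoolAsSingleton();

using IHost host = builder.Build();
await host.RunAsync();
```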
Extension Packages
This library has several extension packages for different AI providers:
- Soenneker.SemanticKernel.Pool.Gemini - Google Gemini integration
- Soenneker.SemanticKernel.Pool.OpenAi - OpenAI, OpenRouter.ai, and other OpenAI-compatible API integration
- Soenneker.SemanticKernel.Pool.Ollama - Ollama integration
- Soenneker.SemanticKernel.Pool.OpenAi.Azure - Azure OpenAI integration
Usage
Startup Configuration
```csharp
// In Program.cs or Startup.cs
public class Program
{
    public static async Task Main(string[] args)
    {
        var builder = WebApplication.CreateBuilder(args);

        // Add the kernel pool as a singleton
        builder.Services.AddSemanticKernelPoolAsSingleton();

        var app = builder.Build();

        // Register kernels during startup
        var kernelPool = app.Services.GetRequiredService<ISemanticKernelPool>();

        // Manually create options, or use one of the extensions mentioned above
        var options = new SemanticKernelOptions
        {
            ApiKey = "your-api-key",
            Endpoint = "https://api.openai.com/v1",
            ModelId = "gpt-4",
            KernelFactory = async (opts, _) =>
            {
                return Kernel.CreateBuilder()
                             .AddOpenAIChatCompletion(modelId: opts.ModelId!,
                                 new OpenAIClient(new ApiKeyCredential(opts.ApiKey), new OpenAIClientOptions {Endpoint = new Uri(opts.Endpoint)}))
                             .Build();
            },

            // Rate Limiting
            RequestsPerSecond = 10,
            RequestsPerMinute = 100,
            RequestsPerDay = 1000,
            TokensPerDay = 10000
        };

        await kernelPool.Register("my-kernel", options);

        // Add more registrations... order matters!

        await app.RunAsync();
    }
}
```
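
Since the pool prefers kernels in registration order and skips ones that are outside their rate limits, registering a primary endpoint first and a fallback second gives simple failover. A sketch under that assumption (BuildOptions is a hypothetical helper standing in for the SemanticKernelOptions construction shown above):

```csharp
// Hypothetical helper that builds SemanticKernelOptions exactly like the example above,
// just with a different model and rate limits per call
SemanticKernelOptions primaryOptions = BuildOptions(modelId: "gpt-4", requestsPerMinute: 100);
SemanticKernelOptions fallbackOptions = BuildOptions(modelId: "gpt-4o-mini", requestsPerMinute: 500);

// Registered first, so it is preferred while it stays within its limits
await kernelPool.Register("openai-primary", primaryOptions);

// Handed out when the primary has exhausted a rate-limit window
await kernelPool.Register("openai-fallback", fallbackOptions);
```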
Using the Pool
```csharp
public class MyService
{
    private readonly ISemanticKernelPool _kernelPool;

    public MyService(ISemanticKernelPool kernelPool)
    {
        _kernelPool = kernelPool;
    }

    public async Task ProcessAsync()
    {
        // Get an available kernel that's within its rate limits, preferring the first registered
        var (kernel, entry) = await _kernelPool.GetAvailableKernel();

        // Get the chat completion service
        var chatCompletionService = kernel.GetRequiredService<IChatCompletionService>();

        // Create a chat history
        var chatHistory = new ChatHistory();
        chatHistory.AddMessage(AuthorRole.User, "What is the capital of France?");

        // Execute chat completion
        var response = await chatCompletionService.GetChatMessageContentAsync(chatHistory);
        Console.WriteLine($"Response: {response.Content}");

        // Access rate limit information through the entry
        var remainingQuota = await entry.RemainingQuota();
        Console.WriteLine($"Remaining requests - Second: {remainingQuota.Second}, Minute: {remainingQuota.Minute}, Day: {remainingQuota.Day}");
    }
}
```
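
Because the pooled kernel is an ordinary Semantic Kernel instance, the stock streaming API works the same way. A sketch mirroring the example above (GetStreamingChatMessageContentsAsync is the standard IChatCompletionService streaming method):

```csharp
public async Task StreamAsync(ISemanticKernelPool kernelPool)
{
    var (kernel, _) = await kernelPool.GetAvailableKernel();
    var chat = kernel.GetRequiredService<IChatCompletionService>();

    var history = new ChatHistory();
    history.AddMessage(AuthorRole.User, "Write a haiku about connection pooling.");

    // Stream tokens as they arrive instead of waiting for the full completion
    await foreach (StreamingChatMessageContent chunk in chat.GetStreamingChatMessageContentsAsync(history))
    {
        Console.Write(chunk.Content);
    }
}
```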
Product | Compatible and computed target frameworks |
---|---|
.NET | net9.0 is compatible. net9.0-android, net9.0-browser, net9.0-ios, net9.0-maccatalyst, net9.0-macos, net9.0-tvos, net9.0-windows, net10.0, net10.0-android, net10.0-browser, net10.0-ios, net10.0-maccatalyst, net10.0-macos, net10.0-tvos, and net10.0-windows were computed. |

Dependencies (net9.0)
- Soenneker.SemanticKernel.Cache (>= 3.0.442)
NuGet packages (5)
Showing the top 5 NuGet packages that depend on Soenneker.SemanticKernel.Pool:
Package | Description |
---|---|
Soenneker.SemanticKernel.Pool.OpenAi | Provides OpenAI-specific registration extensions for KernelPoolManager, enabling integration with local LLMs via Semantic Kernel. |
Soenneker.SemanticKernel.Pool.OpenAi.Azure | Provides Azure OpenAI-specific registration extensions for KernelPoolManager, enabling integration with local LLMs via Semantic Kernel. |
Soenneker.SemanticKernel.Pool.Ollama | Provides Ollama-specific registration extensions for KernelPoolManager, enabling integration with local LLMs via Semantic Kernel. |
Soenneker.SemanticKernel.Pool.Gemini | Provides Gemini-specific registration extensions for KernelPoolManager, enabling integration with local LLMs via Semantic Kernel. |
Soenneker.SemanticKernel.Pool.Mistral | Provides Mistral-specific registration extensions for KernelPoolManager, enabling integration via Semantic Kernel. |
GitHub repositories
This package is not used by any popular GitHub repositories.
Version | Downloads | Last Updated |
---|---|---|
3.0.69 | 270 | 7/9/2025 |
3.0.68 | 175 | 7/9/2025 |
3.0.67 | 158 | 7/9/2025 |
3.0.66 | 128 | 7/9/2025 |
3.0.65 | 169 | 7/9/2025 |
3.0.64 | 168 | 7/8/2025 |
3.0.63 | 330 | 7/4/2025 |
3.0.62 | 265 | 7/2/2025 |
3.0.61 | 258 | 6/28/2025 |
3.0.60 | 92 | 6/28/2025 |
3.0.59 | 127 | 6/28/2025 |
3.0.58 | 136 | 6/28/2025 |
3.0.57 | 69 | 6/27/2025 |
3.0.56 | 105 | 6/27/2025 |
3.0.55 | 196 | 6/26/2025 |
3.0.54 | 161 | 6/25/2025 |
3.0.53 | 184 | 6/25/2025 |
3.0.52 | 168 | 6/25/2025 |
3.0.51 | 312 | 6/17/2025 |
3.0.50 | 358 | 6/11/2025 |
3.0.49 | 316 | 6/11/2025 |
3.0.48 | 331 | 6/11/2025 |
3.0.47 | 351 | 6/11/2025 |
3.0.46 | 309 | 6/11/2025 |
3.0.45 | 345 | 6/10/2025 |
3.0.44 | 298 | 6/3/2025 |
3.0.43 | 175 | 6/3/2025 |
3.0.42 | 180 | 6/3/2025 |
3.0.41 | 168 | 6/3/2025 |
3.0.40 | 153 | 6/3/2025 |
3.0.39 | 174 | 6/3/2025 |
3.0.38 | 180 | 6/3/2025 |
3.0.37 | 170 | 6/2/2025 |
3.0.36 | 242 | 5/28/2025 |
3.0.35 | 174 | 5/28/2025 |
3.0.34 | 183 | 5/28/2025 |
3.0.33 | 152 | 5/28/2025 |
3.0.32 | 189 | 5/27/2025 |
3.0.31 | 167 | 5/27/2025 |
3.0.30 | 143 | 5/27/2025 |
3.0.29 | 224 | 5/27/2025 |
3.0.28 | 208 | 5/26/2025 |
3.0.27 | 127 | 5/25/2025 |
3.0.26 | 124 | 5/25/2025 |
3.0.25 | 113 | 5/23/2025 |
3.0.24 | 135 | 5/23/2025 |
3.0.23 | 126 | 5/23/2025 |
3.0.22 | 121 | 5/23/2025 |
3.0.21 | 128 | 5/23/2025 |
3.0.20 | 155 | 5/23/2025 |
3.0.19 | 150 | 5/22/2025 |
3.0.18 | 147 | 5/22/2025 |
3.0.17 | 325 | 5/22/2025 |
3.0.16 | 157 | 5/21/2025 |
3.0.15 | 191 | 5/20/2025 |
3.0.14 | 183 | 5/19/2025 |
3.0.13 | 169 | 5/19/2025 |
3.0.12 | 137 | 5/19/2025 |
3.0.11 | 140 | 5/19/2025 |
3.0.10 | 159 | 5/19/2025 |
3.0.9 | 144 | 5/19/2025 |
3.0.8 | 201 | 5/19/2025 |
3.0.7 | 137 | 5/18/2025 |
3.0.6 | 151 | 5/18/2025 |
3.0.5 | 153 | 5/18/2025 |
3.0.4 | 135 | 5/18/2025 |
3.0.3 | 140 | 5/18/2025 |
3.0.2 | 139 | 5/18/2025 |
3.0.1 | 137 | 5/18/2025 |