LLMKit 3.0.0
LLMKit
LLMKit is a thread-safe .NET library that provides a unified interface for interacting with Large Language Model (LLM) providers, including OpenAI, Gemini, and DeepSeek.
Features
- Unified interface for multiple LLM providers
- Thread-safe implementation
- Built-in conversation management (stores up to 15 messages by default)
- ✨ NEW! Multimodal support (text + images)
- Fluent API for message building
- Configurable parameters (tokens, temperature, etc.)
- Comprehensive error handling with automatic retries
- Dependency injection support
- Cancellation token support
- Custom endpoint support for all providers
Installation
Package Manager
Install-Package LLMKit
.NET CLI
dotnet add package LLMKit
Clone Repository
git clone https://github.com/MohammedJayyab/LLMKit.git
Requirements
- .NET 8.0 or later
- Valid API keys for the LLM providers you plan to use
Quick Start
using LLMKit;
using LLMKit.Providers;
// Using statement ensures proper disposal
using var client = new LLMClient(
new OpenAIProvider(apiKey: "your-api-key", model: "gpt-3.5-turbo")
);
// Conversation history is automatically managed
string response = await client.GenerateTextAsync("What is the capital of France?");
Console.WriteLine(response);
Usage Examples
Conversation Management
using var client = new LLMClient(new OpenAIProvider("your-api-key", "gpt-3.5-turbo"));
// First message
string response1 = await client.GenerateTextAsync("Hello, how are you?");
Console.WriteLine(response1);
// Follow-up questions maintain conversation context automatically
string response2 = await client.GenerateTextAsync("What's the weather like?");
Console.WriteLine(response2);
// Get the formatted conversation history
string history = client.GetFormattedConversation();
Console.WriteLine(history);
// Clear the conversation if needed
client.ClearConversation();
✨ Multimodal Support (Image + Text)
// Generate a response to a message with an image
string response = await client.GenerateTextWithImageAsync(
"What can you see in this image?",
"path/to/your/image.jpg"
);
Custom Parameters
// Create client with custom parameters
using var client = new LLMClient(
provider: new OpenAIProvider("your-api-key", "gpt-3.5-turbo"),
maxTokens: 1000,
temperature: 0.7,
maxMessages: 20 // Store up to 20 messages in conversation history
);
✨ Setting a System Message
// Set or update the system message
client.SetSystemMessage("You are a helpful assistant specialized in biology.");
Dependency Injection
services.AddSingleton<ILLMProvider>(sp =>
new OpenAIProvider(
apiKey: Configuration["OpenAI:ApiKey"],
model: Configuration["OpenAI:Model"]
)
);
services.AddSingleton<LLMClient>();
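Once registered, the client can be constructor-injected like any other singleton. A minimal sketch (ChatService is an illustrative name, not part of LLMKit):

```csharp
// Hypothetical consumer class. LLMClient is resolved from the
// container; only GenerateTextAsync from the examples above is used.
public class ChatService
{
    private readonly LLMClient _client;

    public ChatService(LLMClient client) => _client = client;

    public Task<string> AskAsync(string question) =>
        _client.GenerateTextAsync(question);
}
```

Because LLMClient is thread-safe, registering it as a singleton and sharing it across services is safe; note that a shared singleton also shares one conversation history.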
Error Handling
try
{
string response = await client.GenerateTextAsync("What is the capital of France?");
Console.WriteLine(response);
}
catch (LLMException ex)
{
Console.WriteLine($"Error: {ex.Message}");
}
Cancellation
using var cts = new CancellationTokenSource(TimeSpan.FromSeconds(30));
string response = await client.GenerateTextAsync(
"What is the capital of France?",
cts.Token
);
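When the token fires, the pending call is aborted. Assuming the library propagates the standard .NET cancellation signal (an OperationCanceledException, or its TaskCanceledException subclass), a caller can distinguish a timeout from a provider error:

```csharp
using var cts = new CancellationTokenSource(TimeSpan.FromSeconds(30));
try
{
    string response = await client.GenerateTextAsync(
        "What is the capital of France?",
        cts.Token
    );
    Console.WriteLine(response);
}
catch (OperationCanceledException)
{
    // The 30-second timeout elapsed before the provider responded.
    Console.WriteLine("Request timed out.");
}
```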
Supported Providers
OpenAI
var provider = new OpenAIProvider(
apiKey: "your-api-key",
model: "gpt-3.5-turbo"
);
Gemini
var provider = new GeminiProvider(
apiKey: "your-api-key",
model: "gemini-2.0-flash"
);
DeepSeek
var provider = new DeepSeekProvider(
apiKey: "your-api-key",
model: "deepseek-chat"
);
Custom Endpoints
Each provider supports custom endpoints. If not provided, the library will use the default endpoint for that provider.
// OpenAI with custom endpoint
var client = new LLMClient(new OpenAIProvider(
apiKey: "your-api-key",
model: "gpt-3.5-turbo",
endpoint: new Uri("https://api.openai.com/v1/chat/completions")
));
// Gemini with custom endpoint
var client = new LLMClient(new GeminiProvider(
apiKey: "your-api-key",
model: "gemini-2.0-flash",
endpoint: new Uri("https://generativelanguage.googleapis.com/v1beta/models")
));
// DeepSeek with custom endpoint
var client = new LLMClient(new DeepSeekProvider(
apiKey: "your-api-key",
model: "deepseek-chat",
endpoint: new Uri("https://api.deepseek.com/v1/chat/completions")
));
Documentation 📚
For more detailed information, see the project Wiki on GitHub.
License
MIT License. See LICENSE for details.
Support
For issues or questions, please open an issue in the GitHub repository.
☕ Donate
If you find this project helpful, consider buying me a coffee to support its development.
Product | Compatible versions | Additional computed target frameworks
---|---|---
.NET | net8.0 | net8.0, net9.0, and net10.0 platform variants (android, browser, ios, maccatalyst, macos, tvos, windows); net9.0; net10.0
Dependencies (net8.0)
- System.Text.Json (>= 9.0.3)
Release Notes
Version 3.0.0:
- Added multimodal support for text and images
- Enhanced conversation management with improved message history
- Added system message configuration
- Implemented automatic retries for failed requests
- Improved thread safety and error handling
- Updated to .NET 8.0
- Added comprehensive XML documentation
- Enhanced provider implementations with better error handling