tryAGI.OpenAI 3.9.3-dev.14

This is a prerelease version of tryAGI.OpenAI.
.NET CLI:
dotnet add package tryAGI.OpenAI --version 3.9.3-dev.14

Package Manager:
NuGet\Install-Package tryAGI.OpenAI -Version 3.9.3-dev.14
This command is intended to be used within the Package Manager Console in Visual Studio, as it uses the NuGet module's version of Install-Package.

PackageReference:
<PackageReference Include="tryAGI.OpenAI" Version="3.9.3-dev.14" />
For projects that support PackageReference, copy this XML node into the project file to reference the package.

Paket CLI:
paket add tryAGI.OpenAI --version 3.9.3-dev.14

Script & Interactive:
#r "nuget: tryAGI.OpenAI, 3.9.3-dev.14"
The #r directive can be used in F# Interactive and Polyglot Notebooks. Copy this into the interactive tool or the source code of the script to reference the package.

Cake:
// Install tryAGI.OpenAI as a Cake Addin
#addin nuget:?package=tryAGI.OpenAI&version=3.9.3-dev.14&prerelease

// Install tryAGI.OpenAI as a Cake Tool
#tool nuget:?package=tryAGI.OpenAI&version=3.9.3-dev.14&prerelease

OpenAI


Features 🔥

  • Fully generated C# SDK based on the official OpenAI OpenAPI specification using AutoSDK
  • Same-day updates to support new features
  • Updated and supported automatically as long as there are no breaking changes
  • Contains a maintained list of constants such as current prices, models, and more
  • Source generator to define functions natively through C# interfaces
  • All modern .NET features - nullability, trimming, NativeAOT, etc.
  • Supports .NET Framework / .NET Standard 2.0
  • Supports all OpenAI API endpoints, including completions, chat, embeddings, images, assistants, and more
  • Regularly tested for compatibility with popular custom providers such as OpenRouter, DeepSeek, Ollama, LM Studio, and many others

Documentation

Examples and documentation can be found here: https://tryagi.github.io/OpenAI/

Usage

using var api = new OpenAiApi("API_KEY");
string response = await api.Chat.CreateChatCompletionAsync(
    messages: ["Generate five random words."],
    model: CreateChatCompletionRequestModel.Gpt4oMini);
Console.WriteLine(response); // "apple, banana, cherry, date, elderberry"

var enumerable = api.Chat.CreateChatCompletionAsStreamAsync(
    messages: ["Generate five random words."],
    model: CreateChatCompletionRequestModel.Gpt4oMini);

await foreach (string delta in enumerable) // each item is the next streamed delta content
{
    Console.WriteLine(delta);
}

It uses three implicit conversions:

  • from string to ChatCompletionRequestUserMessage. The string is always converted to a user message.
  • from ChatCompletionResponseMessage to string. The result always contains the first choice's message content.
  • from CreateChatCompletionStreamResponse to string. The result always contains the first delta's content.

You can still use the full response objects if you need more information; just replace string response with var response.
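For example, you can keep the full response object and still rely on the documented conversion for the first choice's message (a minimal sketch assembled from the snippets elsewhere in this README):

using var api = new OpenAiApi("API_KEY");

// The full response object exposes Choices and other metadata.
var result = await api.Chat.CreateChatCompletionAsync(
    messages: ["Generate five random words."],
    model: CreateChatCompletionRequestModel.Gpt4oMini);

// Implicit ChatCompletionResponseMessage -> string conversion on the first choice:
string text = result.Choices.First().Message;
Console.WriteLine(text);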

Tools

using OpenAI;
using CSharpToJsonSchema;

public enum Unit
{
    Celsius,
    Fahrenheit,
}

public class Weather
{
    public string Location { get; set; } = string.Empty;
    public double Temperature { get; set; }
    public Unit Unit { get; set; }
    public string Description { get; set; } = string.Empty;
}

[GenerateJsonSchema(Strict = true)] // false by default. You can't use parameters with default values in Strict mode.
public interface IWeatherFunctions
{
    [Description("Get the current weather in a given location")]
    public Task<Weather> GetCurrentWeatherAsync(
        [Description("The city and state, e.g. San Francisco, CA")] string location,
        Unit unit,
        CancellationToken cancellationToken = default);
}

public class WeatherService : IWeatherFunctions
{
    public Task<Weather> GetCurrentWeatherAsync(string location, Unit unit = Unit.Celsius, CancellationToken cancellationToken = default)
    {
        return Task.FromResult(new Weather
        {
            Location = location,
            Temperature = 22.0,
            Unit = unit,
            Description = "Sunny",
        });
    }
}

using var api = new OpenAiApi("API_KEY");

var service = new WeatherService();
var tools = service.AsTools().AsOpenAiTools();

var messages = new List<ChatCompletionRequestMessage>
{
    "You are a helpful weather assistant.".AsSystemMessage(),
    "What is the current temperature in Dubai, UAE in Celsius?".AsUserMessage(),
};
var model = CreateChatCompletionRequestModel.Gpt4oMini;
var result = await api.Chat.CreateChatCompletionAsync(
    messages,
    model: model,
    tools: tools);
var resultMessage = result.Choices.First().Message;
messages.Add(resultMessage.AsRequestMessage());

foreach (var call in resultMessage.ToolCalls)
{
    var json = await service.CallAsync(
        functionName: call.Function.Name,
        argumentsAsJson: call.Function.Arguments);
    messages.Add(json.AsToolMessage(call.Id));
}

// Reuse the existing locals for the follow-up request that includes the tool results.
result = await api.Chat.CreateChatCompletionAsync(
    messages,
    model: model,
    tools: tools);
resultMessage = result.Choices.First().Message;
messages.Add(resultMessage.AsRequestMessage());

> System: 
You are a helpful weather assistant.
> User: 
What is the current temperature in Dubai, UAE in Celsius?
> Assistant: 
call_3sptsiHzKnaxF8bs8BWxPo0B:
GetCurrentWeather({"location":"Dubai, UAE","unit":"celsius"})
> Tool(call_3sptsiHzKnaxF8bs8BWxPo0B):
{"location":"Dubai, UAE","temperature":22,"unit":"celsius","description":"Sunny"}
> Assistant: 
The current temperature in Dubai, UAE is 22°C with sunny weather.

Structured Outputs

using System.Text.Json;
using System.Text.Json.Serialization;
using OpenAI;

using var api = new OpenAiApi("API_KEY");

var response = await api.Chat.CreateChatCompletionAsAsync<Weather>(
    messages: ["Generate random weather."],
    model: CreateChatCompletionRequestModel.Gpt4oMini,
    jsonSerializerOptions: new JsonSerializerOptions
    {
        Converters = {new JsonStringEnumConverter()},
    });
// or (if you need trimmable/NativeAOT version)
var response = await api.Chat.CreateChatCompletionAsAsync(
    jsonTypeInfo: SourceGeneratedContext.Default.Weather,
    messages: ["Generate random weather."],
    model: CreateChatCompletionRequestModel.Gpt4oMini);

// response.Value1 contains the structured output
// response.Value2 contains the CreateChatCompletionResponse object
Weather:
Location: San Francisco, CA
Temperature: 65
Unit: Fahrenheit
Description: Partly cloudy with a light breeze and occasional sunshine.
Raw Response:
{"Location":"San Francisco, CA","Temperature":65,"Unit":"Fahrenheit","Description":"Partly cloudy with a light breeze and occasional sunshine."}

Additional code for trimmable/NativeAOT version:

[JsonSourceGenerationOptions(Converters = [typeof(JsonStringEnumConverter<Unit>)])]
[JsonSerializable(typeof(Weather))]
public partial class SourceGeneratedContext : JsonSerializerContext;

Custom providers

using OpenAI;

using var api = CustomProviders.GitHubModels("GITHUB_TOKEN");
using var api = CustomProviders.Azure("API_KEY", "ENDPOINT");
using var api = CustomProviders.DeepInfra("API_KEY");
using var api = CustomProviders.Groq("API_KEY");
using var api = CustomProviders.XAi("API_KEY");
using var api = CustomProviders.DeepSeek("API_KEY");
using var api = CustomProviders.Fireworks("API_KEY");
using var api = CustomProviders.OpenRouter("API_KEY");
using var api = CustomProviders.Together("API_KEY");
using var api = CustomProviders.Perplexity("API_KEY");
using var api = CustomProviders.SambaNova("API_KEY");
using var api = CustomProviders.Mistral("API_KEY");
using var api = CustomProviders.Codestral("API_KEY");
using var api = CustomProviders.Cerebras("API_KEY");
using var api = CustomProviders.Ollama();
using var api = CustomProviders.LmStudio();
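Whichever provider you pick, the returned client is used the same way as the OpenAiApi client in the Usage section above. A minimal sketch, assuming the chosen provider (here GitHub Models) serves the gpt-4o-mini model used earlier:

using var api = CustomProviders.GitHubModels("GITHUB_TOKEN");

string response = await api.Chat.CreateChatCompletionAsync(
    messages: ["Generate five random words."],
    model: CreateChatCompletionRequestModel.Gpt4oMini);
Console.WriteLine(response);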

Constants

All TryGetXXX methods return null if the value is not found.
There are also non-Try methods that throw an exception if the value is not found.

using OpenAI;

// You can try to get the enum from string using:
var model = CreateChatCompletionRequestModelExtensions.ToEnum("gpt-4o") ?? throw new Exception("Invalid model");

// Chat
var model = CreateChatCompletionRequestModel.Gpt4oMini;
double? priceInUsd = model.TryGetPriceInUsd(
    inputTokens: 500,
    outputTokens: 500);
double? fineTunePriceInUsd = model.TryGetFineTunePriceInUsd(
    trainingTokens: 500,
    inputTokens: 500,
    outputTokens: 500);
int? contextLength = model.TryGetContextLength(); // 128_000
int? outputLength = model.TryGetOutputLength(); // 16_000

// Embeddings
var embeddingModel = CreateEmbeddingRequestModel.TextEmbedding3Small;
int? maxInputTokens = embeddingModel.TryGetMaxInputTokens(); // 8191
double? embeddingPriceInUsd = embeddingModel.TryGetPriceInUsd(tokens: 500);

// Images
double? imagePriceInUsd = CreateImageRequestModel.DallE3.TryGetPriceInUsd(
    size: CreateImageRequestSize.x1024x1024,
    quality: CreateImageRequestQuality.Hd);

// Speech to Text
double? transcriptionPriceInUsd = CreateTranscriptionRequestModel.Whisper1.TryGetPriceInUsd(
    seconds: 60);

// Text to Speech
double? speechPriceInUsd = CreateSpeechRequestModel.Tts1Hd.TryGetPriceInUsd(
    characters: 1000);
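Because every TryGetXXX method returns a nullable value, a missing price can be handled explicitly when estimating request cost (a minimal sketch reusing the TryGetPriceInUsd overload shown above; the token counts are illustrative):

var chatModel = CreateChatCompletionRequestModel.Gpt4oMini;
double? estimatedCost = chatModel.TryGetPriceInUsd(
    inputTokens: 1_000,
    outputTokens: 250);

Console.WriteLine(estimatedCost is null
    ? "No price data is bundled for this model."
    : $"Estimated cost: ${estimatedCost.Value}");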

Support

Priority place for bugs: https://github.com/tryAGI/OpenAI/issues
Priority place for ideas and general questions: https://github.com/tryAGI/OpenAI/discussions
Discord: https://discord.gg/Ca2xhfBf3v

Acknowledgments

This project is supported by JetBrains through the Open Source Support Program.

This project is supported by CodeRabbit through the Open Source Support Program.

Compatible and additional computed target framework versions:

.NET: net8.0 and net9.0 are compatible; net5.0, net6.0, net7.0, and their platform-specific variants (windows, android, ios, maccatalyst, macos, tvos), as well as the net8.0 platform variants (including browser), were computed.
.NET Core: netcoreapp2.0 through netcoreapp3.1 were computed.
.NET Standard: netstandard2.0 is compatible; netstandard2.1 was computed.
.NET Framework: net462 is compatible; net461, net463, net47, net471, net472, net48, and net481 were computed.
MonoAndroid, MonoMac, MonoTouch: monoandroid, monomac, and monotouch were computed.
Tizen: tizen40 and tizen60 were computed.
Xamarin: xamarinios, xamarinmac, xamarintvos, and xamarinwatchos were computed.

NuGet packages (3)

Showing the top 3 NuGet packages that depend on tryAGI.OpenAI:

  • LangChain.Providers.OpenAI - OpenAI API LLM and Chat model provider.
  • Anyscale - SDK for Anyscale Endpoint that makes it easy and cheap to use LLama 2.
  • LangChain.Serve.OpenAI - LangChain Serve as an OpenAI SDK compatible API.

GitHub repositories (1)

Showing the top 1 popular GitHub repository that depends on tryAGI.OpenAI:

  • tryAGI/LangChain - C# implementation of LangChain. We try to be as close to the original as possible in terms of abstractions, but are open to new entities.

Version Downloads Last updated
3.9.3-dev.14 38 11/15/2024
3.9.3-dev.13 34 11/15/2024
3.9.3-dev.1 57 10/26/2024
3.9.2 649 10/26/2024
3.9.2-dev.7 114 10/25/2024
3.9.2-dev.6 31 10/25/2024
3.9.2-dev.4 38 10/25/2024
3.9.2-dev.1 148 10/24/2024
3.9.1 57 10/24/2024
3.8.2-dev.13 82 10/14/2024
3.8.2-dev.10 45 10/13/2024
3.8.2-dev.9 47 10/13/2024
3.8.1 891 10/9/2024
3.8.1-dev.14 63 10/2/2024
3.8.1-dev.13 50 10/2/2024
3.8.1-dev.6 42 9/28/2024
3.8.1-dev.3 47 9/28/2024
3.8.1-dev.2 43 9/26/2024
3.8.0 967 9/26/2024
3.7.1-dev.14 35 9/24/2024
3.7.1-dev.12 49 9/23/2024
3.7.1-dev.5 69 9/16/2024
3.7.1-dev.3 7,108 9/15/2024
3.7.0 309 9/14/2024
3.6.4-dev.10 54 9/14/2024
3.6.4-dev.9 50 9/14/2024
3.6.4-dev.7 42 9/13/2024
3.6.4-dev.5 54 9/9/2024
3.6.3 483 9/2/2024
3.6.2 119 9/1/2024
3.6.1 126 9/1/2024
3.5.2-dev.24 64 8/31/2024
3.5.2-dev.22 56 8/31/2024
3.5.2-dev.21 58 8/31/2024
3.5.2-dev.14 55 8/29/2024
3.5.2-dev.10 78 8/24/2024
3.5.2-dev.9 85 8/21/2024
3.5.2-dev.7 79 8/19/2024
3.5.2-dev.6 70 8/19/2024
3.5.2-dev.5 76 8/19/2024
3.5.2-dev.2 88 8/18/2024
3.5.1 734 8/18/2024
3.5.1-dev.4 72 8/18/2024
3.5.1-dev.3 80 8/18/2024
3.5.0 136 8/17/2024
3.4.2-dev.5 74 8/17/2024
3.4.2-dev.1 75 8/17/2024
3.4.1 227 8/16/2024
3.4.0 134 8/16/2024
3.3.1-dev.17 84 8/15/2024
3.3.1-dev.16 64 8/15/2024
3.3.1-dev.14 68 8/15/2024
3.3.1-dev.12 76 8/13/2024
3.3.1-dev.11 72 8/13/2024
3.3.1-dev.8 66 8/12/2024
3.3.1-dev.7 61 8/12/2024
3.3.1-dev.1 38 8/6/2024
3.3.0 294 8/6/2024
3.2.2-dev.3 34 8/6/2024
3.2.1 967 8/6/2024
3.2.1-dev.2 50 8/6/2024
3.2.1-dev.1 38 8/5/2024
3.2.0 86 8/5/2024
3.1.1-dev.1 38 8/4/2024
3.1.0 158 8/1/2024
3.0.4-dev.11 47 8/1/2024
3.0.4-dev.9 656 7/24/2024
3.0.4-dev.8 111 7/23/2024
3.0.4-dev.6 162 7/20/2024
3.0.4-dev.5 60 7/20/2024
3.0.4-dev.4 58 7/20/2024
3.0.4-dev.3 60 7/20/2024
3.0.4-dev.2 63 7/19/2024
3.0.4-dev.1 56 7/18/2024
3.0.3 155 7/18/2024
3.0.3-dev.1 50 7/18/2024
3.0.1 113 7/13/2024
3.0.0 89 7/11/2024
3.0.0-alpha.3 52 7/7/2024
3.0.0-alpha.2 50 7/6/2024
2.0.9 32,064 5/14/2024
2.0.8 143 4/29/2024
2.0.7 124 4/29/2024
2.0.6 9,309 4/22/2024
2.0.5 617 4/10/2024
2.0.4 357 4/4/2024
2.0.3 126 4/3/2024
2.0.2 202 4/3/2024
2.0.1 129 4/2/2024
2.0.0 302 3/22/2024
2.0.0-alpha.10 2,823 2/23/2024
2.0.0-alpha.9 17,801 1/27/2024
2.0.0-alpha.8 58 1/27/2024
2.0.0-alpha.7 101 1/20/2024
2.0.0-alpha.5 510 12/5/2023
2.0.0-alpha.4 187 11/16/2023
2.0.0-alpha.3 99 11/16/2023
2.0.0-alpha.2 66 11/16/2023
2.0.0-alpha.1 63 11/15/2023
2.0.0-alpha.0 63 11/15/2023
1.8.2 78,842 10/13/2023
1.8.1 446 10/11/2023
1.8.0 284 9/29/2023
1.7.2 1,650 8/15/2023
1.7.1 161 8/15/2023
1.7.0 330 8/15/2023
1.6.3 1,900 7/29/2023
1.6.1 161 7/29/2023
1.6.0 292 7/29/2023
1.5.0 547 7/13/2023
1.4.1 172 7/12/2023
1.4.0 166 7/12/2023
1.3.0 170 7/12/2023
1.2.0 588 7/5/2023
1.1.2 328 6/30/2023
1.1.1 167 6/30/2023
1.1.0 163 6/29/2023
1.0.0 177 6/28/2023
0.9.0 221 6/24/2023
0.0.0-dev.150 56 7/18/2024
0.0.0-dev 104 9/14/2024