Atc.SemanticKernel.Connectors.Ollama 1.0.35

.NET CLI:
dotnet add package Atc.SemanticKernel.Connectors.Ollama --version 1.0.35

Package Manager Console (Visual Studio):
NuGet\Install-Package Atc.SemanticKernel.Connectors.Ollama -Version 1.0.35
This command is intended to be used within the Package Manager Console in Visual Studio, as it uses the NuGet module's version of Install-Package.

PackageReference:
<PackageReference Include="Atc.SemanticKernel.Connectors.Ollama" Version="1.0.35" />
For projects that support PackageReference, copy this XML node into the project file to reference the package.

Paket CLI:
paket add Atc.SemanticKernel.Connectors.Ollama --version 1.0.35

Script & Interactive:
#r "nuget: Atc.SemanticKernel.Connectors.Ollama, 1.0.35"
The #r directive can be used in F# Interactive and Polyglot Notebooks. Copy this into the interactive tool or into the source code of the script to reference the package.

Cake:
// Install Atc.SemanticKernel.Connectors.Ollama as a Cake Addin
#addin nuget:?package=Atc.SemanticKernel.Connectors.Ollama&version=1.0.35

// Install Atc.SemanticKernel.Connectors.Ollama as a Cake Tool
#tool nuget:?package=Atc.SemanticKernel.Connectors.Ollama&version=1.0.35

Introduction

This repository contains various connectors and plugins for integrating Large Language Models (LLMs) via Semantic Kernel.

Table of Contents

SemanticKernel Connectors

Atc.SemanticKernel.Connectors.Ollama


The Atc.SemanticKernel.Connectors.Ollama package contains a connector for integrating with Ollama.

It supports the following capabilities by implementing these Semantic Kernel interfaces: IChatCompletionService, ITextGenerationService, and ITextEmbeddingGenerationService.

Note: Embedding generation is marked as experimental in Semantic Kernel.

Wire-Up Using KernelBuilder/ServiceCollection Extensions

To seamlessly integrate Ollama services into your application, you can utilize the provided KernelBuilder and ServiceCollection extension methods. These methods simplify the setup process and ensure that the Ollama services are correctly configured and ready to use within your application's service architecture.

Both methods ensure that the Ollama services are added to the application's service collection and configured according to the specified parameters, making them available throughout your application via dependency injection.

The configuration examples below use the application's settings (typically defined in appsettings.json) to configure each Ollama service with the appropriate endpoint and model identifier.
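The code below reads the keys Ollama:Endpoint and Ollama:Model from configuration. A matching appsettings.json fragment could look like the following sketch; the endpoint shown is Ollama's default local address, and the model name is only a placeholder for whichever model you have pulled:

```json
{
  "Ollama": {
    "Endpoint": "http://localhost:11434",
    "Model": "llama2"
  }
}
```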

Setup with KernelBuilder

The KernelBuilder extensions allow for fluent configuration and registration of services. Here’s how you can wire up Ollama services using KernelBuilder:

var builder = WebApplication.CreateBuilder(args);

// Add Kernel services
builder.Services.AddKernel()
    .AddOllamaTextGeneration(
        builder.Configuration["Ollama:Endpoint"],
        builder.Configuration["Ollama:Model"])
    .AddOllamaChatCompletion(
        builder.Configuration["Ollama:Endpoint"],
        builder.Configuration["Ollama:Model"])
    .AddOllamaTextEmbeddingGeneration(
        builder.Configuration["Ollama:Endpoint"],
        builder.Configuration["Ollama:Model"]);
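Once wired up, the kernel registered by AddKernel can be resolved from the built application's service provider; the `kernel` variable used in the examples further below comes from a resolution like this sketch:

```csharp
var app = builder.Build();

// Resolve the Kernel instance registered by AddKernel.
var kernel = app.Services.GetRequiredService<Kernel>();

// Individual Ollama-backed services are then available from the kernel.
var chat = kernel.GetRequiredService<IChatCompletionService>();
```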

Setup with ServiceCollection

Alternatively, if you're configuring services directly through IServiceCollection, here's how you can add Ollama services:

var builder = WebApplication.CreateBuilder(args);

// Configure Ollama services directly
builder.Services
    .AddOllamaTextGeneration(
        builder.Configuration["Ollama:Endpoint"],
        builder.Configuration["Ollama:Model"])
    .AddOllamaChatCompletion(
        builder.Configuration["Ollama:Endpoint"],
        builder.Configuration["Ollama:Model"])
    .AddOllamaTextEmbeddingGeneration(
        builder.Configuration["Ollama:Endpoint"],
        builder.Configuration["Ollama:Model"]);

Examples

Using the OllamaChatCompletionService

To utilize the OllamaChatCompletionService for chat completions, you can initialize it with an OllamaApiClient, or with a custom API endpoint and a model ID. Below is an example of how to use the service to handle chat sessions.

var chatCompletionService = kernel.GetRequiredService<IChatCompletionService>();
var history = new ChatHistory();

// Add the needed messages to the current chat history
history.AddSystemMessage("...");
history.AddUserMessage(input);

var chatResponse = await chatCompletionService.GetChatMessageContentsAsync(history);
Console.WriteLine(chatResponse[^1].Content);
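IChatCompletionService also defines a streaming variant, GetStreamingChatMessageContentsAsync, which yields the reply incrementally instead of as one message. Assuming the Ollama connector implements it, usage would look like this sketch (reusing the `kernel` and `history` from the example above):

```csharp
var chatService = kernel.GetRequiredService<IChatCompletionService>();

// Each chunk carries a fragment of the assistant's message as it is produced.
await foreach (var chunk in chatService.GetStreamingChatMessageContentsAsync(history))
{
    Console.Write(chunk.Content);
}
```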

Using the OllamaTextGenerationService

The OllamaTextGenerationService offers text generation capabilities using a specified model. This service can be initialized with an OllamaApiClient, or with a custom API endpoint and a model ID. Below is an example of how to use the service for text generation.

var textGen = kernel.GetRequiredService<ITextGenerationService>();
var response = await textGen.GetTextContentsAsync("The weather in January in Denmark is usually ");
Console.WriteLine(response[^1].Text);

Using the OllamaTextEmbeddingGenerationService

The OllamaTextEmbeddingGenerationService provides functionality to generate text embeddings. This service can be initialized with an OllamaApiClient, or with a custom endpoint and a model ID. Below is an example of how to use the service to generate embeddings.

#pragma warning disable SKEXP0001
var embeddingGenerationService = kernel.GetRequiredService<ITextEmbeddingGenerationService>();
#pragma warning restore SKEXP0001

List<string> texts = ["Hello"];

var embeddings = await embeddingGenerationService.GenerateEmbeddingsAsync(texts);
Console.WriteLine($"Embeddings count: {embeddings.Count}");
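GenerateEmbeddingsAsync returns one ReadOnlyMemory&lt;float&gt; per input string, so the length of each entry is the embedding dimension (which depends on the configured model). A small sketch inspecting the result:

```csharp
foreach (var embedding in embeddings)
{
    // embedding is a ReadOnlyMemory<float>; Length is the model's embedding dimension.
    Console.WriteLine($"Embedding dimension: {embedding.Length}");
}
```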

Requirements

How to contribute

Contribution Guidelines

Coding Guidelines

Compatible and additional computed target framework versions:
.NET: net8.0 is compatible. net8.0-android, net8.0-browser, net8.0-ios, net8.0-maccatalyst, net8.0-macos, net8.0-tvos, and net8.0-windows were computed.

NuGet packages

This package is not used by any NuGet packages.

GitHub repositories

This package is not used by any popular GitHub repositories.

Version Downloads Last updated
1.0.35 96 4/30/2024
1.0.33 88 4/30/2024