Encamina.Enmarcha.SemanticKernel.Plugins.Chat 8.1.3-preview-07


Semantic Kernel - Chat Plugin


Chat Plugin is a project that provides chat functionality in the form of a Semantic Kernel plugin. It lets users hold a conversation with, and ask questions of, an Artificial Intelligence, usually a Large Language Model (LLM). Additionally, it stores the conversation history.

Setup

Nuget package

First, install NuGet. Then, install Encamina.Enmarcha.SemanticKernel.Plugins.Chat from the package manager console:

PM> Install-Package Encamina.Enmarcha.SemanticKernel.Plugins.Chat

.NET CLI:

First, install .NET CLI. Then, install Encamina.Enmarcha.SemanticKernel.Plugins.Chat from the .NET CLI:

dotnet add package Encamina.Enmarcha.SemanticKernel.Plugins.Chat

How to use

To use ChatWithHistoryPlugin, the usual approach is to import it as a plugin within Semantic Kernel. The simplest way to do this is with the extension method ImportChatWithHistoryPluginUsingCosmosDb, which handles importing the plugin into Semantic Kernel. However, some configuration is required before importing it. First, you need to add the SemanticKernelOptions and ChatWithHistoryPluginOptions to your project configuration. You can do this with any configuration provider. The following code shows what the settings might look like in an appsettings.json file:

  {
    // ...
    "SemanticKernelOptions": {
        "ChatModelName": "gpt-35-turbo", // Name (sort of a unique identifier) of the model to use for chat
        "ChatModelDeploymentName": "gpt-35-turbo", // Model deployment name on the LLM (for example OpenAI) to use for chat
        "EmbeddingsModelName": "text-embedding-ada-002", // Name (sort of a unique identifier) of the model to use for embeddings
        "EmbeddingsModelDeploymentName": "text-embedding-ada-002", // Model deployment name on the LLM (for example OpenAI) to use for embeddings
        "Endpoint": "https://your-url.openai.azure.com/", // Uri for an LLM resource (like OpenAI). This should include protocol and hostname.
        "Key": "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx" // Key credential used to authenticate to an LLM resource
    },
    "ChatWithHistoryPluginOptions": {
        "HistoryMaxMessages": 10, // Maximum number of messages to keep in the chat history
        "ChatRequestSettings": {
            "MaxTokens": 1000, // Maximum number of tokens to generate in the completion
            "Temperature": 0.8, // Controls the randomness of the completion. The higher the temperature, the more random the completion
            "TopP": 0.5 // Controls the diversity of the completion. The higher the TopP, the more diverse the completion.
        }
    },
    // ...
  }

Next, in Program.cs or a similar entry point file in your project, add the following code.

// Entry point
var builder = WebApplication.CreateBuilder(new WebApplicationOptions
{
   // ...
});

// ...

// Or others configuration providers...
builder.Configuration.AddJsonFile("appsettings.json", optional: true, reloadOnChange: true);

// Requires Encamina.Enmarcha.SemanticKernel.Abstractions nuget
builder.Services.AddOptions<SemanticKernelOptions>().Bind(builder.Configuration.GetSection(nameof(SemanticKernelOptions)))
    .ValidateDataAnnotations()
    .ValidateOnStart();
builder.Services.AddOptions<ChatWithHistoryPluginOptions>().Bind(builder.Configuration.GetSection(nameof(ChatWithHistoryPluginOptions)))
    .ValidateDataAnnotations()
    .ValidateOnStart();

// Requires Encamina.Enmarcha.Data.Cosmos
builder.Services.AddCosmos(builder.Configuration);

builder.Services.AddScoped(sp =>
{
    var kernel = new KernelBuilder()
        .WithAzureChatCompletionService("<YOUR DEPLOYMENT NAME>", "<YOUR AZURE ENDPOINT>", "<YOUR API KEY>")
        //.WithOpenAIChatCompletionService("<YOUR MODEL ID>", "<YOUR API KEY>")
        // ...
        .Build();

    // ...

    string cosmosContainer = "cosmosDbContainer"; // You probably want to save this in the appsettings or similar

    kernel.ImportChatWithHistoryPluginUsingCosmosDb(sp, cosmosContainer, ILengthFunctions.LengthByTokenCount);

    return kernel;
});

Now you can inject the kernel via constructor, and the chat capabilities are already available.

public class MyClass
{
    private readonly Kernel kernel;

    public MyClass(Kernel kernel)
    {
        this.kernel = kernel;
    }

    public async Task TestChatAsync()
    {
        var contextVariables = new ContextVariables();
        contextVariables.Set(PluginsInfo.ChatWithHistoryPlugin.Functions.Chat.Parameters.Ask, "What is the weather like in Madrid?");
        contextVariables.Set(PluginsInfo.ChatWithHistoryPlugin.Functions.Chat.Parameters.UserId, "123456");
        contextVariables.Set(PluginsInfo.ChatWithHistoryPlugin.Functions.Chat.Parameters.UserName, "John Doe");
        contextVariables.Set(PluginsInfo.ChatWithHistoryPlugin.Functions.Chat.Parameters.Locale, "en");

        var functionChat = kernel.Func(PluginsInfo.ChatWithHistoryPlugin.Name, PluginsInfo.ChatWithHistoryPlugin.Functions.Chat.Name);

        var resultContext = await kernel.RunAsync(contextVariables, functionChat);

        // The assistant's reply is available in the resulting context (resultContext.Result).
    }
}

Advanced configurations

Take into consideration that the above code uses a Cosmos DB implementation of IAsyncRepository as an example. You can use other implementations.

If you want to disable chat history, simply set HistoryMaxMessages in ChatWithHistoryPluginOptions to 0.
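For example, in the appsettings.json shown earlier, disabling history only requires changing that one value:

```json
{
    "ChatWithHistoryPluginOptions": {
        "HistoryMaxMessages": 0, // 0 disables the chat history
        "ChatRequestSettings": {
            "MaxTokens": 1000,
            "Temperature": 0.8,
            "TopP": 0.5
        }
    }
}
```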

You can also inherit from the ChatWithHistoryPlugin class and add the customizations you need.

public class MyCustomChatWithHistoryPlugin : ChatWithHistoryPlugin
{
    public MyCustomChatWithHistoryPlugin(Kernel kernel, string chatModelName, Func<string, int> tokensLengthFunction, IAsyncRepository<ChatMessageHistoryRecord> chatMessagesHistoryRepository, IOptionsMonitor<ChatWithHistoryPluginOptions> options)
        : base(kernel, chatModelName, tokensLengthFunction, chatMessagesHistoryRepository, options)
    {
    }

    protected override string SystemPrompt => "You are a Virtual Assistant who only talks about the weather.";

    // There are more overridable methods/properties
}
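A custom plugin like the one above cannot be imported with the Cosmos DB extension method, so it has to be registered manually. The following is only a sketch: it assumes the repository and options monitor are already registered in the dependency injection container (hypothetical registrations not shown), and the exact import method name may differ across Semantic Kernel versions.

```csharp
builder.Services.AddScoped(sp =>
{
    var kernel = new KernelBuilder()
        .WithAzureChatCompletionService("<YOUR DEPLOYMENT NAME>", "<YOUR AZURE ENDPOINT>", "<YOUR API KEY>")
        .Build();

    // Assumed to be registered elsewhere in the container (hypothetical).
    var chatMessagesHistoryRepository = sp.GetRequiredService<IAsyncRepository<ChatMessageHistoryRecord>>();
    var options = sp.GetRequiredService<IOptionsMonitor<ChatWithHistoryPluginOptions>>();

    var customPlugin = new MyCustomChatWithHistoryPlugin(
        kernel,
        "gpt-35-turbo", // Chat model name, matching SemanticKernelOptions
        ILengthFunctions.LengthByTokenCount,
        chatMessagesHistoryRepository,
        options);

    // Import method name depends on the Semantic Kernel version in use.
    kernel.ImportFunctions(customPlugin, PluginsInfo.ChatWithHistoryPlugin.Name);

    return kernel;
});
```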
Compatible target frameworks: .NET 8.0 (net8.0). Computed compatible frameworks: net8.0-android, net8.0-browser, net8.0-ios, net8.0-maccatalyst, net8.0-macos, net8.0-tvos, net8.0-windows.

