OnnxStack.Core 0.39.0

.NET CLI:
dotnet add package OnnxStack.Core --version 0.39.0

Package Manager:
NuGet\Install-Package OnnxStack.Core -Version 0.39.0
This command is intended to be used within the Package Manager Console in Visual Studio, as it uses the NuGet module's version of Install-Package.

PackageReference:
<PackageReference Include="OnnxStack.Core" Version="0.39.0" />
For projects that support PackageReference, copy this XML node into the project file to reference the package.

Paket CLI:
paket add OnnxStack.Core --version 0.39.0

Script & Interactive:
#r "nuget: OnnxStack.Core, 0.39.0"
The #r directive can be used in F# Interactive and Polyglot Notebooks. Copy this into the interactive tool or source code of the script to reference the package.

Cake:
// Install OnnxStack.Core as a Cake Addin
#addin nuget:?package=OnnxStack.Core&version=0.39.0

// Install OnnxStack.Core as a Cake Tool
#tool nuget:?package=OnnxStack.Core&version=0.39.0

OnnxStack.Core - Onnx Services for .NET Applications

OnnxStack.Core is a library that provides simplified wrappers for OnnxRuntime.

Getting Started

OnnxStack.Core is available via the NuGet package manager; download and install it with:

PM> Install-Package OnnxStack.Core

Dependencies

Video processing support requires the FFmpeg and FFprobe binaries. The files must be present in your output folder, or their locations must be configured at runtime.

https://ffbinaries.com/downloads
https://github.com/ffbinaries/ffbinaries-prebuilt/releases/download/v6.1/ffmpeg-6.1-win-64.zip
https://github.com/ffbinaries/ffbinaries-prebuilt/releases/download/v6.1/ffprobe-6.1-win-64.zip
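
If you ship the binaries alongside your application, a simple startup check such as the sketch below can catch a missing FFmpeg install early. This is a minimal example, not part of the OnnxStack API; the Windows file names (ffmpeg.exe, ffprobe.exe) and the output-folder location are assumptions.

using System;
using System.IO;

static class FFmpegDependencyCheck
{
    // Verify the FFmpeg/FFprobe binaries sit next to the application executable
    public static void EnsureBinariesPresent()
    {
        foreach (var binary in new[] { "ffmpeg.exe", "ffprobe.exe" })
        {
            var path = Path.Combine(AppContext.BaseDirectory, binary);
            if (!File.Exists(path))
                throw new FileNotFoundException(
                    $"Required binary '{binary}' was not found in the output folder.", path);
        }
    }
}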

OnnxModelSession Example


// CLIP Tokenizer Example
//----------------------//

// Model Configuration
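// DeviceId is the device index (e.g. GPU 0) used by the selected execution provider,
// thread counts of 0 fall back to the OnnxRuntime defaults, and ExecutionProvider
// selects which OnnxRuntime execution provider runs the model.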
var config = new OnnxModelConfig
{
    DeviceId = 0,
    InterOpNumThreads = 0,
    IntraOpNumThreads = 0,
    ExecutionMode = ExecutionMode.ORT_SEQUENTIAL,
    ExecutionProvider = ExecutionProvider.DirectML,
    OnnxModelPath = "cliptokenizer.onnx"
};

// Create Model Session
var modelSession = new OnnxModelSession(config);

// Get Metadata
var modelMetadata = await modelSession.GetMetadataAsync();

// Create Input Tensor
var text = "Text To Tokenize";
var inputTensor = new DenseTensor<string>(new string[] { text }, new int[] { 1 });

// Create Inference Parameters
using (var inferenceParameters = new OnnxInferenceParameters(modelMetadata))
{
    // Set Inputs and Outputs
    inferenceParameters.AddInputTensor(inputTensor);
    inferenceParameters.AddOutputBuffer();

    // Run Inference
    using (var results = modelSession.RunInference(inferenceParameters))
    {
        // Extract Result Tokens
        var resultData = results[0].ToArray<long>();
    }
}
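
The same session pattern works for any ONNX model, not just the CLIP tokenizer. The sketch below reuses the calls shown above (GetMetadataAsync, AddInputTensor, AddOutputBuffer, RunInference) with a float input tensor; the model path and the 1x3x512x512 input shape are placeholders for illustration only.

// Hypothetical image model - substitute your own path and input shape
var imageConfig = new OnnxModelConfig
{
    DeviceId = 0,
    ExecutionProvider = ExecutionProvider.DirectML,
    OnnxModelPath = "model.onnx"
};

// Create Model Session
var imageSession = new OnnxModelSession(imageConfig);
var imageMetadata = await imageSession.GetMetadataAsync();

// Create Input Tensor (filled with your preprocessed data)
var imageTensor = new DenseTensor<float>(new[] { 1, 3, 512, 512 });

using (var parameters = new OnnxInferenceParameters(imageMetadata))
{
    // Set Inputs and Outputs
    parameters.AddInputTensor(imageTensor);
    parameters.AddOutputBuffer();

    // Run Inference
    using (var results = imageSession.RunInference(parameters))
    {
        var output = results[0].ToArray<float>();
    }
}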

Compatible and additional computed target framework versions
.NET: net8.0 is compatible. net8.0-android, net8.0-browser, net8.0-ios, net8.0-maccatalyst, net8.0-macos, net8.0-tvos, and net8.0-windows were computed.
Learn more about Target Frameworks and .NET Standard.

NuGet packages (3)

Showing the top 3 NuGet packages that depend on OnnxStack.Core:

OnnxStack.StableDiffusion - Stable Diffusion Library for .NET
OnnxStack.ImageUpscaler - OnnxRuntime Image Upscale Library for .NET
OnnxStack.FeatureExtractor - OnnxRuntime Image Feature Extractor Library for .NET

GitHub repositories (1)

Showing the top GitHub repository that depends on OnnxStack.Core:

TensorStack-AI/OnnxStack - C# Stable Diffusion using ONNX Runtime

Version Downloads Last updated
0.39.0 234 6/12/2024
0.31.0 263 4/25/2024
0.27.0 179 3/31/2024
0.25.0 171 3/14/2024
0.23.0 173 2/29/2024
0.22.0 128 2/23/2024
0.21.0 151 2/15/2024
0.19.0 145 2/1/2024
0.17.0 179 1/18/2024
0.16.0 129 1/11/2024
0.15.0 199 1/5/2024
0.14.0 153 12/27/2023
0.13.0 127 12/22/2023
0.12.0 136 12/15/2023
0.10.0 163 11/30/2023
0.9.0 140 11/23/2023
0.8.0 188 11/16/2023
0.7.0 139 11/9/2023
0.6.0 123 11/2/2023
0.5.0 131 10/27/2023
0.4.0 112 10/19/2023
0.3.1 133 10/9/2023
0.3.0 109 10/9/2023
0.2.0 114 10/3/2023
0.1.0 159 9/25/2023