DevKen.YoloPredictor
22.11.322.18
dotnet add package DevKen.YoloPredictor --version 22.11.322.18
NuGet\Install-Package DevKen.YoloPredictor -Version 22.11.322.18
<PackageReference Include="DevKen.YoloPredictor" Version="22.11.322.18" />
paket add DevKen.YoloPredictor --version 22.11.322.18
#r "nuget: DevKen.YoloPredictor, 22.11.322.18"
// Install DevKen.YoloPredictor as a Cake Addin
#addin nuget:?package=DevKen.YoloPredictor&version=22.11.322.18

// Install DevKen.YoloPredictor as a Cake Tool
#tool nuget:?package=DevKen.YoloPredictor&version=22.11.322.18
YoloPredictorMLDotNet
This project provides the following packages:
- DevKen.YoloPredictor
- DevKen.YoloPredictor.Yolov5
- DevKen.YoloPredictor.OpenCvBridge
What is this?
This project is designed to make YOLO integration with .NET fast, easy and convenient. Programmers don't have to understand the details of YOLO or ML: feed the predictor a trained model and images, then receive the results.
Use in critical projects?
DO NOT do that. This project is still under development and comes with NO guarantee.
Post issues if any problems are found.
Usage example
Predict on Bitmap
//Create a predictor by providing the model path and a backend.
//Install the corresponding OnnxRuntime NuGet package.
//For example, you need Microsoft.ML.OnnxRuntime.Gpu for CUDA.
//If no backend is specified, Backend.Cpu is used, since it has the fewest external dependencies.
YoloPredictor predictor = new YoloPredictorV5(modulepath, backend: YoloPredictorV5.Backend.CUDA);
//Predict on a Bitmap, then apply the NMS and confidence filters.
var detresult = predictor.Predict((Bitmap)Bitmap.FromFile(picture)).NMSFilter().ConfidenceFilter();
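The shape of the filtered result is not documented on this page, so the snippet below is only a sketch: it assumes the result can be enumerated and simply prints each detection. Replace the loop body with accesses to whatever members (class, confidence, bounding box) the library's detection type actually exposes.
//Sketch only: detresult is assumed to be enumerable; its detection
//members are not documented here, so each entry is just printed.
foreach (var det in detresult)
{
    Console.WriteLine(det);
}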
Predict on Mat
Read frames from a camera with OpenCV and run prediction on each one.
//Create a predictor by providing the model path and a backend.
YoloPredictor predictor = new YoloPredictorV5(modulepath, backend: YoloPredictorV5.Backend.CUDA);
//Open the video device and allocate a Mat to receive frames.
VideoCapture vc = new VideoCapture(0);
Mat image = new Mat();
while (true)
{
    //If a frame was read successfully
    if (vc.Read(image))
    {
        //Run the detector on that frame, then apply NMSFilter and ConfidenceFilter.
        var detresult = predictor.Predict(image).NMSFilter().ConfidenceFilter();
        //** Do something with detresult here **
    }
}
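As a sketch of what "do something with detresult" could look like while debugging, the variant below shows each frame in a window and exits on a key press. It uses standard OpenCvSharp calls (Cv2.ImShow, Cv2.WaitKey); drawing the detections onto the frame is omitted because the detection type's members are not documented here.
//Variant of the loop above (sketch): show each frame and stop on 'q'.
using OpenCvSharp;

//predictor is created as shown above.
VideoCapture vc = new VideoCapture(0);
Mat image = new Mat();
while (vc.Read(image) && !image.Empty())
{
    var detresult = predictor.Predict(image).NMSFilter().ConfidenceFilter();
    //Display the raw frame; draw detresult onto it here if desired.
    Cv2.ImShow("preview", image);
    if (Cv2.WaitKey(1) == 'q') break;
}
Cv2.DestroyAllWindows();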
**If you encounter any problems with Backend.CUDA, try Backend.Cpu first.** It works in most cases. Once the CPU backend is verified, install the correct version of Microsoft.ML.OnnxRuntime.Gpu; you may also have to install CUDA and cuDNN first.
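One way to apply that advice in code is a simple fallback: try the CUDA backend and, if the predictor cannot be created (missing CUDA/cuDNN or the wrong OnnxRuntime package), fall back to the CPU backend. This is only a sketch; a broad Exception is caught because the exact exception type thrown on a failed backend initialization is not documented.
//Sketch: prefer CUDA, fall back to CPU if initialization fails.
YoloPredictor predictor;
try
{
    predictor = new YoloPredictorV5(modulepath, backend: YoloPredictorV5.Backend.CUDA);
}
catch (Exception e)
{
    Console.WriteLine($"CUDA backend unavailable ({e.Message}); falling back to CPU.");
    predictor = new YoloPredictorV5(modulepath, backend: YoloPredictorV5.Backend.Cpu);
}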
| Product | Compatible and additional computed target framework versions |
|---|---|
| .NET | net5.0 was computed. net5.0-windows was computed. net6.0 was computed. net6.0-android was computed. net6.0-ios was computed. net6.0-maccatalyst was computed. net6.0-macos was computed. net6.0-tvos was computed. net6.0-windows was computed. net7.0 was computed. net7.0-android was computed. net7.0-ios was computed. net7.0-maccatalyst was computed. net7.0-macos was computed. net7.0-tvos was computed. net7.0-windows was computed. net8.0 was computed. net8.0-android was computed. net8.0-browser was computed. net8.0-ios was computed. net8.0-maccatalyst was computed. net8.0-macos was computed. net8.0-tvos was computed. net8.0-windows was computed. |
| .NET Core | netcoreapp3.0 was computed. netcoreapp3.1 was computed. |
| .NET Standard | netstandard2.1 is compatible. |
| MonoAndroid | monoandroid was computed. |
| MonoMac | monomac was computed. |
| MonoTouch | monotouch was computed. |
| Tizen | tizen60 was computed. |
| Xamarin.iOS | xamarinios was computed. |
| Xamarin.Mac | xamarinmac was computed. |
| Xamarin.TVOS | xamarintvos was computed. |
| Xamarin.WatchOS | xamarinwatchos was computed. |
.NETStandard 2.1
- DevKen.BitmapExtentionsForOnnx (>= 22.11.321.4)
NuGet packages (2)
Showing the top 2 NuGet packages that depend on DevKen.YoloPredictor:
| Package | Downloads |
|---|---|
| DevKen.YoloPredictor.Yolov5 | |
| DevKen.YoloPredictor.OpenCvBridge | |
GitHub repositories
This package is not used by any popular GitHub repositories.
| Version | Downloads | Last updated |
|---|---|---|
| 22.11.322.18 | 726 | 11/18/2022 |
| 22.11.321.15 | 325 | 11/18/2022 |
| 22.11.319.2 | 675 | 11/16/2022 |