Convsharp 1.0.0
dotnet add package Convsharp --version 1.0.0
NuGet\Install-Package Convsharp -Version 1.0.0
<PackageReference Include="Convsharp" Version="1.0.0" />
paket add Convsharp --version 1.0.0
#r "nuget: Convsharp, 1.0.0"
// Install Convsharp as a Cake Addin
#addin nuget:?package=Convsharp&version=1.0.0
// Install Convsharp as a Cake Tool
#tool nuget:?package=Convsharp&version=1.0.0
A simple library for creating sequential models of convolutional neural networks. The library is intended for small problems, since it prioritizes readable code and modularity over performance.
Product | Compatible and additional computed target framework versions
---|---
.NET | net5.0, net5.0-windows, net6.0, net6.0-android, net6.0-ios, net6.0-maccatalyst, net6.0-macos, net6.0-tvos, net6.0-windows, net7.0, net7.0-android, net7.0-ios, net7.0-maccatalyst, net7.0-macos, net7.0-tvos, net7.0-windows, net8.0, net8.0-android, net8.0-browser, net8.0-ios, net8.0-maccatalyst, net8.0-macos, net8.0-tvos, net8.0-windows (all computed)
.NET Core | netcoreapp2.0 (compatible); netcoreapp2.1, netcoreapp2.2, netcoreapp3.0, netcoreapp3.1 (computed)
Dependencies
.NETCoreApp 2.0
- Veldrid.ImageSharp (>= 4.1.4)
NuGet packages
This package is not used by any NuGet packages.
GitHub repositories
This package is not used by any popular GitHub repositories.
Version | Downloads | Last updated
---|---|---
1.0.0 | 1,363 | 4/22/2018
First release of the library (a short usage sketch follows the list):
- 1D and 2D convolutional layers
- 1D and 2D max-pooling layers
- flatten layer
- linear layer
- activation layers (ReLU, Tanh, Sigmoid, Softmax)
- loss functions (binary cross-entropy, categorical cross-entropy, MSE)
- optimization algorithms (mini-batch SGD, Adam)
- regularization techniques: Dropout, L2 regularization
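To give a feel for how these building blocks compose into a sequential model, here is a minimal C# sketch. The class and method names below (SequentialModel, Conv2DLayer, Adam, and so on) are assumptions made for illustration only, not Convsharp's confirmed API; check the package's source repository for the actual types.

```csharp
// Hypothetical usage sketch only. Convsharp's real type and method names are
// not documented on this page, so every identifier below is an assumption.
using Convsharp; // assumed root namespace

class Example
{
    static void Main()
    {
        // Stack the layers listed in the release notes into a sequential model,
        // assuming a 28x28 single-channel input image.
        var model = new SequentialModel();
        model.Add(new Conv2DLayer(inChannels: 1, outChannels: 8, kernelSize: 3));
        model.Add(new ActivationLayer(Activation.ReLU));
        model.Add(new MaxPooling2DLayer(poolSize: 2));
        model.Add(new FlattenLayer());
        model.Add(new LinearLayer(inFeatures: 8 * 13 * 13, outFeatures: 10));
        model.Add(new ActivationLayer(Activation.Softmax));

        // Pick one of the listed loss functions and optimizers.
        var loss = new CategoricalCrossEntropy();
        var optimizer = new Adam(learningRate: 0.001);

        // trainX / trainY stand in for training data prepared elsewhere.
        // model.Fit(trainX, trainY, epochs: 10, batchSize: 32, loss, optimizer);
    }
}
```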