NightlyCode.Ai
0.4.13-preview
See the version list below for details.
dotnet add package NightlyCode.Ai --version 0.4.13-preview
NuGet\Install-Package NightlyCode.Ai -Version 0.4.13-preview
<PackageReference Include="NightlyCode.Ai" Version="0.4.13-preview" />
paket add NightlyCode.Ai --version 0.4.13-preview
#r "nuget: NightlyCode.Ai, 0.4.13-preview"
// Install NightlyCode.Ai as a Cake Addin
#addin nuget:?package=NightlyCode.Ai&version=0.4.13-preview&prerelease

// Install NightlyCode.Ai as a Cake Tool
#tool nuget:?package=NightlyCode.Ai&version=0.4.13-preview&prerelease
This is a preview package and, like every other preview release, is expected to introduce breaking changes.
NightlyCode.Ai
With this package you can build and train neural networks.
DynamicBinOp
The best working implementation in this package is currently the DynamicBinOp net, a self-growing net that trains by mutating its elements and generating new neurons from time to time.
Usage
The most basic way to use a DynamicBinOpNet is to create one manually:
DynamicBinOpNet net = new(new(["x"], ["y"]));
This creates a net with one input neuron "x" and one output neuron "y". While this is a working neural network, it doesn't do anything meaningful yet, as it contains no connections.
To get a useful neural network you first need to train a configuration.
Training
Training is done with a population which hopefully evolves to contain a working configuration at some point.
Population<DynamicBinOpConfiguration> population = new(100, rng => new(["x"], ["y"], rng));
Then a setup is needed which contains the training samples used to train the configurations.
EvolutionSetup<DynamicBinOpConfiguration> setup = new() {
    TrainingSet = [...],
    Runs = 5000,
    Threads = 2
};
The TrainingSet property is a collection of test cases, each of which returns a deviation value for a given configuration. Typically a test case feeds a neural network with the provided configuration, fills in the net's input values and computes the results. A good return value is then the average or summed deviation of the computed results from the expected values.
A good template for such a test function for the case above would be:
using System.Collections.Concurrent;

ConcurrentStack<DynamicBinOpNet> netStack = new();

float Test(DynamicBinOpConfiguration config, float x, float expected) {
    // reuse a pooled net if one is available, otherwise create a new one
    if (!netStack.TryPop(out DynamicBinOpNet net))
        net = new(config);
    else net.Update(config);

    net["x"] = x;
    net.Compute();
    float result = net["y"];

    // return the net to the pool so it can be reused by later test cases
    netStack.Push(net);
    return Math.Abs(expected - result);
}
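How the test function is wired into the TrainingSet depends on the sample type the package expects, which the snippet above leaves open. Purely as a hypothetical sketch, assuming a training-set entry is a delegate that maps a configuration to its deviation, the wiring could look roughly like this (the entry type and sample values are illustrative assumptions, not the package's documented API):

// hypothetical samples for learning y = 2 * x; the entry type is an assumption
setup.TrainingSet = [
    config => Test(config, 1f, 2f),  // x = 1, expected y = 2
    config => Test(config, 2f, 4f),  // x = 2, expected y = 4
    config => Test(config, 3f, 6f)   // x = 3, expected y = 6
];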
Once the setup has been created with its samples, the population can be trained with
Tuple<DynamicBinOpConfiguration, double> result = population.Train(setup);
After the configured number of runs has completed or a threshold is reached, the population returns the best configuration it found for the training set. This configuration can then be used to feed a neural network and compute values.
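For example, a minimal sketch of feeding the resulting configuration into a net (the configuration is the first component of the returned tuple; the input value here is just an illustration):

// build a net from the best configuration found during training
DynamicBinOpNet trained = new(result.Item1);

// set the input value, run the net and read the output
trained["x"] = 3.0f;
trained.Compute();
float y = trained["y"];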
Product | Compatible and additional computed target framework versions |
---|---|
.NET | net8.0 is compatible. net8.0-android, net8.0-browser, net8.0-ios, net8.0-maccatalyst, net8.0-macos, net8.0-tvos and net8.0-windows were computed. |
Dependencies (net8.0)
- No dependencies.
NuGet packages
This package is not used by any NuGet packages.
GitHub repositories
This package is not used by any popular GitHub repositories.
Version | Downloads | Last updated |
---|---|---|
0.5.3-preview | 87 | 10/8/2024 |
0.5.2-preview | 59 | 10/8/2024 |
0.5.1-preview | 49 | 10/7/2024 |
0.5.0-preview | 47 | 10/7/2024 |
0.4.14-preview | 66 | 10/5/2024 |
0.4.13-preview | 58 | 10/5/2024 |
0.4.12-preview | 65 | 10/4/2024 |
0.4.11-preview | 64 | 10/4/2024 |
0.4.10-preview | 70 | 10/3/2024 |
0.4.9-preview | 56 | 10/2/2024 |
0.4.8-preview | 51 | 10/2/2024 |
0.4.7-preview | 61 | 10/2/2024 |
0.4.6-preview | 59 | 10/1/2024 |
0.4.5-preview | 53 | 10/1/2024 |
0.4.4-preview | 60 | 10/1/2024 |
0.4.3-preview | 56 | 9/30/2024 |
0.4.1-preview | 62 | 9/29/2024 |
0.4.0-preview | 57 | 9/29/2024 |
0.3.2-preview | 59 | 9/24/2024 |
0.3.1-preview | 126 | 5/17/2024 |
0.3.0-preview | 144 | 2/23/2024 |
0.2.10-preview | 62 | 2/16/2024 |
0.2.7-preview | 74 | 2/15/2024 |
0.2.6-preview | 70 | 2/14/2024 |
0.2.5-preview | 58 | 2/14/2024 |
0.2.4-preview | 68 | 2/14/2024 |
0.2.3-preview | 65 | 2/14/2024 |
0.2.2-preview | 75 | 2/13/2024 |