NullGC.Allocators
dotnet add package NullGC.Allocators --version 0.4.0
NuGet\Install-Package NullGC.Allocators -Version 0.4.0
<PackageReference Include="NullGC.Allocators" Version="0.4.0" />
paket add NullGC.Allocators --version 0.4.0
#r "nuget: NullGC.Allocators, 0.4.0"
// Install NullGC.Allocators as a Cake Addin
#addin nuget:?package=NullGC.Allocators&version=0.4.0

// Install NullGC.Allocators as a Cake Tool
#tool nuget:?package=NullGC.Allocators&version=0.4.0
NullGC
High-performance unmanaged memory allocator, collection types, and LINQ provider for .NET Core. Especially suitable for game development, since no latency jitter is introduced by .NET garbage collection activity. Benchmark Results (auto-updated by GitHub Actions)
Motivation
This project was born mostly out of my curiosity about how far one can go toward entirely eliminating garbage collection; it also emerged as a side project of an ongoing game engine development. Although the .NET background GC is already good at hiding GC pauses, some still remain. For throughput-focused scenarios, there may also be a huge performance potential when the GC is completely out of the equation, judging from my previous experience with realtime data processing.
Usage
Currently this project contains 3 components:
- Unmanaged memory allocator
- Value type (struct) only collections
- Linq operators
Setup
Install the NuGet packages NullGC.Allocators and NullGC.Linq.
Set up the AllocatorContext:
AllocatorContext.SetImplementation(new DefaultAllocatorContextImpl().ConfigureDefault());
The allocator context is used internally by ValueArray<T> and by any code that needs to allocate unmanaged memory.
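For example, a minimal program could look like the sketch below (the namespaces and the Add() call are assumptions based on the package names and the usual list API; the collection and allocation-scope APIs are covered in the following sections):

using NullGC.Allocators;   // assumed namespace, matching the package name
using NullGC.Collections;  // assumed namespace, matching the package name

// Configure the allocator context once at program startup,
// before any unmanaged allocation happens.
AllocatorContext.SetImplementation(new DefaultAllocatorContextImpl().ConfigureDefault());

using (AllocatorContext.BeginAllocationScope())
{
    var list = new ValueList<int>(); // backed by unmanaged memory
    list.Add(42);                    // assumed to follow the standard List<T>-like API
} // unmanaged memory is reclaimed when the scope ends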
Custom collection types
The following types all use unmanaged memory as their internal state store.
- ValueArray<T>
- ValueList<T>
- ValueStack<T>
- ValueQueue<T>
- ValueDictionary<TKey, TValue>
- ValueLinkedList<T>
- ValueFixedSizeDeque<T> (Circular buffer)
- SlidingWindow<T>
- SlidingTimeWindow<T>
*All collection types can be enumerated by ref (foreach (ref var item in collection)).
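For example, ref enumeration lets you mutate elements in place without copying them (a sketch; the Particle struct and the way the list is populated are illustrative only):

struct Particle { public float X, Y, Z; }

using (AllocatorContext.BeginAllocationScope())
{
    var particles = new ValueList<Particle>();
    // ... populate 'particles' ...

    // 'p' is a ref to the element stored in unmanaged memory,
    // so the struct is never copied and can be mutated in place.
    foreach (ref var p in particles)
    {
        p.X += 1.0f;
    }
}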
Memory allocation strategies
Two memory allocation strategies are supported:
1. Arena
// every 'T' below is a struct unless specifically mentioned otherwise.
using (AllocatorContext.BeginAllocationScope())
{
// plain old 'new'
var list = new ValueList<T>();
var dict = new ValueDictionary<TKey, TValue>();
// let struct T work like a class (T is allocated on the unmanaged heap.)
var obj = new Allocated<T>();
...
} // all value objects are automatically disposed as they go out of scope,
// no need to explicitly call Dispose().
2. Explicit lifetime
You can use the unmanaged allocator anywhere like this (including inside an arena scope):
// Use the overload with the 'AllocatorTypes' parameter and specify the unscoped, globally available allocator type.
var list = new ValueList<T>(AllocatorTypes.DefaultUnscoped);
var obj = new Allocated<T>(AllocatorTypes.DefaultUnscoped);
...
// Anywhere after usage:
list.Dispose();
obj.Dispose();
// As long as 'list' is the original instance, not a copy, it can be disposed safely multiple times.
list.Dispose(); // ok.
Double-free problem when using explicit lifetime
First of all, collections with an unscoped allocator type need Dispose() to be called to free the unmanaged memory allocated inside. A typical Dispose() implementation works like this: if the unmanaged pointer is NULL, do nothing; if it is not NULL, pass it to the native free() function and reset it to NULL. This seems to prevent double-frees, but does it?
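A simplified sketch of that pattern (hypothetical type and field names, not NullGC's actual code) is shown below; the catch is that every by-value copy of the struct carries its own copy of the pointer:

using System;
using System.Runtime.InteropServices;

public struct NaiveValueCollection : IDisposable
{
    private IntPtr _ptr; // pointer to the unmanaged buffer

    public void Dispose()
    {
        if (_ptr == IntPtr.Zero) return; // already disposed -- but only in THIS copy
        Marshal.FreeHGlobal(_ptr);       // free the unmanaged memory
        _ptr = IntPtr.Zero;              // resets the pointer of THIS copy only;
                                         // other by-value copies still hold the stale pointer
    }
}

Now consider what happens when such a struct is copied: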
struct BadGuy : IDisposable {
private ValueList<T> _lst;
public BadGuy(ValueList<T> lst){
_lst = lst; // '_lst' is a copy of 'list' below.
}
public void Dispose() {
_lst.Dispose();
// '_lst' is now in the disposed state, but 'list' below is not.
}
}
var list = new ValueList<T>(AllocatorTypes.DefaultUnscoped);
using (var badGuy = new BadGuy(list)) {
...
} // 'badGuy' is dead, however..
...
list.Dispose(); // Since 'list' is NOT in the disposed state, this will cause the double-free.
So, to prevent double-frees when these value collections are passed by value, Borrow() (from the interface ISingleDisposable<TSelf>) should be used.
void SomeMethod(ValueList<T> lst){ // 'lst' is a copy of 'list' below.
lst.Dispose(); // Does nothing.
}
var list = new ValueList<T>(AllocatorTypes.DefaultUnscoped);
SomeMethod(list.Borrow()); // The borrowed one's Dispose() is a no-op.
list.Dispose(); // Ok.
Interop with managed objects
If you have to use a managed object (i.e. a class) inside a struct, you can use Pinned<T> to pin the object down so that its address is fixed and can be stored in a non-GC-rooted place.
*Since .NET 5 there is a dedicated heap for pinned objects called the POH (Pinned Object Heap), so the performance impact is quite low.
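Pinned<T> itself is not shown here; purely as an illustration of the general idea using plain BCL APIs (this is not NullGC's actual implementation, and unlike Pinned<T> it only keeps the object alive and reachable through a stable handle rather than fixing its address), a managed object can be referenced from an unmanaged struct via a GCHandle stored as an IntPtr:

using System;
using System.Runtime.InteropServices;

class ManagedThing { public int Value; }

// All fields are unmanaged, so this struct can itself live in unmanaged memory,
// yet it can still reach a managed object through the handle.
struct ManagedRef : IDisposable
{
    private IntPtr _handle;

    public ManagedRef(ManagedThing obj) =>
        _handle = GCHandle.ToIntPtr(GCHandle.Alloc(obj));

    public ManagedThing Target =>
        (ManagedThing)GCHandle.FromIntPtr(_handle).Target!;

    public void Dispose() => GCHandle.FromIntPtr(_handle).Free();
}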
Linq
The fastest LINQ provider as of this writing (2024.1). Benchmark Results (compared with the built-in LINQ, LinqGen, RefLinq and HyperLinq)
The extreme performance boils down to:
- Minimize struct copying through aggressive inlining and use of the ref modifier.
- No boxing (except for some cases of interface casting that cannot be optimized away).
- Exploit the traits of the previous stage as much as possible (e.g. if the stage before OrderBy is IAddressFixed, we can store a pointer instead of the whole struct).
The intended usage is with the built-in value-typed collections, but good old IEnumerable<T> is also supported; you can still get some benefit from LINQ operators that need to buffer data, such as OrderBy. The LINQ interface has 2 variations:
SomeCollection.LinqValue()... // Enumerate by value. All types implementing IEnumerable<T> are supported.
SomeCollection.LinqRef()... // Enumerate by ref. Besides the value collections, only collections whose Enumerator implements `ILinqRefEnumerator<T>` are supported (e.g. normal array types).
Most extension methods that take a delegate-type parameter have overloads with the in or ref modifier to avoid copying too much data when the LINQed type is a big struct.
// T is something BIG.
...Where((in T x) => ...).Select((ref T x) => ref x.SomeField)... // all reference, no copy of T
Most extension methods have overloads with a TArg arg parameter to avoid unnecessary variable capture, thus avoiding the allocation of a capture object.
TArg someTArg;
...Select((in T x) => new SomeStruct(in x, someTArg))... // Every time this line executes, a new capture object for `someTArg` must be allocated on the managed heap.
...Select(static (in T x, TArg a) => new SomeStruct(in x, a), someTArg)... // No capture happens. ('static' is not mandatory, just an explicit declaration.)
Things to do
- More documentation.
- Larger test coverage.
- More collection types.
- More LINQ providers and a wider support range.
- Roslyn analyzer for enforcing struct lifetime/ownership. (The actual lifetime is not currently enforced; for example, an early dispose from the owner side or a mutation from the borrower side is still unpreventable. Static analysis with attribute markers should be the way to go.)
Thanks to
Emma Maassen from https://github.com/Enichan/Arenas
Angouri from https://github.com/asc-community/HonkPerf.NET
Details in THIRD-PARTY-NOTICES.md
How to contribute
Framework projects like this will not become generally useful without being battle-tested in the real world. If your project can potentially benefit from this library, feel free to submit an Issue and talk about your use case. Any type of contribution is welcome.
Product | Compatible and additional computed target framework versions
---|---
.NET | net8.0 is compatible. net8.0-android, net8.0-browser, net8.0-ios, net8.0-maccatalyst, net8.0-macos, net8.0-tvos and net8.0-windows were computed.
Dependencies (net8.0)
- NullGC.Abstractions (>= 0.4.0)
- NullGC.Collections (>= 0.4.0)
- NullGC.Linq (>= 0.4.0)