AsyncCache 1.0.39

Simple Async Cache implementation.

Package Manager:   Install-Package AsyncCache -Version 1.0.39
.NET CLI:          dotnet add package AsyncCache --version 1.0.39
PackageReference:  <PackageReference Include="AsyncCache" Version="1.0.39" />
                   (for projects that support PackageReference, copy this XML node into the project file to reference the package)
Paket CLI:         paket add AsyncCache --version 1.0.39

AsyncCache

AsyncCache uses an "async lock" (a semaphore) to prevent competing/simultaneous calls to the data source.
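
Roughly, the pattern that describes looks like the sketch below. This is only an illustration of the "async lock" idea, not the package's source: the class name, the Dictionary backing store, and the lack of expiry handling are simplifications made up for the sketch (the real AsyncCache also lets you control a TimeSpan, as noted under Usage).

using System;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;

// Illustration only (hypothetical, simplified): a SemaphoreSlim acts as the
// "async lock". Get calls are serialized, so while one caller is fetching a
// key, a competing caller waits and then finds the value already cached
// instead of hitting the data source again.
class NaiveAsyncCache
{
    private readonly SemaphoreSlim _lock = new SemaphoreSlim(1, 1);
    private readonly Dictionary<string, object> _values = new Dictionary<string, object>();

    public async Task<T> Get<T>(string key, Func<Task<T>> dataSource)
    {
        await _lock.WaitAsync();
        try
        {
            if (_values.TryGetValue(key, out var cached))
                return (T)cached;              // already cached: skip the data source

            var value = await dataSource();    // only the first caller per key gets here
            _values[key] = value;
            return value;
        }
        finally
        {
            _lock.Release();
        }
    }
}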

Usage

AsyncCache is not a singleton. If you want your cached key/values to have singleton scope (and in general), I recommend using a dependency injection container to instantiate AsyncCache.

The cached key/values are scoped to the AsyncCache instance so that you can control both the lifetime (TimeSpan) and the scope of your key/values.
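
For example, with Microsoft.Extensions.DependencyInjection (any container works; the package itself has no DI dependency), a singleton registration might look like the sketch below. The registration API shown is that container's, not AsyncCache's.

using Cache;
using Microsoft.Extensions.DependencyInjection;

// One shared AsyncCache for the whole application, so every consumer sees the
// same cached key/values. Registering it with a narrower lifetime (e.g. scoped)
// would instead give each scope its own set of cached values.
var services = new ServiceCollection();
services.AddSingleton<AsyncCache>();

var provider = services.BuildServiceProvider();
var cache = provider.GetRequiredService<AsyncCache>();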

Usage example note: don't use .Result on a task in real code; async functions should be called from async functions. This is just an example app to illustrate usage of AsyncCache. The example app is synchronous, so it is perhaps a bad example, but it should still illustrate the usage fine (an async version is sketched after the output below).

using Cache;
using System;
using System.Threading.Tasks;
        
namespace ConsoleApplication1
{
    class Program
    {
        static void Main(string[] args)
        {
            var random = new Random();
            var cache = new AsyncCache();
        
            Console.WriteLine("Not Cached:");
            for (var i = 0; i < 5; i++)
                Console.WriteLine(random.Next(0, 5));
        
            Console.WriteLine("\nCached:");
            for (var i = 0; i < 5; i++)
                Console.WriteLine(cache.Get(
                    key: "random integer",
                    dataSource: () => Task.FromResult(random.Next(0, 5))).Result);
        
            Console.ReadKey();
        }
    }
}

Example App Output:

    Not Cached:
    4
    2
    2
    4
    0
    
    Cached:
    2
    2
    2
    2
    2
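
Since the note above warns against .Result, here is a sketch of the same lookup awaited from an async method. The surrounding class and method names are made up for the example; the Get call itself matches the usage above.

using Cache;
using System;
using System.Threading.Tasks;

class AsyncExample
{
    private static readonly Random Random = new Random();
    private static readonly AsyncCache Cache = new AsyncCache();

    // Same lookup as the console example, but awaited instead of blocking on .Result.
    private static async Task<int> GetCachedRandomAsync()
    {
        return await Cache.Get(
            key: "random integer",
            dataSource: () => Task.FromResult(Random.Next(0, 5)));
    }

    private static async Task Main()
    {
        Console.WriteLine(await GetCachedRandomAsync());
    }
}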

Here is example usage from the unit tests. Did I mention how simple it is?

private AsyncCache _cache;

[TestInitialize]
public void Setup()
{
    _cache = new AsyncCache();
}

[TestMethod]
public async Task ShouldReturnResultFromDataSourceTask()
{
    //Arrange
    //Act
    int result = await _cache.Get("some key", () => Task.FromResult(2));

    //Assert
    result.Should().Be(2);
}
        
[TestMethod]
public async Task ShouldLockToPreventRaceCondition()
{
    var tcs1 = new TaskCompletionSource<int>();
    var tcs2 = new TaskCompletionSource<int>();

    int? result1 = null;
    int? result2 = null;
    var get1 = _cache.Get(key: "key1", dataSource: () => tcs1.Task).ContinueWith(t => result1 = t.Result);
    var get2 = _cache.Get(key: "key1", dataSource: () => tcs2.Task).ContinueWith(t => result2 = t.Result);

    tcs1.SetResult(1);
    tcs2.SetResult(2);

    await Task.WhenAll(get1, get2);

    result1.Should().Be(1, "because this is the initial value inserted into the cache.");
    result2.Should().Be(1, "because the previous/parallel request should've already inserted 1");
}

Dependencies

This package has no dependencies.

Version History

Version  Downloads  Last updated
1.0.39   2,456      7/1/2018
1.0.38   323        3/14/2018
1.0.37   508        10/11/2017
1.0.34   367        3/16/2017
1.0.33   511        11/16/2016
1.0.32   465        11/16/2016
1.0.31   439        11/16/2016
1.0.30   543        6/16/2016
1.0.28   411        3/19/2016
1.0.27   487        3/5/2016
1.0.26   493        3/5/2016
1.0.25   483        2/20/2016
1.0.24   478        2/20/2016
1.0.23   504        2/20/2016
1.0.22   491        2/20/2016
1.0.21   489        2/16/2016
1.0.20   501        2/16/2016
1.0.19   501        2/16/2016
1.0.18   493        2/14/2016
1.0.1    490        2/14/2016
1.0.0    375        2/13/2016