# ⚡ BitFaster.Caching
High performance, thread-safe in-memory caching primitives for .NET.
## Features
- ConcurrentLru, a lightweight pseudo-LRU based on the 2Q eviction policy, with optional time-based eviction.
- ConcurrentLfu, an approximate LFU based on the W-TinyLFU admission policy.
- Configurable atomic valueFactory to mitigate cache stampede.
- Configurable thread-safe wrappers for IDisposable cache values.
- A builder API to easily configure cache features (see the sketch after this list).
- SingletonCache for caching single instance values, such as lock objects.
- High performance concurrent counters.
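A minimal sketch of the builder API, assuming the ConcurrentLruBuilder type and the WithCapacity, WithAtomicGetOrAdd, and WithExpireAfterWrite methods (see the wiki for the authoritative surface):

```csharp
using System;
using BitFaster.Caching;
using BitFaster.Caching.Lru;

// Sketch: the builder method names below are assumptions drawn from the
// feature list above; consult the wiki for the exact API.
ICache<string, SomeItem> cache = new ConcurrentLruBuilder<string, SomeItem>()
    .WithCapacity(128)
    .WithAtomicGetOrAdd()                          // atomic valueFactory to mitigate cache stampede
    .WithExpireAfterWrite(TimeSpan.FromMinutes(5)) // time-based eviction
    .Build();
```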
## Documentation
Please refer to the wiki for full API documentation, and a complete analysis of hit rate, latency and throughput.
## Getting started
BitFaster.Caching is installed from NuGet:
```shell
dotnet add package BitFaster.Caching
```
## ConcurrentLru

ConcurrentLru is a lightweight drop-in replacement for ConcurrentDictionary, but with bounded size enforced by the TU-Q eviction policy (derived from 2Q). There are no background threads and no global locks; concurrent throughput is high, lookups are fast, and hit rate outperforms a pure LRU in all tested scenarios.
Choose a capacity and use just like ConcurrentDictionary, but with bounded size:
```csharp
int capacity = 128;
var lru = new ConcurrentLru<string, SomeItem>(capacity);

var value = lru.GetOrAdd("key", (key) => new SomeItem(key));
```
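Asynchronous value creation is also supported; a short sketch, assuming a GetOrAddAsync overload that accepts a Task-returning value factory:

```csharp
using System.Threading.Tasks;
using BitFaster.Caching.Lru;

// Sketch: assumes GetOrAddAsync exists with this shape; check the wiki.
var asyncLru = new ConcurrentLru<string, SomeItem>(capacity);

var value = await asyncLru.GetOrAddAsync("key", (key) => Task.FromResult(new SomeItem(key)));
```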
## ConcurrentLfu

ConcurrentLfu is a drop-in replacement for ConcurrentDictionary, but with bounded size enforced by the W-TinyLFU eviction policy. ConcurrentLfu has near-optimal hit rate and high scalability. Reads and writes are buffered, then replayed asynchronously to mitigate lock contention.
Choose a capacity and use just like ConcurrentDictionary, but with bounded size:
```csharp
int capacity = 128;
var lfu = new ConcurrentLfu<string, SomeItem>(capacity);

var value = lfu.GetOrAdd("key", (key) => new SomeItem(key));
```
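Because maintenance is buffered and replayed asynchronously, the scheduler that runs the replay can be chosen to suit the host. A hedged sketch, assuming a constructor overload taking an IScheduler from BitFaster.Caching.Scheduler (the default is reportedly a dedicated background thread); verify the exact signature against the wiki:

```csharp
using System;
using System.Collections.Generic;
using BitFaster.Caching.Lfu;
using BitFaster.Caching.Scheduler;

// Sketch only: this overload and ThreadPoolScheduler are assumptions;
// the shipped signature may differ.
var pooledLfu = new ConcurrentLfu<string, SomeItem>(
    concurrencyLevel: Environment.ProcessorCount,
    capacity: 128,
    scheduler: new ThreadPoolScheduler(),          // replay buffered operations on the thread pool
    comparer: EqualityComparer<string>.Default);
```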