Thread-safe LRU cache implementation
Hello,
I have a basic implementation of an LRU cache, but it's not thread-safe. There is a thread-safe dictionary implementation, but no thread-safe list. Currently I'm not using any locks.
There are two options:
1. Use ConcurrentDictionary, and take the lock only for operations on the list (the combined update isn't atomic, but could anything actually go wrong?).
2. Use a plain Dictionary, and take the lock around both the dictionary and the list operations (this is definitely atomic).
I kind of feel like the second option is better; what do you think?
Or is there a concurrent list implementation that I'm not aware of?
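For reference, here's a rough sketch of what I mean by option 2: a plain Dictionary pointing into a LinkedList, with a single lock taken around every operation so each get/put is atomic as a whole (the class and member names are just placeholders):

```csharp
using System.Collections.Generic;

// Rough sketch of option 2 (names are placeholders): one lock guards both
// the Dictionary and the LinkedList, so every operation is atomic as a whole.
public class LruCache<TKey, TValue>
{
    private readonly int _capacity;
    private readonly Dictionary<TKey, LinkedListNode<(TKey Key, TValue Value)>> _map;
    private readonly LinkedList<(TKey Key, TValue Value)> _order; // most recently used at the front
    private readonly object _sync = new object();

    public LruCache(int capacity)
    {
        _capacity = capacity;
        _map = new Dictionary<TKey, LinkedListNode<(TKey Key, TValue Value)>>(capacity);
        _order = new LinkedList<(TKey Key, TValue Value)>();
    }

    public bool TryGet(TKey key, out TValue value)
    {
        lock (_sync)
        {
            if (_map.TryGetValue(key, out var node))
            {
                // Move the accessed entry to the front (most recently used).
                _order.Remove(node);
                _order.AddFirst(node);
                value = node.Value.Value;
                return true;
            }

            value = default;
            return false;
        }
    }

    public void Put(TKey key, TValue value)
    {
        lock (_sync)
        {
            if (_map.TryGetValue(key, out var existing))
            {
                // Key already cached: drop the old node, re-insert at the front below.
                _order.Remove(existing);
            }
            else if (_map.Count >= _capacity)
            {
                // Evict the least recently used entry (tail of the list).
                var lru = _order.Last;
                _order.RemoveLast();
                _map.Remove(lru.Value.Key);
            }

            var node = new LinkedListNode<(TKey Key, TValue Value)>((key, value));
            _order.AddFirst(node);
            _map[key] = node;
        }
    }
}
```

Option 1 would swap the Dictionary for a ConcurrentDictionary and take the lock only around the _order operations, and that split is exactly the part I'm unsure about.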