I am trying to implement a concurrent LRU-based cache. To do so, I somehow need to keep track of the least recently used objects.
Normally this is no problem, because you can use a doubly linked LRU structure with direct node references and get O(1) removal/insertion. However, since I want to avoid locking, I need to do something else.
I was thinking of adapting ConcurrentLinkedQueue a bit.
1. Make ConcurrentLinkedQueue.Node public (or at least package-private), so that a cache entry can hold a direct reference to its node and allow fast removal.
2. Whenever an item in the cache is accessed, get hold of the corresponding node in the LRU queue from the cache entry, and then:
   if (node.casItem(cacheValue, null)) {
       Node newNode = lruQueue.add(item);
       // update the volatile node reference in the cache entry to newNode
   }
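To illustrate the idea, here is a minimal sketch of that access path. It does not reuse ConcurrentLinkedQueue.Node (which is private); instead each "node" is simulated by an AtomicReference whose item is CASed to null on logical removal, and eviction skips nulled nodes. All class and method names (LruSketch, Entry, touch, evictLru) are hypothetical, not part of any library:

```java
import java.util.concurrent.ConcurrentLinkedQueue;
import java.util.concurrent.atomic.AtomicReference;

// Sketch: each queue element plays the role of a Node; a null item
// marks a node that was logically removed by an access.
class LruSketch<K> {
    private final ConcurrentLinkedQueue<AtomicReference<K>> queue =
            new ConcurrentLinkedQueue<>();

    static final class Entry<K> {
        volatile AtomicReference<K> node; // current LRU node for this key
        Entry(AtomicReference<K> n) { node = n; }
    }

    Entry<K> insert(K key) {
        AtomicReference<K> node = new AtomicReference<>(key);
        queue.add(node);
        return new Entry<>(node);
    }

    // On cache hit: logically remove the old node via CAS,
    // then append a fresh node at the MRU end.
    void touch(Entry<K> entry, K key) {
        AtomicReference<K> old = entry.node;
        if (old.compareAndSet(key, null)) {   // node.casItem(cacheValue, null)
            AtomicReference<K> fresh = new AtomicReference<>(key);
            queue.add(fresh);                 // lruQueue.add(item)
            entry.node = fresh;               // publish via volatile write
        }
    }

    // Pick an eviction candidate, skipping nodes nulled by touch().
    K evictLru() {
        AtomicReference<K> n;
        while ((n = queue.poll()) != null) {
            K k = n.get();
            if (k != null && n.compareAndSet(k, null)) return k;
        }
        return null;
    }
}
```

For example, after inserting "a" then "b" and touching "a", evictLru() yields "b" first, since "a" was moved to the tail. Note the same downside applies here: every touch allocates a new node.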
There are two main downsides to this idea:
1. I would need to maintain an almost identical copy of ConcurrentLinkedQueue.
2. I would need to create a new Node on every access to the cache.
Can anyone come up with something better/easier/faster?