class LRUCache[Key >: Null <: AnyRef, Value >: Null]
A least-recently-used cache for Key -> Value computations. It currently keeps the last 8 associations, but this can be changed to anywhere between 2 and 16 by changing LRUCache.Retained.
Implementation: We keep a ring of eight places, linked through the next data structure. The ring models a priority queue: last points to the last element of the queue, and next(last) to the first. Lookups compare keys sequentially from first to last. An element found by a successful lookup is promoted to the first position in the queue. Elements are evicted at the last position.
Enter key/value in cache at position last. As a side effect, sets last to lastButOne. If lastButOne was set by a preceding unsuccessful lookup for the same key, this means that the new element is now the first in the queue. If there was no preceding lookup, the element is inserted at an arbitrary position in the queue.
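The lookup/enter protocol described above can be sketched as follows. This is a minimal illustration, not the compiler's actual implementation: the class name RingLRU is hypothetical, the retained parameter stands in for LRUCache.Retained, and plain arrays replace whatever compact encoding the real cache uses for its next links. The sketch does show the key invariants from the description: next(last) is the first element, a hit is promoted to the front, and a miss records lastButOne so that a following enter for the same key lands at the front of the queue.

```scala
// Hypothetical sketch of the ring-based LRU scheme described above.
class RingLRU[Key >: Null <: AnyRef, Value >: Null](retained: Int = 8) {
  private val keys   = new Array[AnyRef](retained)
  private val values = new Array[AnyRef](retained)
  // next(i) is the ring position following i; next(last) is the first.
  private val next       = Array.tabulate(retained)(i => (i + 1) % retained)
  private var last       = retained - 1
  private var lastButOne = last  // remembered by an unsuccessful lookup

  def lookup(key: Key): Value = {
    var prev = last
    var cur  = next(last)                  // start at the first element
    while (true) {
      if (key == keys(cur)) {             // hit: promote cur to first
        if (prev != last) {               // (prev == last means already first)
          if (cur == last) last = prev    // last element: retreating last suffices
          else {
            next(prev) = next(cur)        // unlink cur from the ring ...
            next(cur)  = next(last)       // ... point it at the old first ...
            next(last) = cur              // ... and splice it in as new first
          }
        }
        return values(cur).asInstanceOf[Value]
      }
      if (cur == last) {                  // miss: remember insertion point
        lastButOne = prev
        return null
      }
      prev = cur
      cur  = next(cur)
    }
    null                                  // unreachable
  }

  // Overwrite the evicted slot at `last`; because last then becomes
  // lastButOne, an entry made right after a failed lookup for the same
  // key ends up first in the queue.
  def enter(key: Key, value: Value): Unit = {
    keys(last)   = key
    values(last) = value
    last = lastButOne
  }
}
```

A typical client follows the miss-then-enter pattern: call lookup(key), and on null compute the value and enter(key, value), so the freshly computed association sits at the front of the queue.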