LRUCache

Introduction

LRU (Least Recently Used) cache. It decides whether an object stays cached based on when the object was last used. When an object is accessed, it is placed in the cache; when the cache is full, the object that has gone unused for the longest time is evicted. This cache is based on LinkedHashMap: whenever a cached object is accessed, its key is moved to the head of the linked list. The algorithm is simple and very fast, and it has a clear advantage over FIFO in that frequently used objects are less likely to be evicted. Its drawback is that once the cache is full, access is no longer as fast.
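
The reordering-and-eviction idea can be sketched directly on top of the JDK's LinkedHashMap. The class below is only an illustration of the principle, not Hutool's actual LRUCache implementation; its name and capacity handling are made up for this example.

import java.util.LinkedHashMap;
import java.util.Map;

// Minimal LRU map: accessOrder = true makes every get()/put() move the entry
// to the end of the internal linked list, so the least recently used entry is
// always the eldest one and is evicted first.
public class SimpleLruMap<K, V> extends LinkedHashMap<K, V> {

    private final int capacity;

    public SimpleLruMap(int capacity) {
        super(capacity, 0.75f, true); // third argument enables access order
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        // Evict the least recently used entry once the capacity is exceeded.
        return size() > capacity;
    }
}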

Usage

Cache<String, String> lruCache = CacheUtil.newLRUCache(3);
// Alternatively, create it by instantiating the class directly:
// LRUCache<String, String> lruCache = new LRUCache<String, String>(3);
lruCache.put("key1", "value1", DateUnit.SECOND.getMillis() * 3);
lruCache.put("key2", "value2", DateUnit.SECOND.getMillis() * 3);
lruCache.put("key3", "value3", DateUnit.SECOND.getMillis() * 3);
lruCache.get("key1"); // Access time is pushed forward
lruCache.put("key4", "value4", DateUnit.SECOND.getMillis() * 3);

// The cache capacity is only 3, so adding a fourth element triggers the LRU rule:
// the least recently used entry (key2) is evicted
String value2 = lruCache.get("key2"); // null
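
The puts above also pass a 3-second timeout for each entry. Assuming Hutool's per-entry expiry works as described for its timed caches (expired entries simply miss on get), the remaining keys also return null once that period has elapsed:

Thread.sleep(3000);                    // wait past the 3-second timeout
String value4 = lruCache.get("key4");  // null: the entry has expired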