Overview

Source

The Hutool-cache module was originally inspired by jodd-cache (most of its logic is still consistent with jodd today) and provides a simple caching implementation. It is very useful for small projects with simple caching requirements.

Introduction

The Hutool-cache module provides implementations of several caching strategies:

FIFOCache

FIFO (first in first out) strategy. Elements are continuously added to the cache until it is full. When the cache is full, expired cache objects are cleaned up. If the cache is still full after cleanup, the first-in cache object (the object at the head of the linked list) is deleted.

Advantages: simple and fast.
Disadvantages: inflexible; it cannot guarantee that the most commonly used objects are always retained.
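
A minimal usage sketch, assuming the CacheUtil factory and the generic Cache interface from hutool-cache; the capacity and values below are purely illustrative:

    import cn.hutool.cache.Cache;
    import cn.hutool.cache.CacheUtil;

    public class FifoCacheDemo {
        public static void main(String[] args) {
            // Capacity of 3 entries, no global timeout (assumed factory method)
            Cache<String, String> cache = CacheUtil.newFIFOCache(3);

            cache.put("a", "1");
            cache.put("b", "2");
            cache.put("c", "3");
            cache.put("d", "4"); // cache is full: the first-in entry "a" should be evicted

            System.out.println(cache.containsKey("a")); // expected: false
            System.out.println(cache.get("d"));         // expected: 4
        }
    }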

LFUCache

LFU (least frequently used) strategy. Determines whether an object stays in the cache based on how often it is accessed (usage is measured by access counts). When the cache is full, expired objects are cleaned up first; if the cache is still full after cleanup, the object with the fewest accesses (the smallest access count) is removed, and the access counts of the remaining objects are decreased by this minimum count so that newly added objects can accumulate counts on a fair basis.
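
A sketch of the eviction behaviour described above, assuming the CacheUtil.newLFUCache factory; the keys and access pattern are made up for illustration:

    import cn.hutool.cache.Cache;
    import cn.hutool.cache.CacheUtil;

    public class LfuCacheDemo {
        public static void main(String[] args) {
            Cache<String, String> cache = CacheUtil.newLFUCache(3);

            cache.put("a", "1");
            cache.put("b", "2");
            cache.put("c", "3");

            // Raise the access counts of "a" and "b"; "c" is never read again
            cache.get("a");
            cache.get("a");
            cache.get("b");

            cache.put("d", "4"); // cache is full: "c" has the fewest accesses and should be evicted

            System.out.println(cache.containsKey("c")); // expected: false
            System.out.println(cache.containsKey("a")); // expected: true
        }
    }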

LRUCache

LRU (least recently used) strategy. Determines whether an object stays in the cache based on when it was last used. When an object is accessed, it is placed in the cache; when the cache is full, the object that has gone unused for the longest time is removed. This cache is based on LinkedHashMap, so each time a cached object is accessed its entry is moved to the most recently used end of the internal linked list. The algorithm is simple and very fast, and it has a significant advantage over FIFO in that frequently used objects are far less likely to be evicted. The disadvantage is that when the cache is full, access is no longer fast.
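
A short sketch of LRU eviction, again assuming the CacheUtil factory methods; the recently touched key survives while the least recently used one is evicted:

    import cn.hutool.cache.Cache;
    import cn.hutool.cache.CacheUtil;

    public class LruCacheDemo {
        public static void main(String[] args) {
            Cache<String, String> cache = CacheUtil.newLRUCache(3);

            cache.put("a", "1");
            cache.put("b", "2");
            cache.put("c", "3");

            cache.get("a"); // touching "a" makes it the most recently used entry

            cache.put("d", "4"); // cache is full: "b", the least recently used entry, should be evicted

            System.out.println(cache.containsKey("b")); // expected: false
            System.out.println(cache.containsKey("a")); // expected: true
        }
    }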

TimedCache

Timed caching. Defines an expiration time for cached objects. When an object exceeds its expiration time, it is cleaned up. This cache has no capacity limit, and objects are only removed after they expire.
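
A sketch of timed caching, assuming CacheUtil.newTimedCache and the schedulePrune/cancelPruneSchedule helpers on TimedCache; all timeouts below are arbitrary example values in milliseconds:

    import cn.hutool.cache.CacheUtil;
    import cn.hutool.cache.impl.TimedCache;

    public class TimedCacheDemo {
        public static void main(String[] args) throws InterruptedException {
            // Default expiration of 4 seconds, no capacity limit
            TimedCache<String, String> cache = CacheUtil.newTimedCache(4000);

            cache.put("a", "1");        // uses the default 4 s timeout
            cache.put("b", "2", 1000);  // per-entry timeout of 1 second
            cache.schedulePrune(500);   // optionally prune expired entries every 500 ms

            Thread.sleep(2000);

            System.out.println(cache.get("a")); // expected: 1 (still within its timeout)
            System.out.println(cache.get("b")); // expected: null (expired after 1 second)

            cache.cancelPruneSchedule();
        }
    }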

WeakCache

Weak reference caching. For a given key, the presence of its mapping does not prevent the garbage collector from discarding the key, that is, making it finalizable, finalized, and then reclaimed. When a key is discarded, its entry is effectively removed from the cache. This class uses WeakHashMap as its underlying implementation, so cleanup of the cache depends on the JVM's garbage collection.
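
A sketch of GC-driven eviction, assuming the CacheUtil.newWeakCache factory (a timeout of 0 is assumed here to mean no additional time-based expiry); whether the entry is actually reclaimed depends on the JVM's garbage collector:

    import cn.hutool.cache.Cache;
    import cn.hutool.cache.CacheUtil;

    public class WeakCacheDemo {
        public static void main(String[] args) {
            // Timeout of 0: removal is driven by garbage collection of the keys
            Cache<Object, String> cache = CacheUtil.newWeakCache(0);

            Object key = new Object();
            cache.put(key, "value");
            System.out.println(cache.get(key)); // value, while the key is strongly reachable

            key = null;   // drop the last strong reference to the key
            System.gc();  // after a GC cycle the entry may be reclaimed by the cache
        }
    }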

FileCache

FileCache is a standalone cache that caches small files in memory as byte[] to reduce file access and relieve the performance problems caused by frequent file reads.

The main implementations are listed below; a brief usage sketch follows the list:

  • LFUFileCache
  • LRUFileCache
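
A usage sketch assuming the LFUFileCache constructor taking a total capacity in bytes, a maximum single-file size in bytes, and a timeout in milliseconds; the file path is purely illustrative:

    import java.io.File;

    import cn.hutool.cache.file.LFUFileCache;

    public class FileCacheDemo {
        public static void main(String[] args) {
            // Total capacity of 1 MB, files larger than 500 KB are not cached,
            // cached content expires after 2 seconds
            LFUFileCache cache = new LFUFileCache(1000000, 500000, 2000);

            // Hypothetical path: repeated reads of the same small file are served from memory
            byte[] content = cache.getFileContent(new File("config/app.properties"));
            System.out.println(content.length);
        }
    }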