LRU Cache
"If you have a tight loop, LRU is going to be perfect as long as the loop fits in cache, but it's going to cause a miss every time if the loop doesn't fit. A random eviction policy degrades gracefully as the loop gets too big." - Caches: LRU v. random / HN
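To make the claim concrete, here is a small simulation (my own sketch; the cache size and loop length are arbitrary): a cyclic loop over one more key than the cache holds gives LRU a near-zero hit rate, while random eviction still hits roughly 98% of the time.

```python
import random
from collections import OrderedDict

def lru_hit_rate(keys, capacity):
    """Hit rate of an LRU cache over a sequence of key accesses."""
    cache = OrderedDict()
    hits = 0
    for k in keys:
        if k in cache:
            hits += 1
            cache.move_to_end(k)           # mark as most recently used
        else:
            if len(cache) >= capacity:
                cache.popitem(last=False)  # evict least recently used
            cache[k] = True
    return hits / len(keys)

def random_hit_rate(keys, capacity, seed=0):
    """Hit rate of a cache that evicts a uniformly random entry."""
    rng = random.Random(seed)
    cache = set()
    hits = 0
    for k in keys:
        if k in cache:
            hits += 1
        else:
            if len(cache) >= capacity:
                cache.remove(rng.choice(list(cache)))  # evict a random entry
            cache.add(k)
    return hits / len(keys)

if __name__ == "__main__":
    capacity = 100
    # a tight loop over one more key than fits in the cache
    loop = list(range(capacity + 1)) * 1000
    print("LRU   :", lru_hit_rate(loop, capacity))     # ~0.0: every access misses
    print("random:", random_hit_rate(loop, capacity))  # degrades gracefully, ~0.98
```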
see also
- The power of two random choices / HN / twitter - shows that considering only the best of two random picks for cache eviction gives pretty good results; the same idea works for other jobs, like load balancing among N servers. (see the sketch after this list)
- Linux Kernel: The multi-generational LRU / HN
- LeetCode 146. LRU Cache (a minimal implementation follows this list)
- Why Aren’t We Sieve-Ing? - SIEVE is an eviction algorithm, a way of deciding which cached item to toss out when a new one needs to be put in.
- Caffeine / HN - a high-performance Java caching library
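A minimal sketch of the "best of two random choices" eviction idea from the list above (my own illustration, not code from the linked post): on a miss, sample two resident keys and evict the one accessed least recently, which approximates LRU without keeping a full recency list.

```python
import random

class TwoChoiceCache:
    """Toy cache: on eviction, sample two random entries and evict the
    one that was accessed least recently (approximate LRU)."""

    def __init__(self, capacity, seed=0):
        self.capacity = capacity
        self.data = {}        # key -> value
        self.last_used = {}   # key -> logical access time
        self.clock = 0
        self.rng = random.Random(seed)

    def get(self, key):
        self.clock += 1
        if key in self.data:
            self.last_used[key] = self.clock
            return self.data[key]
        return None

    def put(self, key, value):
        self.clock += 1
        if key not in self.data and len(self.data) >= self.capacity:
            # best of two random picks: evict the older of the two sampled keys
            a, b = self.rng.sample(list(self.data), 2)
            victim = a if self.last_used[a] <= self.last_used[b] else b
            del self.data[victim]
            del self.last_used[victim]
        self.data[key] = value
        self.last_used[key] = self.clock
```

And for the LeetCode 146 entry, a compact LRU cache with O(1) get/put; this sketch leans on Python's OrderedDict rather than the hand-rolled hash map plus doubly linked list the problem is usually solved with.

```python
from collections import OrderedDict

class LRUCache:
    """LeetCode 146-style LRU cache: get/put in O(1) amortized."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self.cache = OrderedDict()  # key -> value, ordered by recency

    def get(self, key: int) -> int:
        if key not in self.cache:
            return -1
        self.cache.move_to_end(key)         # mark as most recently used
        return self.cache[key]

    def put(self, key: int, value: int) -> None:
        if key in self.cache:
            self.cache.move_to_end(key)
        self.cache[key] = value
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)  # evict least recently used
```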
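Both sketches above are toy illustrations of the linked ideas, not the algorithms described in the referenced posts or libraries.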
Written on April 27, 2021, last updated on April 27, 2024
algorithm
cache
leetcode
random