One Pager Cheat Sheet
- This tutorial surveyed LFU caching and several ways to implement it.
- The LFU Cache algorithm measures access frequency and replaces the block with the lowest frequency when the cache is full.
- An LFU Cache evicts blocks based on frequency, regardless of recent access, while an LRU Cache evicts blocks based on least recent access.
- The task is to design a data structure for an LFU Cache with `put` and `get` operations and a constructor that sets the initial `capacity`; when the cache is full, it replaces the least frequently used key, breaking ties with the least recently used key.
- The LFU cache evicts the least frequently used memory block, regardless of when it was last accessed, by keeping an access counter for each key.
- An LFU Cache can be implemented using a HashMap, giving O(1) access time but O(n) time to replace entries when the cache is full.
- The main advantage of a HashMap is that it provides constant-time access to items, regardless of the size of the map.
- An LFU Cache can also be implemented using a Singly Linked List, which gives O(1) access to the least recently used element but a worst-case update cost of O(n).
- An LFU Cache can be implemented using a Doubly Linked List, taking O(1) time for insertions, updates, and lookups.
- A Doubly Linked List can locate the least recently accessed entry with the lowest frequency in O(1) time, whereas a Singly Linked List requires O(n) time.
- We implemented an LFU cache using a HashMap, a Singly Linked List, and a Doubly Linked List.
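The HashMap-based approach above (O(1) `get`, O(n) eviction) can be sketched roughly as follows. This is a minimal illustration, not the tutorial's exact code; the class and field names are made up, and an `OrderedDict` stands in for recency tracking when breaking frequency ties.

```python
from collections import OrderedDict


class SimpleLFUCache:
    """Sketch of an LFU cache: O(1) get, O(n) scan on eviction."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.store = {}             # key -> value
        self.freq = {}              # key -> access count
        self.order = OrderedDict()  # keys in recency order (oldest first)

    def get(self, key):
        if key not in self.store:
            return -1
        self.freq[key] += 1
        self.order.move_to_end(key)  # mark as most recently used
        return self.store[key]

    def put(self, key, value):
        if self.capacity <= 0:
            return
        if key in self.store:
            self.store[key] = value
            self.freq[key] += 1
            self.order.move_to_end(key)
            return
        if len(self.store) >= self.capacity:
            # O(n) scan: lowest frequency wins; min() returns the first
            # (i.e. least recently used) key among ties.
            victim = min(self.order, key=lambda k: self.freq[k])
            del self.store[victim], self.freq[victim], self.order[victim]
        self.store[key] = value
        self.freq[key] = 1
        self.order[key] = None


cache = SimpleLFUCache(2)
cache.put(1, 1)
cache.put(2, 2)
cache.get(1)     # bumps key 1's frequency to 2
cache.put(3, 3)  # full: evicts key 2 (frequency 1)
```

The eviction scan is what makes this variant O(n); every other operation is a constant number of hash lookups.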
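The all-O(1) design the cheat sheet attributes to the Doubly Linked List can be sketched by grouping keys into per-frequency buckets kept in recency order (Python's `OrderedDict` is itself backed by a doubly linked list, so it plays that role here). This is a hedged sketch under those assumptions, not the tutorial's exact implementation; `min_freq` tracks the lowest frequency currently present so eviction needs no scan.

```python
from collections import defaultdict, OrderedDict


class LFUCache:
    """LFU cache with O(1) get/put via frequency buckets."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.key_to_val = {}
        self.key_to_freq = {}
        # freq -> keys at that frequency, least recently used first
        self.freq_to_keys = defaultdict(OrderedDict)
        self.min_freq = 0

    def _touch(self, key):
        """Move key from its frequency bucket to the next one up."""
        f = self.key_to_freq[key]
        del self.freq_to_keys[f][key]
        if not self.freq_to_keys[f]:
            del self.freq_to_keys[f]
            if self.min_freq == f:
                self.min_freq = f + 1
        self.key_to_freq[key] = f + 1
        self.freq_to_keys[f + 1][key] = None

    def get(self, key):
        if key not in self.key_to_val:
            return -1
        self._touch(key)
        return self.key_to_val[key]

    def put(self, key, value):
        if self.capacity <= 0:
            return
        if key in self.key_to_val:
            self.key_to_val[key] = value
            self._touch(key)
            return
        if len(self.key_to_val) >= self.capacity:
            # Evict the least recently used key in the lowest-frequency bucket.
            victim, _ = self.freq_to_keys[self.min_freq].popitem(last=False)
            del self.key_to_val[victim], self.key_to_freq[victim]
            if not self.freq_to_keys[self.min_freq]:
                del self.freq_to_keys[self.min_freq]
        self.key_to_val[key] = value
        self.key_to_freq[key] = 1
        self.freq_to_keys[1][key] = None
        self.min_freq = 1  # a fresh key always has the lowest frequency
```

Every operation touches only a constant number of hash-map and linked-list nodes, which is how the doubly-linked structure removes the O(n) eviction scan.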

