Least Recently Used (LRU) caching is a cache replacement policy that evicts the least recently used items first. It is commonly applied to memory management and disk I/O, much as an operating system manages its page cache. The goal is to keep 'hot' data in the cache and reduce expensive fetch operations.
Consider a finance application that receives frequent stock market updates. Some stocks (e.g., AAPL, TSLA) might be accessed far more often than others (e.g., XYZ). Here, the LRU algorithm keeps the frequently accessed data 'hot' in the cache. When the cache becomes full, the algorithm evicts the 'coldest' data, i.e., the data that hasn't been accessed in a while.
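Before building our own cache, it's worth noting that Python's standard library ships this exact policy as the functools.lru_cache decorator. The sketch below is a minimal illustration of the scenario above; fetch_quote and its hard-coded prices are hypothetical stand-ins for a real (expensive) data feed.

```python
from functools import lru_cache

@lru_cache(maxsize=2)  # keep at most two quotes hot in the cache
def fetch_quote(symbol):
    # In a real application this would be an expensive network or disk fetch.
    print(f'fetching {symbol}...')
    return {'AAPL': 120, 'TSLA': 600, 'XYZ': 20}[symbol]

fetch_quote('AAPL')   # miss: fetches
fetch_quote('AAPL')   # hit: served from cache, no fetch
fetch_quote('TSLA')   # miss: fetches
fetch_quote('XYZ')    # miss: cache is full, so 'AAPL' (least recently used) is evicted
print(fetch_quote.cache_info())
```

The cache_info() call reports hits, misses, and the current cache size, which makes the eviction behavior easy to observe.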
To implement an LRU cache in Python ourselves, one can use collections.OrderedDict. This data structure remembers insertion order, which makes it easy to decide which item to evict when the cache is full: the item at the front (first inserted, hence least recently used) goes first. To keep an item hot, we move it to the end of the OrderedDict whenever it is accessed.
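The two OrderedDict operations described above can be sketched in isolation: move_to_end() refreshes an item's recency, and popitem(last=False) removes the oldest item.

```python
from collections import OrderedDict

od = OrderedDict([('AAPL', 120), ('TSLA', 600), ('XYZ', 20)])
od.move_to_end('AAPL')    # 'AAPL' is now the most recently used item
print(list(od))           # ['TSLA', 'XYZ', 'AAPL']
od.popitem(last=False)    # evicts 'TSLA', now the coldest item
print(list(od))           # ['XYZ', 'AAPL']
```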
Let's look at some basic code that demonstrates how this might be done.
from collections import OrderedDict

class LRUCache(OrderedDict):
    def __init__(self, capacity):
        super().__init__()
        self.capacity = capacity

    def get(self, key):
        if key not in self:
            return -1
        # Accessing the key makes it the most recently used item.
        self.move_to_end(key)
        return self[key]

    def put(self, key, value):
        if key in self:
            self.move_to_end(key)
        self[key] = value
        if len(self) > self.capacity:
            # Evict the least recently used item (the front of the OrderedDict).
            self.popitem(last=False)

if __name__ == '__main__':
    cache = LRUCache(2)
    cache.put('AAPL', 120)
    cache.put('TSLA', 600)
    print(cache.get('AAPL'))  # output: 120
    cache.put('XYZ', 20)      # this removes 'TSLA' from the cache as it's the least recently used
    print(cache.get('TSLA'))  # output: -1