Implement the Least Recently Used (LRU) cache class LRUCache. The class should support the following operations:

- LRUCache(int capacity) Initializes the LRU cache of size capacity.
- int get(int key) Returns the value corresponding to the key if the key exists, otherwise returns -1.
- void put(int key, int value) Updates the value of the key if the key exists. Otherwise, adds the key-value pair to the cache. If adding the new pair causes the cache to exceed its capacity, remove the least recently used key.

A key is considered used if a get or a put operation is called on it.

Ensure that get and put each run in O(1) average time complexity.
Example 1:
Input:
["LRUCache", [2], "put", [1, 10], "get", [1], "put", [2, 20], "put", [3, 30], "get", [2], "get", [1]]
Output:
[null, null, 10, null, null, 20, -1]
Explanation:
LRUCache lRUCache = new LRUCache(2);
lRUCache.put(1, 10); // cache: {1=10}
lRUCache.get(1); // return 10
lRUCache.put(2, 20); // cache: {1=10, 2=20}
lRUCache.put(3, 30); // cache: {2=20, 3=30}, key=1 was evicted
lRUCache.get(2); // return 20
lRUCache.get(1); // return -1 (not found)

Constraints:

- 1 <= capacity <= 100
- 0 <= key <= 1000
- 0 <= value <= 1000
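The bracketed Input line above encodes a sequence of calls: method names alternate with argument lists. A small driver can replay it against any implementation. This is an illustrative sketch, not part of the required solution; the `run` helper is my own name, and the cache inside is a compact OrderedDict-backed version included only so the snippet runs on its own.

```python
from collections import OrderedDict

class LRUCache:
    # Compact OrderedDict-backed cache, included only to make the demo self-contained.
    def __init__(self, capacity: int):
        self.cache, self.cap = OrderedDict(), capacity

    def get(self, key: int) -> int:
        if key not in self.cache:
            return -1
        self.cache.move_to_end(key)
        return self.cache[key]

    def put(self, key: int, value: int) -> None:
        if key in self.cache:
            self.cache.move_to_end(key)
        self.cache[key] = value
        if len(self.cache) > self.cap:
            self.cache.popitem(last=False)

def run(ops):
    # ops alternates method names and argument lists, as in the Input line above.
    obj, out = None, []
    for name, args in zip(ops[::2], ops[1::2]):
        if name == "LRUCache":
            obj, result = LRUCache(*args), None
        else:
            result = getattr(obj, name)(*args)
        out.append(result)
    return out

print(run(["LRUCache", [2], "put", [1, 10], "get", [1], "put", [2, 20],
           "put", [3, 30], "get", [2], "get", [1]]))
# [None, None, 10, None, None, 20, -1]  (None prints where the expected output shows null)
```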
You should aim for a solution with O(1) time for each put() and get() function call and an overall space of O(n), where n is the capacity of the LRU cache.
Can you think of a data structure for storing data in key-value pairs? Maybe a hash-based data structure with unique keys.
We can use a hash map, which takes O(1) time to get and put values. But how do you handle the remove-the-least-recently-used criterion when a key is updated by put() or accessed by get()? Can you think of a data structure to store the order of values?
A brute-force solution would involve maintaining the order of key-value pairs in an array list, performing operations by iterating through the list to erase and insert these key-value pairs. However, this would result in an O(n) time complexity. Can you think of a data structure that allows removing and reinserting elements in O(1) time?
We can use a doubly-linked list, which allows us to remove a node from the list when we have the address of that node. Can you think of a way to store these addresses so that we can efficiently remove or update a key when needed?
We can use a doubly linked list where key-value pairs are stored as nodes, with the least recently used (LRU) node at the head and the most recently used (MRU) node at the tail. Whenever a key is accessed using get() or put(), we remove the corresponding node and reinsert it at the tail. When the cache reaches its capacity, we remove the LRU node from the head of the list. Additionally, we use a hash map to store each key and the corresponding address of its node, enabling efficient operations in O(1) time.
Before attempting this problem, you should be comfortable with hash maps and linked lists.
We store all (key, value) pairs in a list.

To follow LRU (Least Recently Used) behavior:

Initialization

- Create an empty list cache and store the capacity.

get(key)

- Scan the list for key. If found, move the pair to the end (most recently used) and return its value; otherwise return -1.

put(key, value)

- If key exists, update its value and move the pair to the end.
- Otherwise, if the cache is at capacity, remove the first pair (least recently used), then append [key, value] to the end.

```python
class LRUCache:
    def __init__(self, capacity: int):
        self.cache = []  # list of [key, value] pairs, LRU first
        self.capacity = capacity

    def get(self, key: int) -> int:
        for i in range(len(self.cache)):
            if self.cache[i][0] == key:
                tmp = self.cache.pop(i)
                self.cache.append(tmp)  # move to the end (most recently used)
                return tmp[1]
        return -1

    def put(self, key: int, value: int) -> None:
        for i in range(len(self.cache)):
            if self.cache[i][0] == key:
                tmp = self.cache.pop(i)
                tmp[1] = value
                self.cache.append(tmp)
                return
        if self.capacity == len(self.cache):
            self.cache.pop(0)  # evict the least recently used pair
        self.cache.append([key, value])
```

We want all operations to be O(1) while still following LRU (Least Recently Used) rules.
To do that, we combine:

- a hash map, for O(1) lookup of a key's node, and
- a doubly linked list, for O(1) reordering and eviction.

We keep:

- the most recently used node on the right side.
- the least recently used node on the left side.

Whenever we:

- get a key, we move its node to the right (most recently used).
- put an existing key, we update it and move its node to the right.
- put a new key, we insert its node at the right.

Dummy left and right nodes make insert/remove logic cleaner.

Data Structures

- A hash map cache that maps key -> node.
- A left dummy node: before the least recently used node.
- A right dummy node: after the most recently used node.

Helper: remove(node)

- Unlink node from the list by connecting its prev and next nodes.

Helper: insert(node)

- Insert node just before right (mark as most recently used).

get(key)

- If key is not in cache, return -1.
- Otherwise, remove the node and reinsert it just before right (mark as recently used), then return its value.

put(key, value)

- If key already exists: remove the old node cache[key].
- Create a new node and insert it just before right.
- If len(cache) > capacity: remove the node right after left (this is the LRU) and delete its key from the hash map.

This way, both get and put run in O(1) time, and the LRU policy is always maintained.
```python
class Node:
    def __init__(self, key, val):
        self.key, self.val = key, val
        self.prev = self.next = None

class LRUCache:
    def __init__(self, capacity: int):
        self.cap = capacity
        self.cache = {}  # map key to node
        # left = LRU-side dummy, right = MRU-side dummy
        self.left, self.right = Node(0, 0), Node(0, 0)
        self.left.next, self.right.prev = self.right, self.left

    def remove(self, node):
        # unlink node by connecting its prev and next neighbors
        prev, nxt = node.prev, node.next
        prev.next, nxt.prev = nxt, prev

    def insert(self, node):
        # insert node just before the right dummy (most recently used)
        prev, nxt = self.right.prev, self.right
        prev.next = nxt.prev = node
        node.next, node.prev = nxt, prev

    def get(self, key: int) -> int:
        if key in self.cache:
            self.remove(self.cache[key])
            self.insert(self.cache[key])
            return self.cache[key].val
        return -1

    def put(self, key: int, value: int) -> None:
        if key in self.cache:
            self.remove(self.cache[key])
        self.cache[key] = Node(key, value)
        self.insert(self.cache[key])

        if len(self.cache) > self.cap:
            # evict the LRU entry from both the list and the map
            lru = self.left.next
            self.remove(lru)
            del self.cache[lru.key]
```

Many languages provide a built-in ordered map / dictionary that keeps entries in order and can move an entry to the end, or pop from either end, in O(1) time.
This is perfect for an LRU cache:

- Whenever a key is used (get or put), we mark it as most recently used by moving it to the "end" of the order.
- Whenever we exceed capacity, we evict from the "front" of the order, which is the least recently used key.

So the ordered map itself handles both the key-value storage and the usage order.

This gives a clean and concise LRU implementation using library support.
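In Python, for instance, collections.OrderedDict supplies these primitives directly; a small illustration (not yet the solution itself):

```python
from collections import OrderedDict

od = OrderedDict([(1, 10), (2, 20), (3, 30)])  # order: 1, 2, 3
od.move_to_end(1)                # key 1 becomes most recently used
print(list(od))                  # [2, 3, 1]
print(od.popitem(last=False))    # (2, 20) -- popped from the least recently used end
```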
Initialization

- Create an empty ordered map cache and store the capacity in cap.

get(key)

- If key is not in cache, return -1.
- Move key to the "most recent" position in the ordered map and return its value.

put(key, value)

- If key is already in cache: move key to the "most recent" position.
- Insert (or update) (key, value) in cache at the "most recent" position.
- If the size of cache is now greater than cap: evict the "least recent" entry.

This uses the built-in ordered map to achieve LRU behavior with O(1) average time for both get and put.
```python
from collections import OrderedDict

class LRUCache:
    def __init__(self, capacity: int):
        self.cache = OrderedDict()
        self.cap = capacity

    def get(self, key: int) -> int:
        if key not in self.cache:
            return -1
        self.cache.move_to_end(key)  # mark as most recently used
        return self.cache[key]

    def put(self, key: int, value: int) -> None:
        if key in self.cache:
            self.cache.move_to_end(key)
        self.cache[key] = value
        if len(self.cache) > self.cap:
            self.cache.popitem(last=False)  # evict the least recently used
```

A critical LRU requirement is that get() operations also update recency. Many implementations correctly update order on put() but forget that accessing a key via get() should also move it to the most recently used position. This breaks the LRU invariant and causes wrong evictions.
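The consequence is easy to see in a capacity-2 simulation, sketched here with an ordered map standing in for the cache: after key 1 is read, key 2 must be the eviction victim.

```python
from collections import OrderedDict

order = OrderedDict([(1, 10), (2, 20)])  # capacity-2 cache; key 2 is most recent
order.move_to_end(1)     # a correct get(1) refreshes key 1's recency
order[3] = 30            # a third key now forces an eviction
evicted_key, _ = order.popitem(last=False)
print(evicted_key)       # 2 -- not key 1, which was just read
```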
When implementing the doubly linked list approach, pointer manipulation errors are common. When removing a node, you must update both prev.next and next.prev. When inserting, you must update four pointers: the new node's prev and next, plus the adjacent nodes' pointers. Missing any of these updates corrupts the list structure.
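The four updates can be checked in isolation. A minimal standalone sketch (the Node class mirrors the one in the solution above):

```python
class Node:
    def __init__(self, key, val):
        self.key, self.val = key, val
        self.prev = self.next = None

# Empty list: just the two dummies linked to each other.
left, right = Node(0, 0), Node(0, 0)
left.next, right.prev = right, left

# Inserting before the right dummy touches exactly four pointers:
node = Node(1, 10)
prev, nxt = right.prev, right
prev.next = node   # 1. old predecessor points forward to the new node
node.prev = prev   # 2. new node points back to its predecessor
node.next = nxt    # 3. new node points forward to the right dummy
nxt.prev = node    # 4. right dummy points back to the new node

print(left.next is node and right.prev is node)  # True
```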
When evicting the least recently used item, you need to remove it from both the linked list and the hash map. If your list nodes only store values (not keys), you cannot efficiently find and remove the corresponding hash map entry. Always store the key in each list node to enable O(1) eviction.