1. Brute Force

Intuition

We store all (key, value) pairs in a list.
To follow LRU (Least Recently Used) behavior:

  • Whenever we access a key, we move it to the end of the list (most recently used).
  • When inserting a new key:
    • If the key already exists → update its value and move it to the end.
    • If the cache is full → remove the first element (least recently used).
    • Then add the new key at the end.
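
For example, with capacity 2 (list shown from least to most recently used):

  put(1, 1) → [[1, 1]]
  put(2, 2) → [[1, 1], [2, 2]]
  get(1)    → returns 1, list becomes [[2, 2], [1, 1]]
  put(3, 3) → cache is full, evict [2, 2] → [[1, 1], [3, 3]]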

Algorithm

  1. Initialization

    • Store the cache as a list.
    • Store the capacity.
  2. get(key)

    • Loop through the list to find the key.
    • If found:
      • Remove it from its current position.
      • Append it to the end (mark as recently used).
      • Return its value.
    • If not found, return -1.
  3. put(key, value)

    • Look for the key in the list.
      • If found:
        • Remove it.
        • Update the value.
        • Append it to the end.
        • Return.
    • If not found:
      • If the cache is full, remove the first element.
      • Append [key, value] to the end.

class LRUCache:

    def __init__(self, capacity: int):
        self.cache = []
        self.capacity = capacity

    def get(self, key: int) -> int:
        # Linear scan for the key.
        for i in range(len(self.cache)):
            if self.cache[i][0] == key:
                # Move the pair to the end to mark it most recently used.
                tmp = self.cache.pop(i)
                self.cache.append(tmp)
                return tmp[1]
        return -1

    def put(self, key: int, value: int) -> None:
        # If the key exists, update it and mark it most recently used.
        for i in range(len(self.cache)):
            if self.cache[i][0] == key:
                tmp = self.cache.pop(i)
                tmp[1] = value
                self.cache.append(tmp)
                return

        # Evict the least recently used pair (front of the list) if full.
        if self.capacity == len(self.cache):
            self.cache.pop(0)

        self.cache.append([key, value])
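
A short runnable trace of the class above (the operation sequence is illustrative, not from the problem statement):

cache = LRUCache(2)
cache.put(1, 1)        # cache: [[1, 1]]
cache.put(2, 2)        # cache: [[1, 1], [2, 2]]
print(cache.get(1))    # 1; list becomes [[2, 2], [1, 1]]
cache.put(3, 3)        # full: evicts key 2 -> [[1, 1], [3, 3]]
print(cache.get(2))    # -1 (evicted)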

Time & Space Complexity

  • Time complexity: O(n) for each put() and get() operation, where n is the cache capacity.
  • Space complexity: O(n)

2. Doubly Linked List

Intuition

We want all operations to be O(1) while still following LRU (Least Recently Used) rules.

To do that, we combine:

  1. Hash Map → quickly find a node by its key in O(1).
  2. Doubly Linked List → quickly move nodes to the most recently used position and remove the least recently used node from the other end in O(1).

We keep:

  • The most recently used node near the right side.
  • The least recently used node near the left side.

Whenever we:

  • Get a key: move that node to the right (most recently used).
  • Put a key:
    • If it exists: update value and move it to the right.
    • If it’s new:
      • If at capacity: remove the leftmost real node (LRU).
      • Insert the new node at the right.

Dummy left and right nodes make insert/remove logic cleaner.
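
A minimal sketch of the layout (the sentinels never hold real data):

left <-> (LRU) <-> ... <-> (MRU) <-> right

Empty cache:      left <-> right
After put(1, 1):  left <-> (1, 1) <-> right
After put(2, 2):  left <-> (1, 1) <-> (2, 2) <-> right
After get(1):     left <-> (2, 2) <-> (1, 1) <-> right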


Algorithm

  1. Data Structures

    • A hash map cache that maps key → node.
    • A doubly linked list with:
      • left dummy: before the least recently used node.
      • right dummy: after the most recently used node.
  2. Helper: remove(node)

    • Unlink node from the list by connecting its prev and next nodes.
  3. Helper: insert(node)

    • Insert node just before right (mark as most recently used).
  4. get(key)

    • If key not in cache, return -1.
    • Otherwise:
      • Remove its node from the list.
      • Insert it again near right (mark as recently used).
      • Return the node’s value.
  5. put(key, value)

    • If key already exists:
      • Remove its old node from the list.
    • Create or update the node and store it in cache[key].
    • Insert the node near right.
    • If len(cache) > capacity:
      • Take the node right after left (this is LRU).
      • Remove it from the list.
      • Delete its key from the hash map.

This way, both get and put run in O(1) time, and the LRU policy is always maintained.

class Node:
    def __init__(self, key, val):
        self.key, self.val = key, val
        self.prev = self.next = None

class LRUCache:
    def __init__(self, capacity: int):
        self.cap = capacity
        self.cache = {}  # map key to node

        # Dummy sentinels: left sits before the LRU node, right after the MRU node.
        self.left, self.right = Node(0, 0), Node(0, 0)
        self.left.next, self.right.prev = self.right, self.left

    def remove(self, node):
        # Unlink node by connecting its neighbors to each other.
        prev, nxt = node.prev, node.next
        prev.next, nxt.prev = nxt, prev

    def insert(self, node):
        # Link node just before the right sentinel (most recently used position).
        prev, nxt = self.right.prev, self.right
        prev.next = nxt.prev = node
        node.next, node.prev = nxt, prev

    def get(self, key: int) -> int:
        if key in self.cache:
            self.remove(self.cache[key])
            self.insert(self.cache[key])
            return self.cache[key].val
        return -1

    def put(self, key: int, value: int) -> None:
        if key in self.cache:
            self.remove(self.cache[key])
        self.cache[key] = Node(key, value)
        self.insert(self.cache[key])

        if len(self.cache) > self.cap:
            # Evict the least recently used node: the one right after the left sentinel.
            lru = self.left.next
            self.remove(lru)
            del self.cache[lru.key]
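
The same illustrative trace as in the brute force section, now with every step in O(1):

cache = LRUCache(2)
cache.put(1, 1)        # left <-> (1,1) <-> right
cache.put(2, 2)        # left <-> (1,1) <-> (2,2) <-> right
print(cache.get(1))    # 1; node 1 moves next to right
cache.put(3, 3)        # over capacity: evicts node 2 (left.next)
print(cache.get(2))    # -1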

Time & Space Complexity

  • Time complexity: O(1) for each put() and get() operation.
  • Space complexity: O(n)

3. Built-In Data Structure

Intuition

Many languages provide a built-in ordered map / dictionary that:

  • Stores key–value pairs
  • Keeps track of the order in which keys were inserted or recently updated

This is perfect for an LRU cache:

  • When we access a key (get or put), we mark it as most recently used by moving it to the “end” of the order.
  • When the cache exceeds capacity, we remove the key that is at the front of the order (the least recently used).

So the ordered map itself handles:

  • Fast lookups (like a normal hash map)
  • Fast updates of usage order (moving keys to the end)
  • Fast removal of the least recently used key (from the front)

This gives a clean and concise LRU implementation using library support.
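
In Python, collections.OrderedDict provides exactly these operations; a minimal sketch of the two primitives used in the code below:

from collections import OrderedDict

od = OrderedDict()
od["a"] = 1
od["b"] = 2
od.move_to_end("a")      # "a" is now the newest (most recently used) key
od.popitem(last=False)   # pops ("b", 2), the oldest entry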


Algorithm

  1. Initialization

    • Create an ordered map cache.
    • Store the maximum capacity cap.
  2. get(key)

    • If key is not in cache, return -1.
    • Otherwise:
      • Move key to the “most recent” position in the ordered map.
      • Return the associated value.
  3. put(key, value)

    • If key is already in cache:
      • Update its value.
      • Move key to the “most recent” position.
    • If key is not in cache:
      • Insert (key, value) into cache at the “most recent” position.
    • If the size of cache is now greater than cap:
      • Remove the least recently used key (the one at the “oldest” position in the ordered map).

This uses the built-in ordered map to achieve LRU behavior with O(1) average time for both get and put.

from collections import OrderedDict

class LRUCache:

    def __init__(self, capacity: int):
        self.cache = OrderedDict()
        self.cap = capacity

    def get(self, key: int) -> int:
        if key not in self.cache:
            return -1
        # Mark the key as most recently used before returning.
        self.cache.move_to_end(key)
        return self.cache[key]

    def put(self, key: int, value: int) -> None:
        if key in self.cache:
            self.cache.move_to_end(key)
        self.cache[key] = value

        if len(self.cache) > self.cap:
            # Pop from the front: the least recently used key.
            self.cache.popitem(last=False)
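
The same trace once more, relying entirely on the ordered dictionary:

cache = LRUCache(2)
cache.put(1, 1)
cache.put(2, 2)
print(cache.get(1))    # 1; key 1 moved to the most recent end
cache.put(3, 3)        # exceeds capacity: key 2 is popped
print(cache.get(2))    # -1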

Time & Space Complexity

  • Time complexity: O(1) for each put() and get() operation.
  • Space complexity: O(n)