LRU Cache

How to combine a hash map with a doubly linked list for fast lookup and fast recency updates.

Implement an LRU cache with:

  • get(key): returns the value if the key exists, otherwise -1
  • put(key, value): inserts or updates the key

When capacity is exceeded, remove the least recently used key.

Both get and put must run in O(1).

In the original problem, this is usually a class. In the SeniorPath playground, use:

  • operations: the list of operations
  • values: the arguments for each operation

And return an array with the results. Operations that do not return a value should return null.

Example:

Input:
  operations = ["LRUCache","put","put","get","put","get","put","get","get","get"]
  values     = [[2],[1,1],[2,2],[1],[3,3],[2],[4,4],[1],[3],[4]]

Output:
  [null,null,null,1,null,-1,null,-1,3,4]

Step 1 - Understand the two requirements at the same time

The cache needs to do two things:

  • find a key quickly
  • reorder quickly when a key is used

That "at the same time" part is the whole problem.

Step 2 - Write the most direct baseline

The natural baseline is:

  • keep [key, value] pairs in an array by usage order
  • use findIndex
  • move to the end when used
  • remove from the front when full

It works.

But findIndex, splice, and shift each scan the array, so both get and put degrade to O(n).

Step 3 - Ask why one structure is not enough

A hash map solves lookup by key.

But by itself it does not know how to remove or move an arbitrary item in usage order.

A doubly linked list is good at order operations:

  • remove a node from the middle
  • append a node at the end
  • remove the oldest node from the front

All in O(1), if you already have the node.
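Those three list operations are easiest to get right with sentinel head and tail nodes, so no pointer is ever null at the ends. A minimal sketch of just the list part (node shape is an assumption, chosen to match the solution later on this page):

```javascript
// Sentinel nodes: the real entries live strictly between head and tail.
const head = { prev: null, next: null }
const tail = { prev: null, next: null }
head.next = tail
tail.prev = head

// Unlink a node from wherever it sits. O(1) because we already hold the node.
function remove(node) {
  node.prev.next = node.next
  node.next.prev = node.prev
}

// Link a node just before the tail sentinel, i.e. at the "most recent" end. O(1).
function append(node) {
  node.prev = tail.prev
  node.next = tail
  tail.prev.next = node
  tail.prev = node
}
```

With this layout, `head.next` is always the oldest node and `tail.prev` the newest, and neither `remove` nor `append` needs a special case for an empty list.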

Step 4 - Combine the two responsibilities

The strong design is:

  • Map<key, node> for lookup
  • a doubly linked list for recency order

The map answers "where is this key?" The list answers "who is least recent and who is most recent?"


Starter code

export function runLRUCache(operations, values) {
  // operations example: ["LRUCache", "put", "get"]
  // values example: [[2], [1, 10], [1]]
  // your solution here
  return []
}

Solution

Final complexity

Time: O(1) per operation

Space: O(capacity)

Solution 1 - Array keeping usage order

Good baseline because it makes the linear bottleneck obvious.

export function runLRUCache(operations, values) {
  let capacity = 0
  const entries = []
  const output = []

  for (let index = 0; index < operations.length; index += 1) {
    const operation = operations[index]
    const args = values[index]

    if (operation === "LRUCache") {
      capacity = args[0]
      output.push(null)
      continue
    }

    if (operation === "get") {
      const key = args[0]
      const foundIndex = entries.findIndex(([entryKey]) => entryKey === key)

      if (foundIndex === -1) {
        output.push(-1)
        continue
      }

      const [entry] = entries.splice(foundIndex, 1)
      entries.push(entry)
      output.push(entry[1])
      continue
    }

    const [key, value] = args
    const foundIndex = entries.findIndex(([entryKey]) => entryKey === key)

    if (foundIndex !== -1) {
      entries.splice(foundIndex, 1)
    } else if (entries.length === capacity) {
      entries.shift()
    }

    entries.push([key, value])
    output.push(null)
  }

  return output
}

It works, but it does not meet the requested O(1).

Solution 2 - Hash map plus doubly linked list

Now each structure handles the part it is good at.

export function runLRUCache(operations, values) {
  let capacity = 0
  const nodes = new Map()
  const output = []

  const head = { key: 0, value: 0, prev: null, next: null }
  const tail = { key: 0, value: 0, prev: null, next: null }
  head.next = tail
  tail.prev = head

  function remove(node) {
    node.prev.next = node.next
    node.next.prev = node.prev
  }

  function append(node) {
    node.prev = tail.prev
    node.next = tail
    tail.prev.next = node
    tail.prev = node
  }

  for (let index = 0; index < operations.length; index += 1) {
    const operation = operations[index]
    const args = values[index]

    if (operation === "LRUCache") {
      capacity = args[0]
      output.push(null)
      continue
    }

    if (operation === "get") {
      const key = args[0]
      const node = nodes.get(key)

      if (!node) {
        output.push(-1)
        continue
      }

      remove(node)
      append(node)
      output.push(node.value)
      continue
    }

    const [key, value] = args

    if (nodes.has(key)) {
      const existing = nodes.get(key)
      existing.value = value
      remove(existing)
      append(existing)
      output.push(null)
      continue
    }

    const node = { key, value, prev: null, next: null }
    nodes.set(key, node)
    append(node)

    if (nodes.size > capacity) {
      const leastRecent = head.next
      remove(leastRecent)
      nodes.delete(leastRecent.key)
    }

    output.push(null)
  }

  return output
}

The important invariants are:

  • head.next is always the least recent
  • tail.prev is always the most recent
  • the map always points to the current node for each key
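In the original class form of the problem, the same invariants carry over unchanged. A sketch of that shape, mirroring the map-plus-list solution above (method and field names other than `LRUCache`, `get`, and `put` are illustrative):

```javascript
class LRUCache {
  constructor(capacity) {
    this.capacity = capacity
    this.nodes = new Map()
    // Sentinels: head.next is always least recent, tail.prev most recent.
    this.head = { key: 0, value: 0, prev: null, next: null }
    this.tail = { key: 0, value: 0, prev: null, next: null }
    this.head.next = this.tail
    this.tail.prev = this.head
  }

  remove(node) {
    node.prev.next = node.next
    node.next.prev = node.prev
  }

  append(node) {
    node.prev = this.tail.prev
    node.next = this.tail
    this.tail.prev.next = node
    this.tail.prev = node
  }

  get(key) {
    const node = this.nodes.get(key)
    if (!node) return -1
    // Touching a key makes it most recent: unlink, then re-append at the end.
    this.remove(node)
    this.append(node)
    return node.value
  }

  put(key, value) {
    if (this.nodes.has(key)) {
      const existing = this.nodes.get(key)
      existing.value = value
      this.remove(existing)
      this.append(existing)
      return
    }
    const node = { key, value, prev: null, next: null }
    this.nodes.set(key, node)
    this.append(node)
    if (this.nodes.size > this.capacity) {
      // Evict the least recent entry from both structures.
      const leastRecent = this.head.next
      this.remove(leastRecent)
      this.nodes.delete(leastRecent.key)
    }
  }
}
```

Replaying the example from the problem statement:

```javascript
const cache = new LRUCache(2)
cache.put(1, 1)
cache.put(2, 2)
cache.get(1)    // 1
cache.put(3, 3) // evicts key 2
cache.get(2)    // -1
```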

What to say in the interview

A strong short explanation would be:

The problem needs O(1) lookup by key and O(1) recency updates. An array cannot really do both. So I split responsibilities: a hash map to find the node for a key, and a doubly linked list to remove a node, append it to the end, and evict the least recent one without scanning.

That shows that you:

  • understand why the baseline fails
  • can justify the structure composition
  • talk in invariants instead of only pointer operations

When this pattern shows up again

This pattern comes back when you need to:

  • combine fast lookup with mutable ordering
  • design caches or queues with recency or priority rules
  • think in structural invariants, not only isolated algorithms

The point is not memorizing LRU Cache.

The point is learning when a problem asks for structure composition.
