LRU Cache

Hard

Solution Guide: LRU Cache

Approach

Combine a doubly linked list (for O(1) insert/remove/reorder) with a plain object hash map (for O(1) key lookup). The list maintains access order: most recently used at the front, least recently used at the back. Sentinel head/tail nodes eliminate null-check edge cases.
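The approach above can be sketched as follows. This is a minimal illustration, not the page's official solution: node fields, helper names (`_remove`, `_addToFront`), and the use of a plain object as the hash map are assumptions consistent with the description.

```javascript
class Node {
  constructor(key, value) {
    this.key = key;     // stored so eviction can delete the map entry
    this.value = value;
    this.prev = null;
    this.next = null;
  }
}

class LRUCache {
  constructor(capacity) {
    this.capacity = capacity;
    this.map = {};       // key -> Node, O(1) lookup
    this.size = 0;
    // Sentinel head/tail nodes: splices never touch null
    this.head = new Node(0, 0); // front = most recently used
    this.tail = new Node(0, 0); // back  = least recently used
    this.head.next = this.tail;
    this.tail.prev = this.head;
  }

  _remove(node) {        // unlink a node from the list, O(1)
    node.prev.next = node.next;
    node.next.prev = node.prev;
  }

  _addToFront(node) {    // insert right after the head sentinel, O(1)
    node.next = this.head.next;
    node.prev = this.head;
    this.head.next.prev = node;
    this.head.next = node;
  }

  get(key) {
    const node = this.map[key];
    if (!node) return -1;
    this._remove(node);  // every access moves the node to the front
    this._addToFront(node);
    return node.value;
  }

  put(key, value) {
    const existing = this.map[key];
    if (existing) {      // update in place, then mark as most recent
      existing.value = value;
      this._remove(existing);
      this._addToFront(existing);
      return;
    }
    if (this.size === this.capacity) {
      const lru = this.tail.prev; // eviction always removes from the back
      this._remove(lru);
      delete this.map[lru.key];
      this.size--;
    }
    const node = new Node(key, value);
    this.map[key] = node;
    this._addToFront(node);
    this.size++;
  }
}
```

A short trace of the eviction behavior: with capacity 2, after `put(1, 1)`, `put(2, 2)`, and `get(1)`, key 1 is at the front; `put(3, 3)` then evicts key 2, so `get(2)` returns -1.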

Complexity Analysis

Time: O(1) per get/put
Space: O(capacity)

Pattern

Linked List (Doubly LL + Hash Map)

Why It Works

The hash map gives O(1) access to any node by key. The doubly linked list gives O(1) removal and insertion at any position. Together they maintain LRU order: every access moves the node to the front, and eviction always removes from the back.

Updated Feb 2026