Design a data structure that follows the constraints of a Least Recently Used (LRU) cache.
Implement the LRUCache class:
LRUCache(int capacity) Initializes the LRU cache with positive size capacity.
int get(int key) Returns the value of the key if the key exists, otherwise returns -1.
void put(int key, int value) Updates the value of the key if the key exists; otherwise, adds the key-value pair to the cache. If the number of keys exceeds the capacity from this operation, evicts the least recently used key.
The functions get and put must each run in O(1) average time complexity.
Example 1:
Input
["LRUCache", "put", "put", "get", "put", "get", "put", "get", "get", "get"]
[[2], [1, 1], [2, 2], [1], [3, 3], [2], [4, 4], [1], [3], [4]]
Output
[null, null, null, 1, null, -1, null, -1, 3, 4]
Explanation
LRUCache lRUCache = new LRUCache(2);
lRUCache.put(1, 1); // cache is {1=1}
lRUCache.put(2, 2); // cache is {1=1, 2=2}
lRUCache.get(1); // return 1
lRUCache.put(3, 3); // LRU key was 2, evicts key 2, cache is {1=1, 3=3}
lRUCache.get(2); // returns -1 (not found)
lRUCache.put(4, 4); // LRU key was 1, evicts key 1, cache is {4=4, 3=3}
lRUCache.get(1); // return -1 (not found)
lRUCache.get(3); // return 3
lRUCache.get(4); // return 4
Constraints:
1 <= capacity <= 3000
0 <= key <= 10^4
0 <= value <= 10^5
At most 2 * 10^5 calls will be made to get and put.
Problem Overview: Design a data structure that behaves like a Least Recently Used (LRU) cache. The cache has a fixed capacity and supports two operations: get(key) and put(key, value). When the cache exceeds capacity, it must evict the least recently used item. Both operations must run in O(1) average time.
Approach 1: Doubly Linked List with HashMap (O(1) time, O(n) space)
This is the standard interview solution. Combine a hash table with a doubly linked list. The hash map stores key → node so you can locate nodes instantly. The doubly linked list maintains usage order: the most recently used node sits near the head, and the least recently used node sits near the tail. On every get, move the accessed node to the front. On put, either update the existing node or insert a new node at the front. If capacity is exceeded, remove the node at the tail. Each operation performs constant-time pointer updates and hash lookups, giving O(1) time per operation and O(n) space for storing up to capacity nodes.
Approach 2: Ordered Dictionary / Built-in Ordered Map (O(1) time, O(n) space)
Some languages provide a library structure that already maintains insertion order. Python’s OrderedDict or modern dictionary methods, and JavaScript’s Map, allow you to reorder keys efficiently. The idea is simple: when a key is accessed, remove it and reinsert it so it becomes the most recent entry. When inserting a new key that exceeds capacity, remove the first element (the least recently used). Each lookup and update still runs in O(1) average time due to the underlying hash table. Space usage remains O(n) because the structure stores up to capacity key-value pairs.
Recommended for interviews: Interviewers typically expect the Doubly Linked List + HashMap design. It demonstrates strong understanding of system design style problems, pointer manipulation, and constant-time eviction strategies. The library-based solution works in production code but hides the core idea. Showing the manual design proves you understand how LRU eviction actually works under the hood.
This approach combines a Doubly Linked List with a HashMap to maintain the LRU functionality. The doubly linked list keeps track of the order of elements. The head of the list represents the most recently used item, whereas the tail represents the least recently used item. Through a HashMap, we achieve O(1) access time complexity for updating and retrieving values. On cache misses that result in a new insertion when the cache is full, we evict the last node of the list.
The Python solution defines a Node class to hold key-value pairs and implements the LRU logic in the LRUCache class. The nodes live in a doubly linked list, allowing constant-time insertion and removal, while a dictionary maps keys to their nodes, giving O(1) lookups for get and put.
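The original code listing did not survive extraction; the following is a sketch of the design described above. The Node helper class and the _remove/_insert_front method names are illustrative choices, not from the original solution. Sentinel head and tail nodes avoid empty-list edge cases.

```python
class Node:
    """A doubly-linked-list node holding one key-value pair."""
    def __init__(self, key, value):
        self.key = key
        self.value = value
        self.prev = None
        self.next = None

class LRUCache:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.cache = {}  # key -> Node
        # Sentinel nodes: head.next is the most recently used entry,
        # tail.prev is the least recently used entry.
        self.head = Node(0, 0)
        self.tail = Node(0, 0)
        self.head.next = self.tail
        self.tail.prev = self.head

    def _remove(self, node):
        """Unlink a node from the list in O(1)."""
        node.prev.next = node.next
        node.next.prev = node.prev

    def _insert_front(self, node):
        """Link a node right after the head (most recently used position)."""
        node.next = self.head.next
        node.prev = self.head
        self.head.next.prev = node
        self.head.next = node

    def get(self, key: int) -> int:
        if key not in self.cache:
            return -1
        node = self.cache[key]
        self._remove(node)          # move the accessed node
        self._insert_front(node)    # to the front of the list
        return node.value

    def put(self, key: int, value: int) -> None:
        if key in self.cache:
            self._remove(self.cache[key])
        node = Node(key, value)
        self.cache[key] = node
        self._insert_front(node)
        if len(self.cache) > self.capacity:
            lru = self.tail.prev    # least recently used node
            self._remove(lru)
            del self.cache[lru.key]
```

Running the sequence from Example 1 against this class reproduces the expected output: get(1) returns 1, then keys 2 and 1 are evicted in turn as 3 and 4 are inserted.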
Time Complexity: O(1) for both get and put, thanks to the hash map lookups and constant-time linked-list splices. Space Complexity: O(n), where n is the capacity, for the hash map and the linked-list nodes.
This approach leverages built-in ordered dictionary data structures available in some languages. These structures inherently maintain the order of insertion, allowing for easy eviction of the least recently used item.
This Python solution uses the OrderedDict from the collections module. The inherent order tracking of OrderedDict facilitates moving accessed elements to the front or end of the dictionary easily, simplifying the logic for keeping track of least recently used items.
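The OrderedDict listing was also lost in extraction; a minimal sketch of the design described above follows. move_to_end and popitem are real OrderedDict methods; move_to_end(key) marks an entry as most recently used, and popitem(last=False) pops the oldest entry.

```python
from collections import OrderedDict

class LRUCache:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.cache = OrderedDict()  # oldest (least recently used) entry first

    def get(self, key: int) -> int:
        if key not in self.cache:
            return -1
        self.cache.move_to_end(key)  # mark as most recently used
        return self.cache[key]

    def put(self, key: int, value: int) -> None:
        if key in self.cache:
            self.cache.move_to_end(key)
        self.cache[key] = value
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)  # evict the least recently used
```

The same Example 1 sequence works unchanged against this class, since it exposes the identical get/put interface.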
Time Complexity: O(1) average for get and put; move_to_end and popitem are constant-time OrderedDict operations. Space Complexity: O(n), with n being the capacity of the cache.
| Approach | Time | Space | When to Use |
|---|---|---|---|
| Doubly Linked List + HashMap | O(1) per get/put | O(n) | Standard interview solution when constant-time eviction is required |
| Ordered Dictionary / Map | O(1) average | O(n) | When language libraries provide ordered maps and implementation simplicity matters |