Design a data structure that follows the constraints of a Least Recently Used (LRU) cache.
Implement the LRUCache class:
- `LRUCache(int capacity)` Initialize the LRU cache with positive size capacity.
- `int get(int key)` Return the value of the key if the key exists, otherwise return -1.
- `void put(int key, int value)` Update the value of the key if the key exists. Otherwise, add the key-value pair to the cache. If the number of keys exceeds the capacity from this operation, evict the least recently used key.

The functions get and put must each run in O(1) average time complexity.
Example 1:
Input
["LRUCache", "put", "put", "get", "put", "get", "put", "get", "get", "get"]
[[2], [1, 1], [2, 2], [1], [3, 3], [2], [4, 4], [1], [3], [4]]
Output
[null, null, null, 1, null, -1, null, -1, 3, 4]
Explanation
LRUCache lRUCache = new LRUCache(2);
lRUCache.put(1, 1); // cache is {1=1}
lRUCache.put(2, 2); // cache is {1=1, 2=2}
lRUCache.get(1); // return 1
lRUCache.put(3, 3); // LRU key was 2, evicts key 2, cache is {1=1, 3=3}
lRUCache.get(2); // returns -1 (not found)
lRUCache.put(4, 4); // LRU key was 1, evicts key 1, cache is {4=4, 3=3}
lRUCache.get(1); // return -1 (not found)
lRUCache.get(3); // return 3
lRUCache.get(4); // return 4
Constraints:
- 1 <= capacity <= 3000
- 0 <= key <= 10^4
- 0 <= value <= 10^5
- At most 2 * 10^5 calls will be made to get and put.

The key challenge in #146 LRU Cache is designing a data structure that supports get and put operations in constant time while always removing the least recently used item when capacity is exceeded.
The optimal approach combines a Hash Table with a Doubly Linked List. The hash map provides O(1) access to cache nodes by key, while the doubly linked list maintains the order of usage. Whenever a key is accessed or inserted, its node is moved to the front of the list to mark it as recently used. The least recently used element is kept near the tail, allowing quick eviction when the cache exceeds capacity.
This design ensures that both lookup and update operations remain efficient while maintaining the usage order dynamically. The overall complexity for both get and put operations is O(1), making it suitable for high-performance caching systems.
| Approach | Time Complexity | Space Complexity |
|---|---|---|
| Hash Map + Doubly Linked List | O(1) for get and put | O(n) |
NeetCode
This approach combines a Doubly Linked List with a HashMap to maintain the LRU functionality. The doubly linked list keeps track of the order of elements. The head of the list represents the most recently used item, whereas the tail represents the least recently used item. Through a HashMap, we achieve O(1) access time complexity for updating and retrieving values. On cache misses that result in a new insertion when the cache is full, we evict the last node of the list.
Time Complexity: O(1) for both get and put operations due to the hashmap and linked list. Space Complexity: O(n), where n is the capacity of the cache for the hashmap and linked list.
```python
class Node:
    def __init__(self, key, value):
        self.key = key
        self.value = value
        self.prev = None
        self.next = None


class LRUCache:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.cache = {}  # key -> Node
        # Sentinel head/tail nodes remove edge cases for an empty list.
        self.head = Node(0, 0)
        self.tail = Node(0, 0)
        self.head.next = self.tail
        self.tail.prev = self.head

    def _remove(self, node):
        # Unlink a node from the list in O(1).
        prev, nxt = node.prev, node.next
        prev.next, nxt.prev = nxt, prev

    def _add(self, node):
        # Insert a node right after the head (most recently used position).
        first = self.head.next
        self.head.next = node
        node.prev = self.head
        node.next = first
        first.prev = node

    def get(self, key: int) -> int:
        if key in self.cache:
            node = self.cache[key]
            # Move the accessed node to the front to mark it recently used.
            self._remove(node)
            self._add(node)
            return node.value
        return -1

    def put(self, key: int, value: int) -> None:
        if key in self.cache:
            self._remove(self.cache[key])
        node = Node(key, value)
        self._add(node)
        self.cache[key] = node
        if len(self.cache) > self.capacity:
            # Evict the least recently used node, which sits just before the tail.
            lru = self.tail.prev
            self._remove(lru)
            del self.cache[lru.key]
```

The Python solution defines a Node class to handle key-value pairs and handles the LRU logic in the LRUCache class. The nodes are stored in a doubly linked list, allowing quick updates and removals. The cache (dictionary) maps keys to their nodes, providing O(1) time complexity for get and put operations.
This approach leverages built-in ordered dictionary data structures available in some languages. These structures inherently maintain the order of insertion, allowing for easy eviction of the least recently used item.
Time Complexity: O(1) for get and put, since ordered-dictionary lookups, reordering, and eviction of the oldest entry each run in constant time. Space Complexity: O(n), with n being the capacity of the cache.
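In Python, for instance, `collections.OrderedDict` provides `move_to_end` and `popitem(last=False)`, which map directly onto the LRU operations. A minimal sketch of this approach:

```python
from collections import OrderedDict

class LRUCache:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.cache = OrderedDict()  # keys ordered from least to most recently used

    def get(self, key: int) -> int:
        if key not in self.cache:
            return -1
        self.cache.move_to_end(key)  # mark as most recently used
        return self.cache[key]

    def put(self, key: int, value: int) -> None:
        if key in self.cache:
            self.cache.move_to_end(key)
        self.cache[key] = value
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)  # evict the least recently used key

# Replaying Example 1:
cache = LRUCache(2)
cache.put(1, 1)
cache.put(2, 2)
print(cache.get(1))  # 1
cache.put(3, 3)      # evicts key 2
print(cache.get(2))  # -1
```

This trades the hand-rolled linked list for a built-in structure; the asymptotic complexity is unchanged.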
Yes, LRU Cache is a very common design question in FAANG and other top tech company interviews. It tests understanding of data structures, system design thinking, and the ability to combine multiple structures for optimal performance.
The optimal approach combines a hash map with a doubly linked list. The hash map allows O(1) access to nodes by key, while the doubly linked list maintains the order of recently used elements. This design ensures both get and put operations run in constant time.
A doubly linked list allows efficient insertion, deletion, and movement of nodes in O(1) time. This is important for quickly updating the position of recently accessed items and removing the least recently used node from the tail.
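To illustrate why the doubly linked list matters (using a hypothetical minimal `Node` class with `prev`/`next` pointers), removing a node you already hold a reference to touches only its two neighbors, with no traversal:

```python
class Node:
    def __init__(self, value):
        self.value = value
        self.prev = None
        self.next = None

def unlink(node):
    # O(1): only the two neighboring pointers are rewired.
    node.prev.next = node.next
    node.next.prev = node.prev

# Build a tiny list a <-> b <-> c, then remove b.
a, b, c = Node("a"), Node("b"), Node("c")
a.next, b.prev = b, a
b.next, c.prev = c, b
unlink(b)
print(a.next.value)  # "c"
print(c.prev.value)  # "a"
```

In a singly linked list, the same removal would require an O(n) walk to find the predecessor, which is exactly what the hash map + doubly linked list design avoids.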
A combination of a hash table and a doubly linked list is considered the best data structure for LRU Cache. The hash table provides fast lookups, while the doubly linked list efficiently updates the order of recently used items.
In JavaScript, the same idea can be implemented with the built-in Map object. Map preserves the insertion order of its entries, enabling the quick reordering and eviction an LRU cache needs: each get deletes and reinserts the accessed key to move it to the most recently used position.