
This approach combines a doubly linked list with a HashMap to maintain the LRU functionality. The doubly linked list keeps track of the order of elements: the node just after the head represents the most recently used item, while the node just before the tail represents the least recently used item. The HashMap gives O(1) access to any node, so both updating and retrieving values are constant time. When a put would push the cache past its capacity, we evict the node just before the tail and remove its key from the map.
Time Complexity: O(1) for both get and put operations, thanks to constant-time HashMap lookups and list splices. Space Complexity: O(n), where n is the capacity of the cache, covering both the HashMap entries and the linked list nodes.
import java.util.*;

class LRUCache {
    // Doubly linked list node; prev/next pointers allow O(1) unlinking.
    class Node {
        int key, value;
        Node prev, next;
        Node(int k, int v) {
            key = k;
            value = v;
        }
    }

    private final int capacity;
    private Map<Integer, Node> cache;
    private Node head, tail; // dummy sentinels: head.next is MRU, tail.prev is LRU

    public LRUCache(int capacity) {
        this.capacity = capacity;
        cache = new HashMap<>();
        head = new Node(0, 0);
        tail = new Node(0, 0);
        head.next = tail;
        tail.prev = head;
    }

    // Unlink a node from the list; the sentinels guarantee prev/next are never null.
    private void remove(Node node) {
        Node prev = node.prev;
        Node next = node.next;
        prev.next = next;
        next.prev = prev;
    }

    // Splice a node in right after the head sentinel (most recently used position).
    private void addToHead(Node node) {
        node.next = head.next;
        node.prev = head;
        head.next.prev = node;
        head.next = node;
    }

    public int get(int key) {
        if (cache.containsKey(key)) {
            Node node = cache.get(key);
            remove(node);
            addToHead(node); // a hit makes this entry the most recently used
            return node.value;
        }
        return -1;
    }

    public void put(int key, int value) {
        if (cache.containsKey(key)) {
            remove(cache.get(key));
            cache.remove(key); // drop the stale map entry so the size check below stays accurate
        }
        if (cache.size() == capacity) {
            cache.remove(tail.prev.key); // evict the least recently used entry
            remove(tail.prev);
        }
        Node node = new Node(key, value);
        addToHead(node);
        cache.put(key, node);
    }
}

The Java solution uses a nested Node class to encapsulate each cache entry. The LRUCache class combines a HashMap with a doubly linked list (bracketed by dummy head and tail sentinels) so that every operation runs in constant time.
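The sentinel-node splicing that the solution relies on can be exercised in isolation. The sketch below (class and method names are illustrative, not from the solution above) mirrors the remove/addToHead helpers and shows how touching a node moves it to the front while the node before the tail stays the eviction candidate:

```java
import java.util.*;

// Minimal standalone sketch of the sentinel doubly linked list (illustrative names).
class DllSketch {
    static class Node {
        int key;
        Node prev, next;
        Node(int k) { key = k; }
    }

    static Node head = new Node(0), tail = new Node(0);
    static { head.next = tail; tail.prev = head; }

    // Unlink a node; the sentinels guarantee prev/next are never null.
    static void remove(Node node) {
        node.prev.next = node.next;
        node.next.prev = node.prev;
    }

    // Splice a node in right after the head sentinel (most recently used).
    static void addToHead(Node node) {
        node.next = head.next;
        node.prev = head;
        head.next.prev = node;
        head.next = node;
    }

    static List<Integer> keysMruToLru() {
        List<Integer> keys = new ArrayList<>();
        for (Node n = head.next; n != tail; n = n.next) keys.add(n.key);
        return keys;
    }

    public static void main(String[] args) {
        Node a = new Node(1), b = new Node(2), c = new Node(3);
        addToHead(a); addToHead(b); addToHead(c); // order: 3, 2, 1
        remove(a); addToHead(a);                  // touching key 1 moves it to the front
        System.out.println(keysMruToLru());       // [1, 3, 2]
        System.out.println(tail.prev.key);        // 2 is now the LRU eviction candidate
    }
}
```

Because every splice only rewires four pointers next to known neighbors, no traversal is ever needed, which is exactly why the full cache achieves O(1) updates.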
This approach leverages built-in ordered dictionary data structures available in some languages. These structures inherently maintain the order of insertion, allowing for easy eviction of the least recently used item.
Time Complexity: O(1) for get and put, since OrderedDict's move_to_end and popitem operations both run in constant time. Space Complexity: O(n), with n being the capacity of the cache.
from collections import OrderedDict

class LRUCache:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.cache = OrderedDict()

    def get(self, key: int) -> int:
        if key not in self.cache:
            return -1
        # A hit moves the key to the end, marking it most recently used.
        self.cache.move_to_end(key)
        return self.cache[key]

    def put(self, key: int, value: int) -> None:
        if key in self.cache:
            self.cache.move_to_end(key)
        self.cache[key] = value
        if len(self.cache) > self.capacity:
            # The front of the OrderedDict holds the least recently used key.
            self.cache.popitem(last=False)
This Python solution uses the OrderedDict from the collections module. Its built-in order tracking means move_to_end can mark an entry as most recently used and popitem(last=False) can evict the least recently used one, which removes the need for any hand-rolled linked-list bookkeeping.
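The two OrderedDict primitives doing the heavy lifting here, move_to_end and popitem(last=False), can be seen in a standalone sketch (the keys and values below are arbitrary illustrative data):

```python
from collections import OrderedDict

# Standalone look at the two OrderedDict operations an LRU cache needs.
cache = OrderedDict()
cache["a"] = 1
cache["b"] = 2
cache["c"] = 3          # insertion order: a, b, c  ("a" is least recent)

cache.move_to_end("a")  # "touch" a: order becomes b, c, a
assert list(cache) == ["b", "c", "a"]

evicted = cache.popitem(last=False)  # pop from the front: the LRU entry
assert evicted == ("b", 2)
assert list(cache) == ["c", "a"]
```

Since both operations are O(1), the OrderedDict-based cache matches the hand-rolled linked-list version in asymptotic cost with far less code.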