Given a function fn, return a memoized version of that function.
A memoized function is a function that will never be called twice with the same inputs. Instead it will return a cached value.
fn can be any function and there are no constraints on what type of values it accepts. Inputs are considered identical if they are === to each other.
Example 1:
Input:
getInputs = () => [[2,2],[2,2],[1,2]]
fn = function (a, b) { return a + b; }
Output: [{"val":4,"calls":1},{"val":4,"calls":1},{"val":3,"calls":2}]
Explanation:
const inputs = getInputs();
const memoized = memoize(fn);
for (const arr of inputs) {
memoized(...arr);
}
For the inputs of (2, 2): 2 + 2 = 4, and it required a call to fn().
For the inputs of (2, 2): 2 + 2 = 4, but those inputs were seen before so no call to fn() was required.
For the inputs of (1, 2): 1 + 2 = 3, and it required another call to fn() for a total of 2.
Example 2:
Input:
getInputs = () => [[{},{}],[{},{}],[{},{}]]
fn = function (a, b) { return ({...a, ...b}); }
Output: [{"val":{},"calls":1},{"val":{},"calls":2},{"val":{},"calls":3}]
Explanation:
Merging two empty objects will always result in an empty object. It may seem like there should only be 1 call to fn() because of cache-hits, however none of those objects are === to each other.
Example 3:
Input:
getInputs = () => { const o = {}; return [[o,o],[o,o],[o,o]]; }
fn = function (a, b) { return ({...a, ...b}); }
Output: [{"val":{},"calls":1},{"val":{},"calls":1},{"val":{},"calls":1}]
Explanation:
Merging two empty objects will always result in an empty object. The 2nd and 3rd function calls result in a cache-hit. This is because every object passed in is identical.
Constraints:
1 <= inputs.length <= 10^5
0 <= inputs.flat().length <= 10^5
inputs[i][j] != NaN

In #2630 Memoize II, the goal is to implement a memoization utility that caches results of function calls based on their input arguments. The main challenge is handling different argument types, including primitives and objects, while ensuring correct identity-based (===) comparisons.
A robust approach is to build a multi-level cache structure using Map and WeakMap. Each function argument corresponds to a level in a cache tree. Primitive values can be stored using Map, while objects and functions should use WeakMap to avoid memory leaks. As arguments are processed sequentially, you traverse or create nodes in this cache structure. Once all arguments are consumed, the final node stores the computed result.
This design effectively creates a trie-like structure keyed by argument references. It ensures that repeated calls with the same argument identities return cached results without recomputation.
Time Complexity: roughly proportional to the number of arguments per call. Space Complexity: depends on the number of unique argument combinations stored in the cache.
| Approach | Time Complexity | Space Complexity |
|---|---|---|
| Map/WeakMap Trie-based Memoization | O(k) per function call (k = number of arguments) | O(n * k) for n cached calls |
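As one possible sketch of the trie-based approach above (the structure and names here are illustrative, not a canonical implementation): each cache node holds a Map for primitive keys and a WeakMap for object/function keys, and the arguments are walked level by level.

```javascript
// Sketch of identity-based memoization using a trie of caches.
// Each node holds a Map for primitive keys and a WeakMap for
// object/function keys, so cached object keys stay collectable.
function memoize(fn) {
  const makeNode = () => ({
    primitives: new Map(),
    objects: new WeakMap(),
    hasResult: false,
    result: undefined,
  });
  const root = makeNode();
  const isRef = (v) =>
    (typeof v === "object" && v !== null) || typeof v === "function";

  return function (...args) {
    let node = root;
    for (const arg of args) {
      const level = isRef(arg) ? node.objects : node.primitives;
      if (!level.has(arg)) level.set(arg, makeNode());
      node = level.get(arg);
    }
    if (!node.hasResult) {
      node.result = fn(...args); // computed once per unique argument identity
      node.hasResult = true;
    }
    return node.result;
  };
}
```

With this sketch, the calls in Example 1 trigger fn twice, and the calls in Example 3 (the same object reference repeated) trigger it once, because cache lookup follows reference identity rather than structural equality.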
Use these hints if you're stuck. Try solving on your own first.
Just because JSON.stringify(obj1) === JSON.stringify(obj2) doesn't necessarily mean obj1 === obj2.
You could iterate over all previously passed inputs to check if there has been a match. However, that will be very slow.
JavaScript Maps are a good way to associate arbitrary data with keys.
Make a tree structure of Maps. The depth of the tree should match the number of input parameters.
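The JSON.stringify pitfall from the first hint can be demonstrated directly:

```javascript
// Two structurally identical objects serialize to the same string,
// yet they are distinct references and therefore not ===.
const obj1 = { a: 1 };
const obj2 = { a: 1 };
console.log(JSON.stringify(obj1) === JSON.stringify(obj2)); // true
console.log(obj1 === obj2); // false
```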
This approach utilizes a hash (or map) to store computed results of the function, where the keys are generated from the input parameters. By serializing the inputs into a unique key (e.g., using JSON), we can easily check whether a computation for the same inputs has already been done and retrieve the result from the cache if it has.
The time complexity of this solution is O(1) on average for each call due to hash map lookups. The space complexity is O(n) where n is the number of unique input sets to the function.
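In JavaScript, this serialization-keyed variant might look like the sketch below (the name memoizeByKey is illustrative, and the inputs are assumed to be JSON-serializable). Note that, unlike the reference-based trie, it conflates distinct objects that serialize to the same string, so it does not satisfy this problem's === identity requirement (see Example 2).

```javascript
// Memoization keyed by the JSON serialization of the argument list.
function memoizeByKey(fn) {
  const cache = new Map();
  return function (...args) {
    const key = JSON.stringify(args);
    if (!cache.has(key)) {
      cache.set(key, fn(...args));
    }
    return cache.get(key);
  };
}
```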
```python
def memoize(fn):
    cache = {}
    def memoized(*args):
        # *args is already a tuple, so it can be used directly as a dict key
        key = args
        if key not in cache:
            cache[key] = fn(*args)
        return cache[key]
    return memoized
```
This Python solution uses a tuple of the arguments to uniquely identify each call to the function. The tuple is used as a key in a dictionary (hash map) to store results. If the inputs are repeated, the stored result is returned instead of calling the function again. Note that this requires every argument to be hashable.
This approach is similar to hash-based memoization but emphasizes generating a custom key for complex input types that might not be directly hashable. This is particularly useful for composite objects or nested data structures where default serialization (as in JSON) is inadequate or inefficient.
The time complexity is O(1) on average for cache operations, assuming good hashCode implementations for objects. Space complexity is O(n) depending on unique input sets.
```java
import java.util.*;
```
WeakMap is used for object or function arguments so that cached keys do not prevent garbage collection. This helps avoid memory leaks when memoizing functions that receive object references.
Memoization and caching utilities are common interview topics at FAANG-level companies. While this exact problem may not always appear, the concepts of Map-based caching, reference equality, and performance optimization are frequently tested.
The optimal approach is to create a multi-level caching structure using Map and WeakMap. Each argument represents a level in the cache tree, allowing the function to store results for unique argument combinations efficiently.
A combination of Map and WeakMap works best. Map handles primitive keys efficiently, while WeakMap safely stores object references without retaining them unnecessarily in memory.
The Java example uses a List to capture arguments, which makes it feasible to handle object arrays. This List is used as a key in a HashMap to store the function results, providing efficient lookup.