Given a function fn, return a memoized version of that function.
A memoized function is a wrapper that never invokes the underlying function twice with the same inputs. On a repeated call, it returns the cached value instead.
You can assume there are 3 possible input functions: sum, fib, and factorial.
sum accepts two integers a and b and returns a + b. Assume that if a value has already been cached for the arguments (b, a) where a != b, it cannot be used for the arguments (a, b). For example, if the arguments are (3, 2) and (2, 3), two separate calls should be made.

fib accepts a single integer n and returns 1 if n <= 1 or fib(n - 1) + fib(n - 2) otherwise.

factorial accepts a single integer n and returns 1 if n <= 1 or factorial(n - 1) * n otherwise.
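The three candidate functions can be written directly from the definitions above. This is a minimal Python sketch; `sum_fn` is named to avoid shadowing Python's built-in `sum`:

```python
# The three possible input functions, as specified in the problem.
def sum_fn(a, b):
    # Returns a + b; note (3, 2) and (2, 3) are distinct argument pairs.
    return a + b

def fib(n):
    # Returns 1 if n <= 1, otherwise fib(n - 1) + fib(n - 2).
    return 1 if n <= 1 else fib(n - 1) + fib(n - 2)

def factorial(n):
    # Returns 1 if n <= 1, otherwise factorial(n - 1) * n.
    return 1 if n <= 1 else factorial(n - 1) * n
```

With these definitions, fib(5) = 8 and factorial(3) = 6, matching the examples below.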
Example 1:
Input:
fnName = "sum"
actions = ["call","call","getCallCount","call","getCallCount"]
values = [[2,2],[2,2],[],[1,2],[]]

Output: [4,4,1,3,2]

Explanation:
const sum = (a, b) => a + b;
const memoizedSum = memoize(sum);
memoizedSum(2, 2); // "call" - returns 4. sum() was called as (2, 2) was not seen before.
memoizedSum(2, 2); // "call" - returns 4. However, sum() was not called because the same inputs were seen before.
// "getCallCount" - total call count: 1
memoizedSum(1, 2); // "call" - returns 3. sum() was called as (1, 2) was not seen before.
// "getCallCount" - total call count: 2
Example 2:
Input:
fnName = "factorial"
actions = ["call","call","call","getCallCount","call","getCallCount"]
values = [[2],[3],[2],[],[3],[]]

Output: [2,6,2,2,6,2]

Explanation:
const factorial = (n) => (n <= 1) ? 1 : (n * factorial(n - 1));
const memoFactorial = memoize(factorial);
memoFactorial(2); // "call" - returns 2.
memoFactorial(3); // "call" - returns 6.
memoFactorial(2); // "call" - returns 2. However, factorial was not called because 2 was seen before.
// "getCallCount" - total call count: 2
memoFactorial(3); // "call" - returns 6. However, factorial was not called because 3 was seen before.
// "getCallCount" - total call count: 2
Example 3:
Input:
fnName = "fib"
actions = ["call","getCallCount"]
values = [[5],[]]

Output: [8,1]

Explanation:
fib(5) = 8 // "call"
// "getCallCount" - total call count: 1
Constraints:
0 <= a, b <= 10^5
1 <= n <= 10
1 <= actions.length <= 10^5
actions.length === values.length
actions[i] is one of "call" and "getCallCount"
fnName is one of "sum", "factorial" and "fib"

Problem Overview: Design a memoize function that wraps another function and caches results for previously seen inputs. When the memoized function is called again with the same arguments, it should return the cached result instead of recomputing it.
Approach 1: Using a Simple Map for Caching (Average O(1) time, O(n) space)
This approach stores previously computed results inside a hash map (or Map/HashMap). Each function call generates a key representing the arguments, typically by converting the argument list into a string. When the memoized function runs, you first perform a hash lookup to check whether the key already exists. If it does, return the cached value immediately; otherwise compute the result, store it in the map, and return it. This approach works well in JavaScript, Python, and Java where dictionary-like structures provide average O(1) lookup and insertion. The total space grows to O(n) where n is the number of unique input combinations.
Approach 2: Storing Inputs as Hashable Tuples (Average O(1) time, O(n) space)
Instead of serializing arguments into a string, store them directly as a hashable tuple. Languages like Python and C# allow tuples to be used as dictionary keys because they implement hashing based on their contents. Each function call creates a tuple of the arguments and performs a dictionary lookup. If the tuple already exists, return the stored value; otherwise evaluate the original function and cache the result. This avoids the overhead and potential collisions of string serialization while keeping constant-time lookups. The dictionary still grows proportionally with unique argument combinations, so space complexity remains O(n).
Recommended for interviews: The expected solution uses a hash-based cache with constant-time lookup. Interviewers want to see that you recognize memoization as a caching technique built on top of a hash map. A straightforward map-based implementation demonstrates understanding of function wrappers and argument handling. The tuple-key approach is cleaner in languages that support hashable composite keys and often appears in discussions of dynamic programming and caching optimizations.
In this approach, we can utilize a dictionary (or map) to cache the results of the function calls. The keys in this dictionary will be the string representation of the input arguments, and the values will be the corresponding function output. We will also maintain a count of how many unique calls have been made with new inputs.
The Memoize class is defined to wrap the target function. It maintains a cache dictionary and a call count integer. If the input arguments are already present in the cache, the cached result is returned. Otherwise, the function is called, the result is cached, and the call count is incremented.
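A minimal Python sketch of the map-based cache described above. The class and method names (`Memoize`, `call`, `get_call_count`) are illustrative, not a required API; the key point is the string key built from the arguments:

```python
class Memoize:
    """Wraps fn, caching results by a string key derived from the arguments."""

    def __init__(self, fn):
        self.fn = fn
        self.cache = {}       # string key -> cached result
        self.call_count = 0   # how many times fn was actually invoked

    def call(self, *args):
        # Join arguments with a separator so (1, 2) and (12,) cannot collide,
        # and so (3, 2) and (2, 3) remain distinct keys.
        key = ",".join(map(str, args))
        if key not in self.cache:
            self.cache[key] = self.fn(*args)
            self.call_count += 1
        return self.cache[key]

    def get_call_count(self):
        return self.call_count


# Usage mirroring Example 1:
memoized_sum = Memoize(lambda a, b: a + b)
memoized_sum.call(2, 2)        # returns 4; the underlying function runs
memoized_sum.call(2, 2)        # returns 4; served from the cache
print(memoized_sum.get_call_count())  # prints 1
```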
The time complexity for each call is O(1) due to dictionary lookups. The space complexity is O(n) in the worst case, where n is the number of unique calls made, due to storage in the cache.
Preserving the order and distinction of each argument set is crucial and can be achieved using tuples for arguments. With languages like Python, tuples can directly represent immutable and hashable keys for our cache. Similar adaptations exist for other languages, respecting each language's constraints around tuple-like structures and hashable types.
This approach defines the key for cache lookups as a tuple of the input arguments, leveraging Python's native tuple hashability to allow efficient cache operations.
Each cache lookup takes O(1) time on average; space complexity is O(n), where n is the number of distinct calls stored.
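A Python sketch of the tuple-key variant. Because `*args` already arrives as a tuple, it can be used directly as a dictionary key with no serialization step; the class name `MemoizeTuple` is illustrative:

```python
class MemoizeTuple:
    """Caches results keyed by the argument tuple itself (hashable in Python)."""

    def __init__(self, fn):
        self.fn = fn
        self.cache = {}       # argument tuple -> cached result
        self.call_count = 0   # how many times fn was actually invoked

    def call(self, *args):
        # args is already a tuple, so it works directly as a dict key;
        # (3, 2) and (2, 3) hash to different entries, as required.
        if args not in self.cache:
            self.cache[args] = self.fn(*args)
            self.call_count += 1
        return self.cache[args]

    def get_call_count(self):
        return self.call_count
```

Compared with string keys, this avoids both the serialization cost and any risk of two different argument lists producing the same string.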
| Approach | Complexity |
|---|---|
| Approach 1: Using a Simple Map for Caching | The time complexity for each call is O(1) due to dictionary lookups. The space complexity is O(n) in the worst case, where n is the number of unique calls made, due to storage in the cache. |
| Approach 2: Storing Inputs as Hashable Tuples | O(1) lookup time in the cache; Space complexity is O(n), dependent on the number of distinct calls stored. |
| Approach | Time | Space | When to Use |
|---|---|---|---|
| Simple Map Caching | O(1) average per call | O(n) | General-purpose memoization when arguments can be serialized into a unique key |
| Hashable Tuple Keys | O(1) average per call | O(n) | Languages with native tuple hashing like Python or C#, avoids string serialization overhead |
Memoize - Leetcode 2623 - JavaScript 30-Day Challenge • NeetCodeIO • 18,582 views