You are given 3 positive integers zero, one, and limit.
A binary array arr is called stable if:
A binary array arr is called stable if:
- The number of occurrences of 0 in arr is exactly zero.
- The number of occurrences of 1 in arr is exactly one.
- Each subarray of arr with a size greater than limit must contain both 0 and 1.

Return the total number of stable binary arrays.
Since the answer may be very large, return it modulo 10^9 + 7.
Example 1:
Input: zero = 1, one = 1, limit = 2
Output: 2
Explanation:
The two possible stable binary arrays are [1,0] and [0,1].
Example 2:
Input: zero = 1, one = 2, limit = 1
Output: 1
Explanation:
The only possible stable binary array is [1,0,1].
Example 3:
Input: zero = 3, one = 3, limit = 2
Output: 14
Explanation:
All the possible stable binary arrays are [0,0,1,0,1,1], [0,0,1,1,0,1], [0,1,0,0,1,1], [0,1,0,1,0,1], [0,1,0,1,1,0], [0,1,1,0,0,1], [0,1,1,0,1,0], [1,0,0,1,0,1], [1,0,0,1,1,0], [1,0,1,0,0,1], [1,0,1,0,1,0], [1,0,1,1,0,0], [1,1,0,0,1,0], and [1,1,0,1,0,0].
Constraints:
1 <= zero, one, limit <= 1000

Problem Overview: Count the binary arrays that use exactly zero 0s and one 1s while ensuring that no more than limit identical values appear consecutively. The result must include every valid arrangement that satisfies both the count constraints and the stability rule.
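For small inputs, the stability rule "every subarray longer than limit contains both values" is equivalent to "no run of equal values exceeds limit", so a brute-force enumeration can verify the examples directly. This is an illustrative sketch only (the name count_stable_brute is not part of the problem statement); it is far too slow for the full constraints.

```python
from itertools import combinations

MOD = 10**9 + 7

def count_stable_brute(zero: int, one: int, limit: int) -> int:
    """Enumerate every placement of the 1s among zero + one slots and
    keep arrays whose longest run of equal values is at most limit."""
    n = zero + one
    total = 0
    for ones_pos in combinations(range(n), one):
        arr = [0] * n
        for p in ones_pos:
            arr[p] = 1
        run, longest = 1, 1  # length of current and longest run
        for i in range(1, n):
            run = run + 1 if arr[i] == arr[i - 1] else 1
            longest = max(longest, run)
        if longest <= limit:
            total += 1
    return total % MOD

print(count_stable_brute(1, 1, 2))  # 2
print(count_stable_brute(1, 2, 1))  # 1
print(count_stable_brute(3, 3, 2))  # 14
```

This reproduces the three example outputs and serves as a reference oracle for testing the DP solutions below.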
Approach 1: Hash Map Memoization (Top-Down DP) (Time: O(zero * one * limit), Space: O(zero * one * limit))
Model the problem as dynamic programming with memoization. Each state represents how many zeros and ones remain, what the last placed value was, and how long the current streak is. A recursive function explores placing either a 0 or 1 if doing so does not exceed the limit. Results for states like (z, o, last, streak) are cached in a hash map so repeated subproblems are computed once. This avoids exponential recursion and turns the search into a bounded DP over all valid states. The approach is straightforward to implement and clearly expresses the constraint logic, making it a good first solution when exploring dynamic programming problems with state transitions.
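The streak-tracking search described above can be sketched as follows. This is a minimal Python sketch using a plain dict as the hash map; the function name count_stable_streak and the exact state encoding are illustrative choices, not part of the problem statement.

```python
import sys

MOD = 10**9 + 7

def count_stable_streak(zero: int, one: int, limit: int) -> int:
    """Top-down DP over the state (zeros left, ones left, last value, streak)."""
    # Recursion depth grows with zero + one, so raise the limit defensively.
    sys.setrecursionlimit(max(sys.getrecursionlimit(), zero + one + 1000))
    memo = {}  # hash map: (z, o, last, streak) -> number of completions

    def solve(z: int, o: int, last: int, streak: int) -> int:
        if z == 0 and o == 0:
            return 1  # every value placed: one valid array
        key = (z, o, last, streak)
        if key in memo:
            return memo[key]
        total = 0
        # Place a 0: allowed unless it would extend a 0-streak past `limit`.
        if z > 0 and not (last == 0 and streak == limit):
            total += solve(z - 1, o, 0, streak + 1 if last == 0 else 1)
        # Place a 1: allowed unless it would extend a 1-streak past `limit`.
        if o > 0 and not (last == 1 and streak == limit):
            total += solve(z, o - 1, 1, streak + 1 if last == 1 else 1)
        memo[key] = total % MOD
        return memo[key]

    result = 0
    if zero > 0:
        result += solve(zero - 1, one, 0, 1)  # arrays starting with 0
    if one > 0:
        result += solve(zero, one - 1, 1, 1)  # arrays starting with 1
    return result % MOD
```

Each of the O(zero * one * limit * 2) states is computed once, matching the stated complexity.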
Approach 2: Prefix Sum Optimized DP (Time: O(zero * one), Space: O(zero * one))
A more optimized solution counts sequences without explicitly tracking every streak length. Instead of extending a run one element at a time, you compute how many ways exist to end with a 0 or a 1 for each combination of used zeros and ones, and subtract the configurations whose final run would exceed limit. Because those invalid configurations correspond to a contiguous range of earlier states, a prefix sum (or a single sliding-window term) answers the range query in constant time. This removes the repeated per-streak iteration and converts the naive transition into efficient range updates, a common optimization in prefix-sum-based DP counting problems.
The DP table typically tracks two states: arrays ending in 0 and arrays ending in 1. For each cell, transitions add valid contributions from previous cells while subtracting sequences that would violate the consecutive limit. Prefix sums make these additions and removals constant-time instead of iterating over every possible streak length.
Recommended for interviews: Start with the memoized DP using a hash map to show you understand the state space and constraint handling. Interviewers usually expect the optimized DP with prefix sums afterward because it improves performance from streak-based iteration to aggregated transitions. Demonstrating both approaches shows strong command of DP state design and optimization techniques.
We design a function dfs(i, j, k) to represent the number of stable binary arrays that satisfy the problem conditions when there are i 0s and j 1s left, and the next number to be filled is k. The answer is dfs(zero, one, 0) + dfs(zero, one, 1).
The calculation process of the function dfs(i, j, k) is as follows:
- If i < 0 or j < 0, return 0.
- If i = 0, return 1 when k = 1 and j <= limit; otherwise return 0.
- If j = 0, return 1 when k = 0 and i <= limit; otherwise return 0.
- If k = 0, we consider the case where the previous number is 0, dfs(i - 1, j, 0), and the case where the previous number is 1, dfs(i - 1, j, 1). If the previous number is 0, the run of 0s may grow beyond limit; the over-counted arrangements are exactly those ending in a run of limit + 1 0s, which correspond to dfs(i - limit - 1, j, 1), so we subtract that term.
- If k = 1, the transition is symmetric: dfs(i, j, 1) = dfs(i, j - 1, 0) + dfs(i, j - 1, 1) - dfs(i, j - limit - 1, 0).

We can also convert the memoization search of Solution 1 into dynamic programming.
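The memoized search described above can be sketched in Python as follows. This is a minimal sketch: count_stable_dfs is an illustrative name, and calls with a negative index simply return 0.

```python
import sys
from functools import cache

MOD = 10**9 + 7

def count_stable_dfs(zero: int, one: int, limit: int) -> int:
    # Recursion depth grows with zero + one, so raise the limit defensively.
    sys.setrecursionlimit(max(sys.getrecursionlimit(), zero + one + 1000))

    @cache
    def dfs(i: int, j: int, k: int) -> int:
        """Stable arrays using i 0s and j 1s whose last element is k."""
        if i < 0 or j < 0:
            return 0
        if i == 0:
            return int(k == 1 and j <= limit)  # all-one array of length j
        if j == 0:
            return int(k == 0 and i <= limit)  # all-zero array of length i
        if k == 0:
            # Subtract arrays whose final run of 0s would reach limit + 1.
            return (dfs(i - 1, j, 0) + dfs(i - 1, j, 1)
                    - dfs(i - limit - 1, j, 1)) % MOD
        # Symmetric case for a trailing 1.
        return (dfs(i, j - 1, 0) + dfs(i, j - 1, 1)
                - dfs(i, j - limit - 1, 0)) % MOD

    return (dfs(zero, one, 0) + dfs(zero, one, 1)) % MOD

print(count_stable_dfs(3, 3, 2))  # 14
```

Python's modulo of a negative number is non-negative, so the subtraction needs no extra correction.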
We define f[i][j][k] as the number of stable binary arrays using i 0s and j 1s, and the last number is k. So the answer is f[zero][one][0] + f[zero][one][1].
Initially, we have f[i][0][0] = 1 for 1 <= i <= min(limit, zero), and f[0][j][1] = 1 for 1 <= j <= min(limit, one).
The state transition equation is as follows:
f[i][j][0] = f[i - 1][j][0] + f[i - 1][j][1] - f[i - limit - 1][j][1]
f[i][j][1] = f[i][j - 1][0] + f[i][j - 1][1] - f[i][j - limit - 1][0]

Any term with a negative index is treated as 0. The time complexity is O(zero * one), and the space complexity is O(zero * one).
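The bottom-up version of these transitions can be sketched as follows; a minimal Python sketch in which the name count_stable_dp is illustrative and out-of-range indices contribute 0.

```python
MOD = 10**9 + 7

def count_stable_dp(zero: int, one: int, limit: int) -> int:
    # f[i][j][k]: stable arrays using i 0s and j 1s whose last element is k
    f = [[[0, 0] for _ in range(one + 1)] for _ in range(zero + 1)]
    for i in range(1, min(limit, zero) + 1):
        f[i][0][0] = 1  # all-zero arrays of length at most limit
    for j in range(1, min(limit, one) + 1):
        f[0][j][1] = 1  # all-one arrays of length at most limit
    for i in range(1, zero + 1):
        for j in range(1, one + 1):
            # Subtracted terms remove arrays whose final run of equal
            # values would reach limit + 1; negative indices contribute 0.
            x = f[i - limit - 1][j][1] if i - limit - 1 >= 0 else 0
            f[i][j][0] = (f[i - 1][j][0] + f[i - 1][j][1] - x) % MOD
            y = f[i][j - limit - 1][0] if j - limit - 1 >= 0 else 0
            f[i][j][1] = (f[i][j - 1][0] + f[i][j - 1][1] - y) % MOD
    return (f[zero][one][0] + f[zero][one][1]) % MOD

print(count_stable_dp(3, 3, 2))  # 14
```

Only rows i and i - 1 (plus the row at i - limit - 1) are ever read, so the table could be further compressed if memory matters.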
| Approach | Time | Space | When to Use |
|---|---|---|---|
| Hash Map Memoized DP | O(zero * one * limit) | O(zero * one * limit) | When building the solution step‑by‑step or explaining the full DP state in interviews |
| Prefix Sum Optimized DP | O(zero * one) | O(zero * one) | Preferred optimized solution when counting large numbers of sequences efficiently |