You are given a 0-indexed integer array nums.
You can perform any number of operations, where each operation involves selecting a subarray of the array and replacing it with the sum of its elements. For example, if the given array is [1,3,5,6] and you select subarray [3,5] the array will convert to [1,8,6].
Return the maximum length of a non-decreasing array that can be made after applying operations.
A subarray is a contiguous non-empty sequence of elements within an array.
Example 1:
Input: nums = [5,2,2]
Output: 1
Explanation: This array of length 3 is not non-decreasing. There are two ways to reduce it to length 2: choosing subarray [2,2] gives [5,4], and choosing subarray [5,2] gives [7,2]. Neither result is non-decreasing. Choosing the whole subarray [5,2,2] and replacing it with [9] gives a non-decreasing array. So the answer is 1.
Example 2:
Input: nums = [1,2,3,4]
Output: 4
Explanation: The array is already non-decreasing, so the answer is 4.
Example 3:
Input: nums = [4,3,2,6]
Output: 3
Explanation: Replacing [3,2] with [5] converts the array to [4,5,6], which is non-decreasing. Since the given array itself is not non-decreasing, the maximum possible answer is 3.
Constraints:
1 <= nums.length <= 10^5
1 <= nums[i] <= 10^5

Problem Overview: You are given an integer array and may repeatedly replace any subarray with the sum of its elements; equivalently, you partition the array into contiguous segments and replace each segment with its sum. The goal is to maximize the length of the resulting array while keeping it non-decreasing. The challenge is deciding where to cut so that each segment sum is at least the previous one.
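For concreteness, the replace-a-subarray-with-its-sum operation from the statement can be sketched in Python (the helper name is ours):

```python
def apply_operation(nums, left, right):
    """Replace the subarray nums[left..right] (inclusive) with its sum."""
    return nums[:left] + [sum(nums[left:right + 1])] + nums[right + 1:]

# The statement's example: selecting subarray [3,5] of [1,3,5,6]
assert apply_operation([1, 3, 5, 6], 1, 2) == [1, 8, 6]
```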
Approach 1: Simple Greedy Segment Merging (O(n) time, O(1) space)
This approach processes the array left to right while growing the current segment one element at a time. As soon as the running sum reaches the sum of the previously closed segment, close the current segment and start a new one. Any trailing elements that never catch up are folded into the last segment, which only increases its sum and therefore preserves the non-decreasing order. Only the previous segment's sum, the current running sum, and a counter are needed, giving O(n) time and O(1) space. The intuition is sound: once a value is too small, the only remedy is merging it into a larger sum. However, this greedy is not guaranteed optimal, because closing a segment as early as possible can leave a very large element stranded where a slightly longer earlier segment would have allowed an extra split.
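A minimal Python sketch of this greedy pass (the function name is ours). Note that it is a heuristic: on [5,1,100,100] it returns 2, while the partition [5,1],[100],[100] with sums 6,10... wait, 6,100,100 achieves 3 segments.

```python
def greedy_partition_length(nums):
    """Greedy: close a segment as soon as its sum catches up to the previous one."""
    count = 0   # segments closed so far
    prev = 0    # sum of the most recently closed segment
    cur = 0     # running sum of the segment being built
    for x in nums:
        cur += x
        if cur >= prev:      # caught up: close this segment
            prev = cur
            count += 1
            cur = 0
    # Any leftover elements fold into the last closed segment; its sum only
    # grows, so the order is preserved and the count is unchanged.
    return count

print(greedy_partition_length([5, 2, 2]))         # 1
print(greedy_partition_length([4, 3, 2, 6]))      # 3
print(greedy_partition_length([5, 1, 100, 100]))  # 2 (the optimum is 3)
```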
Approach 2: Dynamic Programming with Prefix Sums and Monotonic Queue (O(n) time, O(n) space)
The optimal strategy models the problem with dynamic programming. Let dp[i] be the maximum number of valid segments covering the first i elements, and let last[i] be the minimum possible sum of the final segment among partitions achieving dp[i]. With prefix sums pre, the sum of any segment (j+1..i) is pre[i] - pre[j] in O(1). A split point j can precede position i exactly when the new segment is at least as large as the one before it: pre[i] - pre[j] >= last[j], i.e. pre[j] + last[j] <= pre[i]. To avoid O(n²) comparisons, keep candidate split points in a monotonic queue ordered by this key.
The key insight is that dp is non-decreasing in i, so among all usable split points the latest one both maximizes dp[j] and minimizes the new segment sum pre[i] - pre[j]. As i grows, pre[i] only increases, so split points popped from the front of the queue are never needed again, and candidates at the back whose key is no smaller than the new state's are dominated and discarded. Each index enters and leaves the queue at most once, keeping transitions constant time on average and producing an overall O(n) solution with O(n) additional space.
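A self-contained sketch of this O(n) DP (the names dp, last, and pre are from the formulation above; the deque discipline is one standard way to realize it):

```python
from collections import deque

def max_non_decreasing_length(nums):
    n = len(nums)
    pre = [0] * (n + 1)            # pre[i] = sum of nums[:i]
    for i, x in enumerate(nums):
        pre[i + 1] = pre[i] + x

    dp = [0] * (n + 1)             # dp[i]: max segments partitioning nums[:i]
    last = [0] * (n + 1)           # minimal last-segment sum achieving dp[i]
    dq = deque([0])                # candidate splits; pre[j] + last[j] increasing

    for i in range(1, n + 1):
        # A split j is usable at i iff pre[j] + last[j] <= pre[i].
        # Advance to the latest usable split; later splits have dp >= earlier.
        while len(dq) > 1 and pre[dq[1]] + last[dq[1]] <= pre[i]:
            dq.popleft()
        j = dq[0]
        dp[i] = dp[j] + 1
        last[i] = pre[i] - pre[j]
        # Discard dominated candidates: an earlier index with a key no smaller
        # than the new state's can never beat it.
        while dq and pre[dq[-1]] + last[dq[-1]] >= pre[i] + last[i]:
            dq.pop()
        dq.append(i)
    return dp[n]

print(max_non_decreasing_length([5, 2, 2]))         # 1
print(max_non_decreasing_length([4, 3, 2, 6]))      # 3
print(max_non_decreasing_length([5, 1, 100, 100]))  # 3
```

Tracing [4,3,2,6]: the deque lets position 3 reuse the split after [4] (pre[1] + last[1] = 8 <= pre[3] = 9), yielding the partition [4],[3,2],[6].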
Recommended for interviews: Interviewers expect the dynamic programming solution with prefix sums and a monotonic optimization. The greedy merging idea helps build intuition about why segments must sometimes collapse into larger sums, but the optimized DP shows stronger algorithmic control and handles worst‑case inputs efficiently. Demonstrating both ideas—first the greedy intuition, then the O(n) DP—signals strong problem‑solving depth.
The idea is to scan the array once, greedily closing each segment as soon as its running sum reaches the previous segment's sum; trailing elements whose sum never catches up are folded into the last closed segment, which keeps the order intact without changing the count.
We iterate through the array tracking only the previous segment's sum, the current running sum, and the number of segments closed so far.
Time Complexity: O(n), Space Complexity: O(1)
Use prefix sums and a DP array where dp[i] is the maximum number of segments into which the first i elements can be partitioned, together with last[i], the minimum achievable sum of the final segment.
A monotonic deque of candidate split points j, ordered by pre[j] + last[j], makes each transition amortized O(1).
Time Complexity: O(n), Space Complexity: O(n)
| Approach | Time | Space | When to Use |
|---|---|---|---|
| Greedy Segment Merging | O(n) | O(1) | Quick to implement and a good intuition builder; not guaranteed optimal on every input. |
| Dynamic Programming + Prefix Sum + Monotonic Queue | O(n) | O(n) | Best for large inputs and interview settings where optimal performance is required. |
Video solution: 2945. Find Maximum Non-decreasing Array Length (LeetCode Biweekly 118, codingMohan)