Watch 9 video solutions for Sorting Three Groups, a medium-level problem involving Array, Binary Search, and Dynamic Programming. This walkthrough by Prakhar Agrawal has 2,712 views. Want to try solving it yourself? Practice on FleetCode or read the detailed text solution.
You are given an integer array nums. Each element in nums is 1, 2 or 3. In each operation, you can remove an element from nums. Return the minimum number of operations to make nums non-decreasing.
Example 1:
Input: nums = [2,1,3,2,1]
Output: 3
Explanation:
One of the optimal solutions is to remove nums[0], nums[2] and nums[3].
Example 2:
Input: nums = [1,3,2,1,3,3]
Output: 2
Explanation:
One of the optimal solutions is to remove nums[1] and nums[2].
Example 3:
Input: nums = [2,2,2,2,3,3]
Output: 0
Explanation:
nums is already non-decreasing.
Constraints:
1 <= nums.length <= 100
1 <= nums[i] <= 3
Follow-up: Can you come up with an algorithm that runs in O(n) time complexity?
Problem Overview: You receive an array where every value is 1, 2, or 3. The goal is to remove the minimum number of elements so the remaining array is non-decreasing, meaning all 1s come first, then all 2s, then all 3s. Each operation removes a single element.
Approach 1: Greedy Counting Approach (O(n) time, O(1) space)
The final valid array must consist of three consecutive (possibly empty) segments: all 1s, then all 2s, then all 3s. Scan the array once while tracking, for each group k, the minimum number of removals needed so that the prefix processed so far ends in group k. The key idea is that transitions only move forward (1 → 2 → 3), so the cost of ending in group k is the cheapest cost of ending the previous prefix in any group ≤ k, plus one removal if the current element is not k. Maintaining these three running minimums avoids exploring all partitions explicitly, so the algorithm runs in O(n) time with constant memory; it works precisely because the domain of values is fixed to three groups. The solution relies only on a simple iteration over the array while maintaining incremental counts.
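A minimal sketch of this running-minimum scan (the function name and variable names are illustrative, not from the original article):

```python
def min_removals_greedy(nums):
    """Minimum removals so nums becomes non-decreasing (values are 1, 2, or 3)."""
    # cost[k] = fewest removals so the prefix processed so far ends in group k+1
    cost = [0, 0, 0]
    for x in nums:
        cost = [
            cost[0] + (x != 1),                    # stay in the 1-segment
            min(cost[0], cost[1]) + (x != 2),      # 1-segment or 2-segment may precede
            min(cost) + (x != 3),                  # any segment may precede the 3s
        ]
    return min(cost)
```

Each element either survives in its segment or costs one removal, and the forward-only transitions keep the update to three constant-time minimums per element.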
Approach 2: Dynamic Programming with Subsequence (O(n) time, O(1) space)
Another way to view the task: keep the longest subsequence that already follows the order 1 → 2 → 3, and remove everything else. The fewer elements you can keep, the more removals you must perform. Maintain three DP states: dp1 for the longest kept subsequence ending with 1, dp2 for one ending with 2, and dp3 for one ending with 3. When processing each value, update these states using transitions that preserve the non-decreasing order; for example, a 2 can extend either a subsequence ending in 1 or one ending in 2. The final answer is n - longest_valid_subsequence. This formulation is a compact dynamic programming pattern similar to constrained LIS problems and is easy to extend if the number of groups increases.
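A sketch of the subsequence DP described above (names are illustrative):

```python
def min_removals_dp(nums):
    """Minimum removals = n minus the longest non-decreasing subsequence over {1,2,3}."""
    dp1 = dp2 = dp3 = 0  # longest kept subsequence ending with 1, 2, 3 respectively
    for x in nums:
        if x == 1:
            dp1 += 1                        # a 1 can only extend a run of 1s
        elif x == 2:
            dp2 = max(dp1, dp2) + 1         # a 2 extends a subsequence ending in 1 or 2
        else:
            dp3 = max(dp1, dp2, dp3) + 1    # a 3 extends any valid subsequence
    return len(nums) - max(dp1, dp2, dp3)
```

Because dp1 never decreases, each state only needs the current maximums of the states it can transition from, keeping the space at O(1).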
Some variations treat the problem as evaluating partition points with prefix counts, which resembles searching optimal boundaries between groups. That perspective connects loosely with prefix scanning techniques often seen in binary search style partition problems.
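For illustration, the partition-point view can be made concrete with a brute force over the two segment boundaries using prefix counts. This O(n²) sketch (function and variable names are hypothetical) is slower than the linear approaches above but makes the boundary-search perspective explicit:

```python
from itertools import accumulate

def min_removals_partition(nums):
    """Try every pair of boundaries (i, j): nums[:i] keeps 1s, nums[i:j] keeps 2s, nums[j:] keeps 3s."""
    n = len(nums)
    # pre[v][i] = how many copies of value v appear in nums[:i]
    pre = {v: [0] + list(accumulate(int(x == v) for x in nums)) for v in (1, 2, 3)}
    best = n
    for i in range(n + 1):            # end of the 1-segment
        for j in range(i, n + 1):     # end of the 2-segment
            removed = (i - pre[1][i])                        # non-1s in nums[:i]
            removed += (j - i) - (pre[2][j] - pre[2][i])     # non-2s in nums[i:j]
            removed += (n - j) - (pre[3][n] - pre[3][j])     # non-3s in nums[j:]
            best = min(best, removed)
    return best
```

With the prefix counts precomputed, each candidate boundary pair is evaluated in O(1), which is the same trick that underlies the binary-search-style partition problems mentioned above.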
Recommended for interviews: The dynamic programming state transition is usually what interviewers expect. It shows you recognize the ordered subsequence structure and can maintain minimal states efficiently. The greedy counting formulation is equally optimal and often simpler to implement, but explaining the DP interpretation demonstrates stronger algorithmic reasoning.
| Approach | Time | Space | When to Use |
|---|---|---|---|
| Greedy Counting Approach | O(n) | O(1) | Best when values are limited to three groups and you want a simple linear pass with minimal state. |
| Dynamic Programming with Subsequence | O(n) | O(1) | Preferred in interviews to demonstrate ordered subsequence DP reasoning similar to LIS problems. |