You are given a 2D integer array lists, where each lists[i] is a non-empty array of integers sorted in non-decreasing order.
You may repeatedly choose two lists a = lists[i] and b = lists[j], where i != j, and merge them. The cost to merge a and b is:
len(a) + len(b) + abs(median(a) - median(b)), where len and median denote the list length and median, respectively.
After merging a and b, remove both a and b from lists and insert the new merged sorted list in any position. Repeat merges until only one list remains.
Return an integer denoting the minimum total cost required to merge all lists into one single sorted list.
The median of an array is the middle element after sorting it in non-decreasing order. If the array has an even number of elements, the median is the left middle element.
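In code, the left-middle rule collapses to a single index. A minimal sketch in Python:

```python
def median(a):
    # a is already sorted; the left middle element sits at
    # index (len(a) - 1) // 2 for both odd and even lengths.
    return a[(len(a) - 1) // 2]
```

For example, median([2, 4]) is 2 and median([1, 3, 5]) is 3.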
Example 1:
Input: lists = [[1,3,5],[2,4],[6,7,8]]
Output: 18
Explanation:
Merge a = [1, 3, 5] and b = [2, 4]:
len(a) = 3 and len(b) = 2
median(a) = 3 and median(b) = 2
cost = len(a) + len(b) + abs(median(a) - median(b)) = 3 + 2 + abs(3 - 2) = 6
So lists becomes [[1, 2, 3, 4, 5], [6, 7, 8]].
Merge a = [1, 2, 3, 4, 5] and b = [6, 7, 8]:
len(a) = 5 and len(b) = 3
median(a) = 3 and median(b) = 7
cost = len(a) + len(b) + abs(median(a) - median(b)) = 5 + 3 + abs(3 - 7) = 12
So lists becomes [[1, 2, 3, 4, 5, 6, 7, 8]], and the total cost is 6 + 12 = 18.
Example 2:
Input: lists = [[1,1,5],[1,4,7,8]]
Output: 10
Explanation:
Merge a = [1, 1, 5] and b = [1, 4, 7, 8]:
len(a) = 3 and len(b) = 4
median(a) = 1 and median(b) = 4
cost = len(a) + len(b) + abs(median(a) - median(b)) = 3 + 4 + abs(1 - 4) = 10
So lists becomes [[1, 1, 1, 4, 5, 7, 8]], and the total cost is 10.
Example 3:
Input: lists = [[1],[3]]
Output: 4
Explanation:
Merge a = [1] and b = [3]:
len(a) = 1 and len(b) = 1
median(a) = 1 and median(b) = 3
cost = len(a) + len(b) + abs(median(a) - median(b)) = 1 + 1 + abs(1 - 3) = 4
So lists becomes [[1, 3]], and the total cost is 4.
Example 4:
Input: lists = [[1],[1]]
Output: 2
Explanation:
The total cost is len(a) + len(b) + abs(median(a) - median(b)) = 1 + 1 + abs(1 - 1) = 2.
Constraints:
2 <= lists.length <= 12
1 <= lists[i].length <= 500
-10^9 <= lists[i][j] <= 10^9
lists[i] is sorted in non-decreasing order.
The total number of elements across all lists will not exceed 2000.

Problem Overview: You are given several already sorted lists and may merge any two at a time. Merging two lists costs their combined length plus the absolute difference of their medians. The goal is to choose the merge order that minimizes the total cost of reducing all lists to one.
Approach 1: Brute Force Pair Simulation (factorial time, O(k) recursion space)
Try every possible pair of lists to merge at each step and recursively evaluate the total cost. After merging two lists, update the list set and repeat until only one list remains. Each merge itself uses the standard two pointers technique to combine sorted arrays in O(a + b) time. This explores all merge orders, which grows factorially with the number of lists. The approach demonstrates the core idea that merge order affects total cost, but it quickly becomes infeasible when the number of lists grows.
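A minimal recursive sketch of this idea in Python. Helper names such as merge_two and min_total_cost are illustrative, not part of any official solution:

```python
def merge_two(a, b):
    # Standard two-pointer merge of two sorted lists in O(len(a) + len(b)).
    i = j = 0
    out = []
    while i < len(a) and j < len(b):
        if a[i] <= b[j]:
            out.append(a[i]); i += 1
        else:
            out.append(b[j]); j += 1
    out.extend(a[i:])
    out.extend(b[j:])
    return out

def median(a):
    # Left middle element of a sorted list.
    return a[(len(a) - 1) // 2]

def min_total_cost(lists):
    # Try every pair, merge it, and recurse on the reduced collection.
    if len(lists) == 1:
        return 0
    best = float('inf')
    for i in range(len(lists)):
        for j in range(i + 1, len(lists)):
            a, b = lists[i], lists[j]
            cost = len(a) + len(b) + abs(median(a) - median(b))
            rest = [lists[t] for t in range(len(lists)) if t != i and t != j]
            rest.append(merge_two(a, b))
            best = min(best, cost + min_total_cost(rest))
    return best
```

On Example 1, min_total_cost([[1,3,5],[2,4],[6,7,8]]) returns 18, matching the walkthrough above.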
Approach 2: Bitmask Dynamic Programming (O(2^k * k^2) time, O(2^k) space)
Model the problem as dynamic programming over subsets. Each state dp[mask] is the minimum cost to merge the subset of lists encoded by the bitmask mask. To compute a state, split the subset into two smaller subsets, take their optimal costs, and add the cost of the final merge: the combined length plus the absolute difference of the two medians. A key observation makes the median term tractable: merging a subset yields the same sorted multiset regardless of merge order, so each subset's length and median can be precomputed. Bit operations efficiently enumerate subsets and their partitions, connecting the solution to bit manipulation. This guarantees the optimal merge order, at the price of evaluating every subset partition.
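A sketch of the subset DP in Python. It relies on the fact that the merged result of a subset is order-independent, so each mask's sorted contents (and thus median) can be precomputed; min_cost_dp is an illustrative name:

```python
from itertools import chain

def median(a):
    # Left middle element of a sorted list.
    return a[(len(a) - 1) // 2]

def min_cost_dp(lists):
    k = len(lists)
    # merged[mask]: the sorted contents of the subset encoded by mask.
    # This is order-independent, so it can be precomputed once per mask.
    merged = [None] * (1 << k)
    for mask in range(1, 1 << k):
        merged[mask] = sorted(chain.from_iterable(
            lists[i] for i in range(k) if mask >> i & 1))
    INF = float('inf')
    dp = [INF] * (1 << k)
    for mask in range(1, 1 << k):
        if mask & (mask - 1) == 0:
            dp[mask] = 0  # a single list needs no merging
            continue
        # Enumerate proper submasks; sub < other skips mirrored splits.
        sub = (mask - 1) & mask
        while sub:
            other = mask ^ sub
            if sub < other:
                a, b = merged[sub], merged[other]
                cost = (dp[sub] + dp[other] + len(a) + len(b)
                        + abs(median(a) - median(b)))
                dp[mask] = min(dp[mask], cost)
            sub = (sub - 1) & mask
    return dp[(1 << k) - 1]
```

The submask loop visits 3^k state transitions overall, which is comfortable for k <= 12.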
Approach 3: Greedy Min-Heap (Optimal Merge Pattern) (O(k log k) time, O(k) space)
This strategy mirrors Huffman coding: always merge the two shortest lists first. Store the lists in a min-heap keyed by length, repeatedly pop the two shortest, merge them with a linear two-pointer pass, add the merge cost to the total, and push the merged list back. Each heap operation takes O(log k), and the process runs k − 1 times. The key insight: merging smaller lists earlier prevents large lists from being repeatedly reprocessed. Note, however, that the Huffman argument proves optimality only for the classic variant whose cost is len(a) + len(b); the abs(median(a) - median(b)) term in this problem means the size-based greedy order is a strong heuristic rather than a guarantee, which is why the bitmask DP is the safe choice for lists.length <= 12.
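A sketch of the size-based greedy in Python. As noted, with the median term this is a heuristic, not a guaranteed optimum; greedy_cost and merge_two are illustrative names, and the counter in the heap tuples is only there to break ties without comparing the lists themselves:

```python
import heapq

def median(a):
    # Left middle element of a sorted list.
    return a[(len(a) - 1) // 2]

def merge_two(a, b):
    # Standard two-pointer merge of two sorted lists.
    i = j = 0
    out = []
    while i < len(a) and j < len(b):
        if a[i] <= b[j]:
            out.append(a[i]); i += 1
        else:
            out.append(b[j]); j += 1
    out.extend(a[i:])
    out.extend(b[j:])
    return out

def greedy_cost(lists):
    # Huffman-style greedy: always merge the two shortest lists.
    heap = [(len(a), i, a) for i, a in enumerate(lists)]
    heapq.heapify(heap)
    tie = len(lists)  # tie-breaker for equal lengths
    total = 0
    while len(heap) > 1:
        la, _, a = heapq.heappop(heap)
        lb, _, b = heapq.heappop(heap)
        total += la + lb + abs(median(a) - median(b))
        heapq.heappush(heap, (la + lb, tie, merge_two(a, b)))
        tie += 1
    return total
```

On the examples above the greedy order happens to match the optimal one, e.g. greedy_cost([[1,3,5],[2,4],[6,7,8]]) returns 18.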
Recommended for interviews: Start by explaining the brute-force idea to show you understand why merge order matters. Then present the bitmask DP, which the constraint lists.length <= 12 is designed for and which guarantees the optimal merge order. Finally, mention the greedy min-heap as the classic optimal merge pattern, noting that its optimality argument applies when the cost is combined length alone.
| Approach | Time | Space | When to Use |
|---|---|---|---|
| Brute Force Pair Simulation | Factorial (all merge orders) | O(k) | Conceptual understanding of how merge order affects cost; very small number of lists |
| Bitmask Dynamic Programming | O(2^k * k^2) | O(2^k) | When k is small and you want guaranteed optimal ordering via subset DP |
| Greedy Min-Heap (Optimal Merge Pattern) | O(k log k) heap operations | O(k) | Fast heuristic here; provably optimal when the cost is combined length alone |
Leetcode 3801 | Minimum Cost to Merge Sorted Lists