There are n piles of stones arranged in a row. The ith pile has stones[i] stones.
A move consists of merging exactly k consecutive piles into one pile, and the cost of this move is equal to the total number of stones in these k piles.
Return the minimum cost to merge all piles of stones into one pile. If it is impossible, return -1.
Example 1:
Input: stones = [3,2,4,1], k = 2
Output: 20
Explanation: We start with [3, 2, 4, 1].
We merge [3, 2] for a cost of 5, and we are left with [5, 4, 1].
We merge [4, 1] for a cost of 5, and we are left with [5, 5].
We merge [5, 5] for a cost of 10, and we are left with [10].
The total cost was 20, and this is the minimum possible.
Example 2:
Input: stones = [3,2,4,1], k = 3
Output: -1
Explanation: After any merge operation, there are 2 piles left, and we can't merge anymore. So the task is impossible.
Example 3:
Input: stones = [3,5,1,2,6], k = 3
Output: 25
Explanation: We start with [3, 5, 1, 2, 6].
We merge [5, 1, 2] for a cost of 8, and we are left with [3, 8, 6].
We merge [3, 8, 6] for a cost of 17, and we are left with [17].
The total cost was 25, and this is the minimum possible.
Constraints:
n == stones.length
1 <= n <= 30
1 <= stones[i] <= 100
2 <= k <= 30

Problem Overview: You are given an array of stone piles and an integer k. Each move merges exactly k consecutive piles into one pile, costing the total number of stones in those piles. The goal is to minimize the total merge cost required to combine all piles into a single pile. If it is impossible due to the merge constraint, return -1.
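The impossibility case in Example 2 follows from a counting argument: every merge replaces k piles with one, shrinking the pile count by k - 1, so n piles can reach a single pile only if (n - 1) is divisible by (k - 1). A minimal check in C (the helper name canMergeToOne is illustrative):

```c
#include <stdbool.h>

/* Each merge removes k piles and adds 1, a net change of -(k - 1).
   Going from n piles to 1 therefore requires (n - 1) % (k - 1) == 0. */
bool canMergeToOne(int n, int k) {
    return (n - 1) % (k - 1) == 0;
}
```

This matches the examples: (4 - 1) % (2 - 1) == 0 for Example 1, but (4 - 1) % (3 - 1) != 0 for Example 2, hence the -1.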
Approach 1: Brute Force Recursive Partitioning (Exponential time, O(n) space)
The most direct idea tries every valid way to merge k consecutive piles. For each interval, recursively compute the cost of merging left and right partitions, then add the sum of stones when the interval becomes one pile. This approach repeatedly recomputes the same subproblems and explores all possible partitions. Time complexity becomes exponential because each interval branches into multiple merge choices. Space complexity is O(n) for the recursion stack. This method mainly helps you understand the structure of the problem before applying optimization.
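A hedged sketch of this recursion in C (the names cost and rangeSum are illustrative, and feasibility is assumed to have been checked by the caller). The split point advances in steps of k - 1 so the left part can always collapse to exactly one pile:

```c
#include <limits.h>

/* Sum of stones[i..j]; recomputed on every call in this brute force. */
static int rangeSum(const int* stones, int i, int j) {
    int s = 0;
    for (int t = i; t <= j; t++) s += stones[t];
    return s;
}

/* cost(i, j): minimum cost to reduce piles i..j to the fewest piles the
   k-merge rule allows. Exponential: subproblems are recomputed freely. */
static int cost(const int* stones, int k, int i, int j) {
    if (j - i + 1 < k) return 0;              /* too short to merge */
    int best = INT_MAX;
    /* Split so the left part collapses to exactly one pile. */
    for (int m = i; m < j; m += k - 1) {
        int c = cost(stones, k, i, m) + cost(stones, k, m + 1, j);
        if (c < best) best = c;
    }
    if ((j - i) % (k - 1) == 0)               /* whole range -> one pile */
        best += rangeSum(stones, i, j);
    return best;
}
```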
Approach 2: Top‑Down Dynamic Programming with Memoization (O(n^3) time, O(n^2) space)
Memoization avoids recomputing subproblems. Define dp(i, j) as the minimum cost to merge piles from index i to j. Only partitions that maintain the ability to merge into valid pile counts are explored. The recursion splits the range at valid positions and stores results in a memo table. To compute the merge cost quickly, use a prefix sum array so the total stones in any interval can be retrieved in O(1). This reduces the brute force explosion but still checks many interval partitions, giving O(n^3) time and O(n^2) space.
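A sketch of this memoized recursion in C, sized to the stated constraint n <= 30 (the function and array names are illustrative):

```c
#include <limits.h>
#include <string.h>

#define MAXN 30                    /* n <= 30 per the constraints */

static int memo[MAXN][MAXN];
static int prefix[MAXN + 1];       /* prefix sums for O(1) range totals */
static int K;

/* dp(i, j): minimum cost to merge piles i..j into as few piles as the
   k-merge rule allows; memoized so each interval is solved once. */
static int dp(int i, int j) {
    if (j - i + 1 < K) return 0;   /* too short to merge */
    if (memo[i][j] != -1) return memo[i][j];
    int best = INT_MAX;
    /* Split so the left part collapses to exactly one pile. */
    for (int m = i; m < j; m += K - 1) {
        int c = dp(i, m) + dp(m + 1, j);
        if (c < best) best = c;
    }
    if ((j - i) % (K - 1) == 0)    /* whole range can become one pile */
        best += prefix[j + 1] - prefix[i];
    return memo[i][j] = best;
}

int mergeStonesMemo(const int* stones, int n, int k) {
    if ((n - 1) % (k - 1) != 0) return -1;
    K = k;
    prefix[0] = 0;
    for (int i = 0; i < n; i++) prefix[i + 1] = prefix[i] + stones[i];
    memset(memo, -1, sizeof memo);
    return dp(0, n - 1);
}
```

There are O(n^2) intervals and each is filled once with an O(n) scan over split points, giving the O(n^3) time bound.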
Approach 3: Bottom‑Up Dynamic Programming with Prefix Sums (O(n^3) time, O(n^2) space)
The standard optimal solution builds results for increasing interval lengths. Let dp[i][j] represent the minimum cost to merge piles between indices i and j. Iterate over interval lengths and try splitting the segment into two smaller intervals. Because merges only occur in groups of k, partitions advance by k‑1 steps to maintain valid pile counts. A prefix sum array lets you compute the total stones of an interval instantly when a final merge is possible. The algorithm iterates over start index, end index, and partition point, resulting in O(n^3) time and O(n^2) space. This method combines interval dynamic programming with efficient range sum queries on the array.
Recommended for interviews: The bottom‑up DP with prefix sums is the expected solution. Interviewers want to see interval DP reasoning: defining subproblems on ranges, enforcing the k-merge constraint, and using prefix sums to avoid repeated range summations. Starting from the brute force idea and then introducing memoization shows problem‑solving progression, but implementing the optimized DP demonstrates strong algorithmic maturity.
Use dynamic programming to minimize the cost. A common formulation stores the minimum cost to merge stones from index i to j into p piles; because a range of fixed length can only collapse to one specific pile count under the k-merge rule, the state compresses to two dimensions. The C implementation uses a 2D table, dp[i][j], for the minimum cost to merge piles i through j, together with prefix sums so the total of any range is retrieved in O(1).
Time Complexity: O(n^3). Space Complexity: O(n^2).
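A sketch of that bottom-up C implementation, assuming the signature mergeStones(stones, n, k) (the name is illustrative):

```c
#include <limits.h>
#include <string.h>

/* dp[i][j] = minimum cost to merge piles i..j into as few piles as the
   k-merge rule allows; 0 when the range is too short to merge at all. */
int mergeStones(const int* stones, int n, int k) {
    if ((n - 1) % (k - 1) != 0) return -1;    /* one pile is unreachable */

    int prefix[n + 1];                        /* prefix[i] = sum of stones[0..i-1] */
    prefix[0] = 0;
    for (int i = 0; i < n; i++) prefix[i + 1] = prefix[i] + stones[i];

    int dp[n][n];
    memset(dp, 0, sizeof dp);                 /* ranges shorter than k cost 0 */

    for (int len = k; len <= n; len++) {
        for (int i = 0; i + len - 1 < n; i++) {
            int j = i + len - 1;
            dp[i][j] = INT_MAX;
            /* Split so the left part collapses to exactly one pile. */
            for (int m = i; m < j; m += k - 1) {
                int c = dp[i][m] + dp[m + 1][j];
                if (c < dp[i][j]) dp[i][j] = c;
            }
            /* If the whole range can become one pile, pay the final merge. */
            if ((j - i) % (k - 1) == 0)
                dp[i][j] += prefix[j + 1] - prefix[i];
        }
    }
    return dp[0][n - 1];
}
```

For Example 1 this returns 20, and for Example 2 the up-front divisibility check returns -1 before any table is filled.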
| Approach | Time | Space | When to Use |
|---|---|---|---|
| Brute Force Recursive Merging | Exponential | O(n) | Conceptual understanding of merge choices; impractical for large inputs |
| Top-Down DP with Memoization | O(n^3) | O(n^2) | Reduces recomputation while keeping recursive structure |
| Bottom-Up DP with Prefix Sums | O(n^3) | O(n^2) | Standard optimal solution for interval DP problems with range sums |