Given an array of asynchronous functions `functions` and a pool limit `n`, return an asynchronous function `promisePool`. It should return a promise that resolves when all the input functions resolve.
Pool limit is defined as the maximum number of promises that can be pending at once. `promisePool` should begin execution of as many functions as possible and continue executing new functions when old promises resolve. `promisePool` should execute `functions[i]`, then `functions[i + 1]`, then `functions[i + 2]`, etc. When the last promise resolves, `promisePool` should also resolve.
For example, if n = 1, promisePool will execute one function at a time in series. However, if n = 2, it first executes two functions. When either of the two functions resolve, a 3rd function should be executed (if available), and so on until there are no functions left to execute.
You can assume the functions never reject. It is acceptable for `promisePool` to return a promise that resolves to any value.
Example 1:
Input: functions = [
  () => new Promise(res => setTimeout(res, 300)),
  () => new Promise(res => setTimeout(res, 400)),
  () => new Promise(res => setTimeout(res, 200))
], n = 2
Output: [[300,400,500],500]
Explanation:
Three functions are passed in. They sleep for 300ms, 400ms, and 200ms respectively. They resolve at 300ms, 400ms, and 500ms respectively. The returned promise resolves at 500ms.
At t=0, the first 2 functions are executed. The pool size limit of 2 is reached.
At t=300, the 1st function resolves, and the 3rd function is executed. Pool size is 2.
At t=400, the 2nd function resolves. There is nothing left to execute. Pool size is 1.
At t=500, the 3rd function resolves. Pool size is zero so the returned promise also resolves.
Example 2:
Input: functions = [
  () => new Promise(res => setTimeout(res, 300)),
  () => new Promise(res => setTimeout(res, 400)),
  () => new Promise(res => setTimeout(res, 200))
], n = 5
Output: [[300,400,200],400]
Explanation:
The three input promises resolve at 300ms, 400ms, and 200ms respectively. The returned promise resolves at 400ms.
At t=0, all 3 functions are executed. The pool limit of 5 is never met.
At t=200, the 3rd function resolves. Pool size is 2.
At t=300, the 1st function resolves. Pool size is 1.
At t=400, the 2nd function resolves. Pool size is 0, so the returned promise also resolves.
Example 3:
Input: functions = [
  () => new Promise(res => setTimeout(res, 300)),
  () => new Promise(res => setTimeout(res, 400)),
  () => new Promise(res => setTimeout(res, 200))
], n = 1
Output: [[300,700,900],900]
Explanation:
The three input promises resolve at 300ms, 700ms, and 900ms respectively. The returned promise resolves at 900ms.
At t=0, the 1st function is executed. Pool size is 1.
At t=300, the 1st function resolves and the 2nd function is executed. Pool size is 1.
At t=700, the 2nd function resolves and the 3rd function is executed. Pool size is 1.
At t=900, the 3rd function resolves. Pool size is 0 so the returned promise resolves.
Constraints:
0 <= functions.length <= 10
1 <= n <= 10

Problem Overview: You are given an array of functions where each function returns a Promise. The goal is to execute these functions while ensuring that at most n promises run at the same time. As soon as one finishes, the next function in the list should start. The returned promise resolves once all tasks complete.
Approach 1: Sequential Execution (O(n) time, O(1) space)
The most direct solution runs each async function one after another using await. Iterate through the array and wait for the current promise to resolve before starting the next. This guarantees the concurrency limit is never exceeded, but it wastes parallelism because only one task runs at a time even if the allowed concurrency is larger. This approach is simple but inefficient when tasks are independent and could run concurrently.
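The sequential approach can be sketched as follows (the function name `promisePoolSequential` is illustrative, not from the original):

```typescript
// Runs each async task to completion before starting the next,
// so at most one promise is ever pending regardless of n.
async function promisePoolSequential(
  functions: (() => Promise<unknown>)[]
): Promise<void> {
  for (const fn of functions) {
    await fn(); // blocks the loop until the current task settles
  }
}
```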
Approach 2: Fixed Batching (O(n) time, O(n) space)
Another strategy groups tasks into batches of size n. For each batch, start all functions and use Promise.all to wait for them to finish before moving to the next batch. This uses concurrency but has a drawback: if one promise in the batch finishes early, its slot stays idle until the entire batch completes. The runtime is still O(n), but throughput is suboptimal because available worker slots are not reused immediately.
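A minimal sketch of the batching strategy (the name `promisePoolBatched` is illustrative):

```typescript
// Starts tasks in fixed batches of size n and waits for the whole
// batch with Promise.all before launching the next one.
async function promisePoolBatched(
  functions: (() => Promise<unknown>)[],
  n: number
): Promise<void> {
  for (let i = 0; i < functions.length; i += n) {
    const batch = functions.slice(i, i + n).map(fn => fn());
    await Promise.all(batch); // idle slots are not refilled mid-batch
  }
}
```

Note the drawback in the `await Promise.all(batch)` line: a slot freed by a fast task stays empty until the slowest task in the batch finishes.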
Approach 3: Dynamic Promise Pool with Queue (O(n) time, O(n) space)
The optimal solution maintains a pool of up to n active promises. Start by launching the first n functions. Each time a promise resolves, immediately start the next function from the list. This creates a continuous pipeline where the concurrency limit is respected while maximizing utilization. Implementation typically tracks the next task index and an active counter. When a promise finishes, decrement the counter and schedule the next task. This pattern is common in systems that manage async workloads and mirrors a worker queue design.
This approach relies on basic concepts from async programming and JavaScript promises. The task list behaves like a lightweight queue, where each completion event triggers scheduling of the next job.
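One common way to implement this pattern is with n "workers" that each pull the next unstarted task as soon as their current one resolves (a sketch; the shared-index worker formulation is one of several equivalent implementations):

```typescript
// Launches up to n workers; each keeps a pool slot busy by taking
// the next task the moment its previous one resolves.
async function promisePool(
  functions: (() => Promise<unknown>)[],
  n: number
): Promise<void> {
  let nextIndex = 0; // shared pointer into the task list

  async function worker(): Promise<void> {
    // JS is single-threaded, so reading and bumping nextIndex
    // between awaits is race-free.
    while (nextIndex < functions.length) {
      await functions[nextIndex++]();
    }
  }

  const workers = Array.from(
    { length: Math.min(n, functions.length) },
    () => worker()
  );
  await Promise.all(workers); // resolves once every task has settled
}
```

Each worker is itself a promise chain, so the pool needs no explicit "active counter": the number of live workers is the pool size.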
Recommended for interviews: The dynamic promise pool is the expected solution. Interviewers want to see that you can enforce concurrency limits while keeping workers busy. Mentioning the sequential or batching strategies shows you understand the constraints, but implementing the queue-based pool demonstrates practical async control and resource management.
| Approach | Time | Space | When to Use |
|---|---|---|---|
| Sequential Execution | O(n) | O(1) | When correctness matters more than parallelism or concurrency must effectively be 1 |
| Fixed Batching with Promise.all | O(n) | O(n) | When tasks can be grouped and strict batch execution is acceptable |
| Dynamic Promise Pool (Queue Scheduling) | O(n) | O(n) | Best general solution: maintains the concurrency limit while maximizing throughput |