This approach builds a cumulative sum array from the provided weights. The idea is to convert the weight array into a cumulative distribution, where each element holds the sum of all weights up to and including the current one. When we generate a random number, we search for its position in this cumulative array to determine which index to return. For example, weights [1, 3] produce the cumulative array [1, 4]; a random number in [0, 4) that is less than 1 maps to index 0, while 1, 2, or 3 maps to index 1, so index 1 is picked three times as often.
Time Complexity: O(N) for initialization, O(N) per pickIndex call.
Space Complexity: O(N) due to the cumulative sum storage.
#include <vector>
#include <cstdlib>
using namespace std;

class Solution {
    vector<int> prefixSum;
public:
    Solution(vector<int>& w) {
        int sum = 0;
        for (int weight : w) {
            sum += weight;
            prefixSum.push_back(sum);   // prefixSum[i] = w[0] + ... + w[i]
        }
    }

    int pickIndex() {
        int totalWeight = prefixSum.back();
        int target = rand() % totalWeight;   // uniform value in [0, totalWeight)
        // Linear scan: return the first index whose prefix sum exceeds target.
        for (int i = 0; i < (int)prefixSum.size(); i++) {
            if (target < prefixSum[i]) {
                return i;
            }
        }
        return -1;   // unreachable for non-empty, positive weights
    }
};
In C++, a vector stores the cumulative weights, so memory management and growth are handled automatically. pickIndex generates a random number and scans the cumulative distribution for the first prefix sum that exceeds it, returning that index.
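As a quick illustration, here is a hypothetical test driver (not part of the original solution) that assumes the Solution class above is defined in the same file. With weights {1, 3}, index 1 should come back roughly three times as often as index 0.

#include <iostream>
#include <ctime>

int main() {
    srand(static_cast<unsigned>(time(nullptr)));   // seed rand() once per program
    vector<int> w = {1, 3};
    Solution sol(w);
    int counts[2] = {0, 0};
    for (int i = 0; i < 100000; i++) {
        counts[sol.pickIndex()]++;   // tally how often each index is picked
    }
    // Expect roughly a 1:3 split between index 0 and index 1.
    cout << "index 0: " << counts[0] << ", index 1: " << counts[1] << endl;
    return 0;
}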
This optimized approach also uses a cumulative sum array, but instead of a linear search it performs a binary search to find the appropriate index. This greatly reduces the time needed to map a random value to an index, which is especially beneficial for larger weight arrays.
Time Complexity: O(N) for initialization, O(log N) per pickIndex call.
Space Complexity: O(N) for the cumulative sum array.
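The following is a minimal C++ sketch of this variant. It keeps the prefix-sum construction from the earlier snippet and replaces the linear scan in pickIndex with a hand-written binary search that finds the first prefix sum strictly greater than the random target.

#include <vector>
#include <cstdlib>
using namespace std;

class Solution {
    vector<int> prefixSum;
public:
    Solution(vector<int>& w) {
        int sum = 0;
        for (int weight : w) {
            sum += weight;
            prefixSum.push_back(sum);   // prefixSum[i] = w[0] + ... + w[i]
        }
    }

    int pickIndex() {
        int totalWeight = prefixSum.back();
        int target = rand() % totalWeight;   // uniform value in [0, totalWeight)
        // Binary search for the first prefix sum strictly greater than target.
        int lo = 0, hi = (int)prefixSum.size() - 1;
        while (lo < hi) {
            int mid = lo + (hi - lo) / 2;
            if (prefixSum[mid] <= target) {
                lo = mid + 1;   // target falls in a later bucket
            } else {
                hi = mid;       // prefixSum[mid] > target, keep as candidate
            }
        }
        return lo;
    }
};

The hand-written loop could also be replaced by std::upper_bound(prefixSum.begin(), prefixSum.end(), target) - prefixSum.begin() from <algorithm>, which performs the same "first element greater than target" search.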
A custom binary search locates the target index in the prefix sum array, which boosts search efficiency compared to the linear scan.