The approach uses Dijkstra's algorithm, but instead of finding the shortest path it finds the path with the maximum product of probabilities. This is achieved with a max-heap (priority queue) that always extends the path with the largest accumulated probability of success. The algorithm keeps a maximum-probability value for each node, initialized to 1 for the start node, then repeatedly relaxes the edges of the node at the top of the heap, updating a neighbor whenever a higher-probability path to it is found.
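The only change from textbook Dijkstra is the relaxation rule: probabilities are multiplied and maximized rather than distances summed and minimized. A minimal sketch of that step in Python (names are illustrative only, not taken from the solutions below):

# Standard Dijkstra relaxes with: if dist[u] + w < dist[v]: dist[v] = dist[u] + w
# Here the rule becomes a maximization over products:
def relax(maxProb, u, v, edgeProb):
    # Try to improve the best known probability of reaching v via the edge (u, v).
    if maxProb[u] * edgeProb > maxProb[v]:
        maxProb[v] = maxProb[u] * edgeProb
        return True   # v improved and should be (re)pushed onto the max-heap
    return False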
Time Complexity: O(E log V), where E is the number of edges and V is the number of vertices, due to the use of the priority queue.
Space Complexity: O(V + E), for storing the graph plus the probability array and the heap.
using System.Collections.Generic;

public class Solution {
    public double MaxProbability(int n, int[][] edges, double[] succProb, int start, int end) {
        // Build an undirected adjacency list: node -> list of (neighbor, edge probability).
        var graph = new List<List<(int node, double prob)>>();
        for (int i = 0; i < n; i++) {
            graph.Add(new List<(int, double)>());
        }
        for (int i = 0; i < edges.Length; i++) {
            int u = edges[i][0], v = edges[i][1];
            double prob = succProb[i];
            graph[u].Add((v, prob));
            graph[v].Add((u, prob));
        }

        // maxProb[i] is the best known probability of reaching node i from start.
        double[] maxProb = new double[n];
        maxProb[start] = 1.0;

        // .NET's PriorityQueue is a min-heap, so the negated probability is used
        // as the priority to always dequeue the most probable path first.
        var pq = new PriorityQueue<(double prob, int node), double>();
        pq.Enqueue((1.0, start), -1.0);

        while (pq.Count > 0) {
            var (currProb, node) = pq.Dequeue();
            if (node == end) return currProb;
            // Skip stale entries whose probability has since been improved.
            if (currProb < maxProb[node]) continue;
            foreach (var (nextNode, edgeProb) in graph[node]) {
                if (maxProb[nextNode] < currProb * edgeProb) {
                    maxProb[nextNode] = currProb * edgeProb;
                    pq.Enqueue((maxProb[nextNode], nextNode), -maxProb[nextNode]);
                }
            }
        }

        // The end node is unreachable from start.
        return 0.0;
    }
}
The C# implementation uses the PriorityQueue from .NET's collections to order nodes by their accumulated probability, always expanding the node most likely to lie on the maximum-product path. Because that queue is a min-heap, the probability is negated before being enqueued as the priority, which makes the queue behave like a max-heap. The implementation follows the usual weighted-path structure from graph theory, adjusted to multiply probabilities instead of summing distance metrics.
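The negation trick is not specific to C#; any min-heap can be used the same way. As a small illustration (not part of the solution above), Python's heapq pops the entry with the smallest key, so storing the negated probability surfaces the most probable node first:

import heapq

heap = []
for prob, node in [(0.5, 1), (0.9, 2), (0.2, 3)]:   # arbitrary sample values
    heapq.heappush(heap, (-prob, node))              # store the negated probability as the key

neg_prob, node = heapq.heappop(heap)
print(-neg_prob, node)   # prints 0.9 2: the highest-probability entry comes out first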
The Bellman-Ford algorithm traditionally computes shortest paths in a weighted graph and can be adapted here to maximize a product instead. By the properties of logarithms, maximizing the product of probabilities is equivalent to maximizing the sum of their logarithms (or, equivalently, minimizing the sum of the negative logarithms). This transformation reduces the problem to the conventional additive form that Bellman-Ford's relaxation handles well, iterating across all edges up to V - 1 times.
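The identity behind the transformation is log(a * b) = log(a) + log(b), combined with the fact that log is monotonically increasing, so the path that maximizes the product of probabilities is exactly the path that maximizes the sum of their logs. A quick numeric check with sample values:

import math

path = [0.5, 0.5, 0.8]                             # success probabilities along a hypothetical path
product = 0.5 * 0.5 * 0.8                          # 0.2
log_sum = sum(math.log(p) for p in path)           # log(0.5) + log(0.5) + log(0.8)
print(math.isclose(math.exp(log_sum), product))    # True: exponentiating the log sum recovers the product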
Time Complexity: O(V * E), where V is the number of vertices and E is the number of edges, since every edge may be relaxed in each of up to V - 1 passes.
Space Complexity: O(V), for storing probability values and log-converted results for paths during processing.
import math

class Solution:
    def maxProbability(self, n, edges, succProb, start, end):
        # logProb[i] is the best known log-probability of reaching node i from start.
        logProb = [-math.inf] * n
        logProb[start] = 0

        # Bellman-Ford: relax every edge up to n - 1 times.
        for _ in range(n - 1):
            updated = False
            for (u, v), prob in zip(edges, succProb):
                if prob == 0:
                    continue  # a zero-probability edge can never improve a path (and log(0) is undefined)
                lg = math.log(prob)
                # Relax the edge in both directions, since the graph is undirected.
                if logProb[u] != -math.inf and logProb[u] + lg > logProb[v]:
                    logProb[v] = logProb[u] + lg
                    updated = True
                if logProb[v] != -math.inf and logProb[v] + lg > logProb[u]:
                    logProb[u] = logProb[v] + lg
                    updated = True
            if not updated:
                break  # no relaxation happened, so the values have converged

        return math.exp(logProb[end]) if logProb[end] != -math.inf else 0.0
This Python solution applies the Bellman-Ford relaxation iteratively, updating a node whenever a path with a larger log-probability sum (and therefore a higher probability) is discovered. It works with log-probabilities rather than the direct product and converts the final value back with exponentiation. The algorithm stops early if a full pass performs no updates, indicating the values have converged to the optimum.
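As a quick usage check on a small sample graph, the two-edge path 0 -> 1 -> 2 with probability 0.5 * 0.5 = 0.25 beats the direct edge 0 -> 2 with probability 0.2:

sol = Solution()
edges = [[0, 1], [1, 2], [0, 2]]
succProb = [0.5, 0.5, 0.2]
print(sol.maxProbability(3, edges, succProb, 0, 2))   # expected output: approximately 0.25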