Network Delay Time
You are given a network of n nodes, labeled from 1 to n. You are also given times, a list of travel times as directed edges times[i] = (ui, vi, wi), where ui is the source node, vi is the target node, and wi is the time it takes for a signal to travel from source to target.

We will send a signal from a given node k. Return the minimum time it takes for all the n nodes to receive the signal. If it is impossible for all the n nodes to receive the signal, return -1.
Constraints:
1 <= k <= n <= 100
1 <= times.length <= 6000
times[i].length == 3
1 <= ui, vi <= n
ui != vi
0 <= wi <= 100
All the pairs (ui, vi) are unique (i.e., no multiple edges).
This problem can be solved with Dijkstra's algorithm, the classic algorithm for finding shortest paths from a single source in a graph with non-negative edge weights. Starting from node k, we repeatedly settle the unvisited node with the smallest known arrival time, using a min-heap; the answer is the largest arrival time once every node is settled, or -1 if some node is never reached.
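A sketch of this approach in Python (the function name `network_delay_time` and use of `heapq` are my choices; LeetCode's actual method signature differs slightly):

```python
import heapq
from collections import defaultdict

def network_delay_time(times, n, k):
    # Build an adjacency list: node -> list of (neighbor, travel time).
    graph = defaultdict(list)
    for u, v, w in times:
        graph[u].append((v, w))

    dist = {}          # node -> earliest time the signal reaches it
    heap = [(0, k)]    # min-heap of (arrival time, node), seeded with the source

    while heap:
        t, node = heapq.heappop(heap)
        if node in dist:
            continue   # already settled with an earlier arrival time
        dist[node] = t
        for nei, w in graph[node]:
            if nei not in dist:
                heapq.heappush(heap, (t + w, nei))

    # All n nodes reached? The answer is the slowest arrival time.
    return max(dist.values()) if len(dist) == n else -1
```

Because a node is settled the first time it is popped, stale heap entries are simply skipped rather than decreased in place, which keeps the code short at the cost of up to E entries in the heap.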
The time complexity of this solution is O(E log E), where E is the number of edges (or the length of times), because in the worst case we need to iterate over each edge and update the queue. The space complexity is O(N + E), where N is the number of nodes, because we need to store the graph (O(E)) and the distance of each node (O(N)).