Network Delay Time
A. Problem Statement
There are N network nodes, labelled 1 to N.
Given times, a list of travel times as directed edges times[i] = (u, v, w), where u is the source node, v is the target node, and w is the time it takes for a signal to travel from source to target.
Now, we send a signal from a certain node K. How long will it take for all nodes to receive the signal? If it is impossible, return -1.
Note:
1. N will be in the range [1, 100].
2. K will be in the range [1, N].
3. The length of times will be in the range [1, 6000].
4. All edges times[i] = (u, v, w) will have 1 <= u, v <= N and 1 <= w <= 100.
B. Approach
This is clearly a single-source shortest-path problem. Here I use the Bellman-Ford algorithm: run N-1 rounds, relaxing every edge in each round, so the dist values converge to the shortest distances from K. The answer is the maximum of those distances, or -1 if some node is still unreachable.
C. Implementation
class Solution {
public:
    int networkDelayTime(vector<vector<int>>& times, int N, int K) {
        // Any real path has at most N-1 <= 99 edges of weight <= 100,
        // so 100 * 100 = 10000 safely serves as "infinity".
        int max_Distance = 100 * 100;
        vector<int> dist(N, max_Distance);
        dist[K - 1] = 0;  // nodes are 1-indexed in the input
        // Bellman-Ford: N-1 rounds of relaxing every edge.
        for (int i = 1; i < N; i++)
        {
            for (auto& time : times)
            {
                int u = time[0] - 1, v = time[1] - 1, w = time[2];
                dist[v] = min(dist[v], dist[u] + w);
            }
        }
        // The answer is the slowest node to receive the signal;
        // a node still at max_Distance is unreachable.
        int temp = *max_element(dist.begin(), dist.end());
        return temp == max_Distance ? -1 : temp;
    }
};