Dijkstra's algorithm

{{Use dmy dates|date=December 2019}}
{{Infobox algorithm|class=[[Search algorithm]]<br>[[Greedy algorithm]]<br>[[Dynamic programming]]<ref>Controversial, see {{cite journal|author1=Moshe Sniedovich|title=Dijkstra's algorithm revisited: the dynamic programming connexion|journal=Control and Cybernetics|date=2006|volume=35|pages=599–620|url=https://www.infona.pl/resource/bwmeta1.element.baztech-article-BAT5-0013-0005/tab/summary}} and [[#Dynamic programming perspective|below part]].</ref>|image=Dijkstra Animation.gif|caption=Dijkstra's algorithm to find the shortest path between ''a'' and ''b''. It picks the unvisited vertex with the lowest distance, calculates the distance through it to each unvisited neighbor, and updates the neighbor's distance if smaller. Mark visited (set to red) when done with neighbors.|data=[[Graph (data structure)|Graph]]<br>Usually used with [[priority queue]] or [[Heap (data structure)|heap]] for optimization{{sfn|Cormen|Leiserson|Rivest|Stein|2001}}{{sfn|Fredman|Tarjan|1987}}|time=<math>\Theta(|E| + |V| \log|V|)</math>{{sfn|Fredman|Tarjan|1987}}|best-time=|average-time=|space=|optimal=|complete=}}
'''Dijkstra's algorithm''' ({{IPAc-en|ˈ|d|aɪ|k|s|t|r|ə|z}} {{respell|DYKE|strəz}}) is an [[algorithm]] for finding the [[Shortest path problem|shortest paths]] between [[Vertex (graph theory)|nodes]] in a weighted [[Graph (abstract data type)|graph]], which may represent, for example, a [[road network]]. It was conceived by [[computer scientist]] [[Edsger W. Dijkstra]] in 1956 and published three years later.<ref>{{cite web |last=Richards |first=Hamilton |title=Edsger Wybe Dijkstra |url=http://amturing.acm.org/award_winners/dijkstra_1053701.cfm |access-date=16 October 2017 |website=A.M. Turing Award |publisher=Association for Computing Machinery |quote=At the Mathematical Centre a major project was building the ARMAC computer. For its official inauguration in 1956, Dijkstra devised a program to solve a problem interesting to a nontechnical audience: Given a network of roads connecting cities, what is the shortest route between two designated cities?}}</ref><ref name="Dijkstra Interview2">{{cite journal |last=Frana |first=Phil |date=August 2010 |title=An Interview with Edsger W. Dijkstra |journal=Communications of the ACM |volume=53 |issue=8 |pages=41–47 |doi=10.1145/1787234.1787249 |s2cid=27009702 |doi-access=}}</ref><ref name="Dijkstra19592">{{cite journal |last1=Dijkstra |first1=E. W. |author-link=Edsger W. Dijkstra |year=1959 |title=A note on two problems in connexion with graphs |url=https://ir.cwi.nl/pub/9256/9256D.pdf |journal=Numerische Mathematik |volume=1 |pages=269–271 |citeseerx=10.1.1.165.7577 |doi=10.1007/BF01386390 |s2cid=123284777}}</ref>
Dijkstra's algorithm finds the shortest path from a given source node to every other node.<ref name="mehlhorn">{{cite book |last1=Mehlhorn |first1=Kurt |author1-link=Kurt Mehlhorn |title=Algorithms and Data Structures: The Basic Toolbox |last2=Sanders |first2=Peter |author2-link=Peter Sanders (computer scientist) |publisher=Springer |year=2008 |isbn=978-3-540-77977-3 |chapter=Chapter 10. Shortest Paths |doi=10.1007/978-3-540-77978-0 |chapter-url=http://people.mpi-inf.mpg.de/~mehlhorn/ftp/Toolbox/ShortestPaths.pdf}}</ref>{{rp|pages=196–206}} It can be used to find the shortest path to a specific destination node, by terminating the algorithm after determining the shortest path to the destination node. For example, if the nodes of the graph represent cities, and the costs of edges represent the average distances between pairs of cities connected by a direct road, then Dijkstra's algorithm can be used to find the shortest route between one city and all other cities. A common application of shortest path algorithms is network [[Routing protocol|routing protocol]]s, most notably [[IS-IS]] (Intermediate System to Intermediate System) and [[Open Shortest Path First|OSPF]] (Open Shortest Path First). It is also employed as a [[subroutine]] in algorithms such as [[Johnson's algorithm]].
 
The algorithm uses a [[Priority queue#Min-priority queue|min-priority queue]] [[data structure]] for selecting the shortest paths known so far. Before more advanced priority queue structures were discovered, Dijkstra's original algorithm ran in <math>\Theta(|V|^2)</math> [[Time complexity|time]], where <math>|V|</math> is the number of nodes.<ref>{{Cite book |last=Schrijver |first=Alexander |title=Optimization Stories |chapter=On the history of the shortest path problem |date=2012 |chapter-url=http://ftp.gwdg.de/pub/misc/EMIS/journals/DMJDMV/vol-ismp/32_schrijver-alexander-sp.pdf |series=Documenta Mathematica Series |volume=6 |pages=155–167 |doi=10.4171/dms/6/19 |doi-access=free |isbn=978-3-936609-58-5}}</ref>{{sfn|Leyzorek|Gray|Johnson|Ladew|1957}} {{harvnb|Fredman|Tarjan|1984}} proposed a [[Fibonacci heap]] priority queue to optimize the running time complexity to <math>\Theta(|E|+|V|\log|V|)</math>. This is [[Asymptotic computational complexity|asymptotically]] the fastest known single-source [[Shortest path problem|shortest-path algorithm]] for arbitrary [[directed graph|directed graph]]s with unbounded non-negative weights. However, specialized cases (such as bounded/integer weights, directed acyclic graphs etc.) can be [[Dijkstra's algorithm#Specialized variants|improved further]]. If preprocessing is allowed, algorithms such as [[Contraction hierarchy|contraction hierarchies]] can be up to seven [[orders of magnitude]] faster.
 
Dijkstra's algorithm is commonly used on graphs where the edge weights are positive integers or real numbers. It can be generalized to any graph where the edge weights are [[Partially ordered set|partially ordered]], provided the subsequent labels (a subsequent label is produced when traversing an edge) are [[Monotonic function|monotonically]] non-decreasing.<ref name="Generic Dijkstra2">{{cite journal |last1=Szcześniak |first1=Ireneusz |last2=Jajszczyk |first2=Andrzej |last3=Woźna-Szcześniak |first3=Bożena |year=2019 |title=Generic Dijkstra for optical networks |journal=Journal of Optical Communications and Networking |volume=11 |issue=11 |pages=568–577 |arxiv=1810.04481 |doi=10.1364/JOCN.11.000568 |s2cid=52958911}}</ref><ref name="Generic Dijkstra correctness2">{{citation |last1=Szcześniak |first1=Ireneusz |title=NOMS 2023-2023 IEEE/IFIP Network Operations and Management Symposium |pages=1–7 |year=2023 |chapter=Generic Dijkstra: Correctness and tractability |arxiv=2204.13547 |doi=10.1109/NOMS56928.2023.10154322 |isbn=978-1-6654-7716-1 |s2cid=248427020 |last2=Woźna-Szcześniak |first2=Bożena}}</ref>
 
== Algorithm ==
[[File:Dijkstras_progress_animation.gif|thumb|Illustration of Dijkstra's algorithm finding a path from a start node (lower left, red) to a target node (upper right, green) in a [[Robotics|robot]] [[motion planning]] problem. Open nodes represent the "tentative" set (aka set of "unvisited" nodes). Filled nodes are the visited ones, with color representing the distance: the redder, the closer (to the start node). Nodes in all the different directions are explored uniformly, appearing more-or-less as a circular [[wavefront]] as Dijkstra's algorithm uses a [[Consistent heuristic|heuristic]] of picking the shortest known path so far.]]
The algorithm requires a starting node, and computes the shortest distance from that starting node to each other node. Dijkstra's algorithm starts with infinite distances and tries to improve them step by step:
 
# Create a [[Set (abstract data type)|set]] of all unvisited nodes: the unvisited set.
# From the unvisited set, select the current node to be the one with the smallest (finite) distance; initially, this is the starting node (distance zero). If the unvisited set is empty, or contains only nodes with infinite distance (which are unreachable), then the algorithm terminates by skipping to step 6. If the only concern is the path to a target node, the algorithm terminates once the current node is the target node. Otherwise, the algorithm continues.
# For the current node, consider all of its unvisited neighbors and update their distances through the current node; compare the newly calculated distance to the one currently assigned to the neighbor and assign the smaller one to it. For example, if the current node ''A'' is marked with a distance of 6, and the edge connecting it with its neighbor ''B'' has length 2, then the distance to ''B'' through ''A'' is 6 + 2 = 8. If B was previously marked with a distance greater than 8, then update it to 8 (the path to B through A is shorter). Otherwise, keep its current distance (the path to B through A is not the shortest).
# After considering all of the current node's unvisited neighbors, the current node is removed from the unvisited set. Thus a visited node is never rechecked, which is correct because the distance recorded on the current node is minimal (as ensured in step 3), and thus final. Repeat from step 3.
# Once the loop exits (steps 3–5), every visited node contains its shortest distance from the starting node.
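The numbered steps above can be sketched in executable form. The following Python function is an illustrative translation only (the adjacency-dictionary representation and the function name are choices made here, not part of the original formulation):

```python
import math

def dijkstra_stepwise(graph, start):
    """Follow the numbered steps; graph maps node -> {neighbor: weight}."""
    # Steps 1-2: every node is unvisited and starts at infinite distance,
    # except the starting node, which starts at distance zero.
    dist = {v: math.inf for v in graph}
    dist[start] = 0
    unvisited = set(graph)

    while unvisited:
        # Step 3: pick the unvisited node with the smallest finite distance.
        current = min(unvisited, key=dist.get)
        if dist[current] == math.inf:
            break  # remaining nodes are unreachable
        # Step 4: relax every unvisited neighbor through the current node.
        for neighbor, weight in graph[current].items():
            if neighbor in unvisited:
                dist[neighbor] = min(dist[neighbor], dist[current] + weight)
        # Step 5: mark the current node visited; it is never rechecked.
        unvisited.remove(current)

    # Step 6: dist now holds the shortest distance to every reachable node.
    return dist
```

On the six-node example used in many textbook illustrations, the function returns the expected distances; selecting the minimum by scanning the whole unvisited set corresponds to the simple <math>\Theta(|V|^2)</math> implementation discussed below.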
 
In the following [[pseudocode]], {{mono|dist}} is an array that contains the current distances from the {{mono|<var>source</var>}} to other vertices, i.e. {{mono|dist[<var>u</var>]}} is the current distance from the source to the vertex {{mono|<var>u</var>}}. The {{mono|prev}} array contains pointers to previous-hop nodes on the shortest path from source to the given vertex (equivalently, it is the ''next-hop'' on the path ''from'' the given vertex ''to'' the source). The code {{mono|u ← vertex in ''Q'' with min dist[u]}} searches for the vertex {{mono|<var>u</var>}} in the vertex set {{mono|<var>Q</var>}} that has the least {{mono|dist[<var>u</var>]}} value. {{mono|Graph.Edges(<var>u</var>, <var>v</var>)}} returns the length of the edge joining (i.e. the distance between) the two neighbor-nodes {{mono|<var>u</var>}} and {{mono|<var>v</var>}}. The variable {{mono|<var>alt</var>}} on line 14 is the length of the path from the {{mono|<var>source</var>}} node to the neighbor node {{mono|<var>v</var>}} if it were to go through {{mono|<var>u</var>}}. If this path is shorter than the current shortest path recorded for {{mono|<var>v</var>}}, then the distance of {{mono|<var>v</var>}} is updated to {{mono|<var>alt</var>}}.<ref name=" mehlhorn" />
[[File:DijkstraDemo.gif|thumb|A demo of Dijkstra's algorithm based on Euclidean distance. Red lines are the shortest path covering, i.e., connecting ''u'' and prev[''u'']. Blue lines indicate where relaxing happens, i.e., connecting ''v'' with a node ''u'' in ''Q'', which gives a shorter path from the source to ''v''.]]
 
1 '''function''' Dijkstra(''Graph'', ''source''):
2
3 '''for each''' vertex ''v'' in ''Graph.Vertices'':
4 dist[''v''] ← INFINITY
5 prev[''v''] ← UNDEFINED
6 add ''v'' to ''Q''
7 dist[''source''] ← 0
8
9 '''while''' ''Q'' is not empty:
10 ''u'' ← vertex in ''Q'' with minimum dist[u]
11          ''Q''.remove(''u'')
12
13          '''for each''' arc (''u'', ''v'') with ''v'' still in ''Q'':
14 ''alt'' ← dist[''u''] + Graph.Edges(''u'', ''v'')
15 '''if''' ''alt'' < dist[''v'']:
16 dist[''v''] ← ''alt''
17 prev[''v''] ← ''u''
18
19 '''return''' dist[], prev[]
 
To find the shortest path between vertices {{mono|<var>source</var>}} and {{mono|<var>target</var>}}, the search terminates after line 10 if {{mono|<var>u</var> {{=}} <var>target</var>}}. The shortest path from {{mono|<var>source</var>}} to {{mono|<var>target</var>}} can be obtained by reverse iteration:
 
1 ''S'' ← empty sequence
2 ''u'' ← ''target''
3   '''if''' prev[''u''] is defined '''or''' ''u'' = ''source'': ''// Proceed if the vertex is reachable''
4 '''while''' ''u'' is defined: ''// Construct the shortest path with a stack S''
5           ''S''.push(''u'') ''// Push the vertex onto the stack''
6 ''u'' ← prev[''u''] ''// Traverse from target to source''
 
Now sequence {{mono|<var>S</var>}} is the list of vertices constituting one of the shortest paths from {{mono|<var>source</var>}} to {{mono|<var>target</var>}}, or the empty sequence if no path exists.
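The pseudocode above translates almost line for line into Python. The sketch below is illustrative only; {{mono|Graph.Edges(''u'', ''v'')}} is represented here by an assumed nested dictionary of edge lengths, which is one choice among many:

```python
import math

def dijkstra(graph, source):
    """graph: dict mapping vertex -> dict of neighbor -> edge length."""
    dist = {v: math.inf for v in graph}   # dist[v]: best distance found so far
    prev = {v: None for v in graph}       # prev[v]: previous hop on best path
    dist[source] = 0
    Q = set(graph)                        # the unvisited vertex set

    while Q:
        u = min(Q, key=dist.get)          # vertex in Q with minimum dist[u]
        Q.remove(u)
        for v, length in graph[u].items():
            if v in Q:
                alt = dist[u] + length    # candidate path through u
                if alt < dist[v]:
                    dist[v] = alt
                    prev[v] = u
    return dist, prev

def shortest_path(prev, source, target):
    """Reverse iteration over prev[], as in the second pseudocode block."""
    S, u = [], target
    if prev[u] is not None or u == source:  # proceed only if reachable
        while u is not None:
            S.insert(0, u)                  # push u onto the front of S
            u = prev[u]
    return S
```

For example, on the classic six-node example graph, {{mono|shortest_path}} reconstructs the route from the {{mono|prev}} pointers after a single run of {{mono|dijkstra}}.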
 
===Using a priority queue===
A min-priority queue is an abstract data type that provides 3 basic operations: {{mono|add_with_priority()}}, {{mono|decrease_priority()}} and {{mono|extract_min()}}. As mentioned earlier, using such a data structure can lead to faster computing times than using a basic queue. Notably, [[Fibonacci heap]]{{sfn|Fredman|Tarjan|1984}} or [[Brodal queue]] offer optimal implementations for those 3 operations. As the algorithm is slightly different in appearance, it is mentioned here, in pseudocode as well:
 
1 '''function''' Dijkstra(''Graph'', ''source''):
2      create vertex priority queue ''Q''
3
4 dist[''source''] ← 0 ''// Initialization''
5 ''Q''.add_with_priority(''source'', 0) ''// associated priority equals dist[·]''
6
7 '''for each''' vertex ''v'' in ''Graph.Vertices'':
8 '''if''' ''v'' ≠ ''source''
9 prev[''v''] ← UNDEFINED ''// Predecessor of v''
10 dist[''v''] ← INFINITY ''// Unknown distance from source to v''
11 Q.add_with_priority(v, INFINITY)
12
13
14 '''while''' ''Q'' is not empty: ''// The main loop''
15 ''u'' ← ''Q''.extract_min() ''// Remove and return best vertex''
16          '''for each''' arc (''u'', ''v''): ''// Go through all v neighbors of u''
17 ''alt'' ← dist[''u''] + Graph.Edges(''u'', ''v'')
18 '''if''' ''alt'' < dist[''v'']:
19 prev[''v''] ← ''u''
20 dist[''v''] ← ''alt''
21 ''Q''.decrease_priority(''v'', ''alt'')
22
23 '''return''' (dist, prev)
 
Instead of filling the priority queue with all nodes in the initialization phase, it is possible to initialize it to contain only ''source''; then, inside the <code>'''if''' ''alt'' < dist[''v'']</code> block, the {{mono|decrease_priority()}} becomes an {{mono|add_with_priority()}} operation.<ref name=" mehlhorn" />{{rp|198}}
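Since many standard library heaps (such as Python's {{mono|heapq}}) provide no {{mono|decrease_priority()}} operation, the source-only initialization described above is often combined with ''lazy deletion'': a new entry is pushed instead of decreasing the old one, and stale entries are skipped when extracted. A minimal sketch, assuming the same adjacency-dictionary graph representation as above:

```python
import heapq
import math

def dijkstra_heap(graph, source):
    """graph: dict mapping vertex -> dict of neighbor -> edge length."""
    dist = {v: math.inf for v in graph}
    dist[source] = 0
    prev = {v: None for v in graph}
    pq = [(0, source)]                 # queue initialized with source only

    while pq:
        d, u = heapq.heappop(pq)       # extract_min()
        if d > dist[u]:
            continue                   # stale entry: u was already settled
        for v, length in graph[u].items():
            alt = d + length
            if alt < dist[v]:
                dist[v] = alt
                prev[v] = u
                # add_with_priority() in place of decrease_priority():
                heapq.heappush(pq, (alt, v))
    return dist, prev
```

Each vertex may appear several times in the heap, but only its cheapest entry is ever processed, so correctness is preserved at the cost of a somewhat larger queue.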
 
 
* In the former case, let {{mono|w}} be the first unvisited node on this shorter path. By induction, the shortest paths from {{mono|source}} to {{mono|u}} and {{mono|w}} through visited nodes only have costs {{code|dist[u]}} and {{code|dist[w]}} respectively. This means the cost of going from {{mono|source}} to {{mono|u}} via {{mono|w}} is at least {{code|dist[w]}} + the minimal cost of going from {{mono|w}} to {{mono|u}}. As the edge costs are positive, the minimal cost of going from {{mono|w}} to {{mono|u}} is a positive number. However, {{code|dist[u]}} is at most {{code|dist[w]}} because otherwise {{mono|w}} would have been picked by the priority queue instead of {{mono|u}}. This is a contradiction, since it has already been established that {{code|dist[w]}} + a positive number < {{code|dist[u]}}.
 
* In the latter case, let {{mono|w}} be the last but one node on the shortest path. That means {{code|dist[w] + Graph.Edges[w,u] < dist[u]}}. That is a contradiction because by the time {{mono|w}} is visited, it should have set {{code|dist[u]}} to at most {{code|dist[w] + Graph.Edges[w,u]}}.
 
Bounds of the running time of Dijkstra's algorithm on a graph with edges ''{{mvar|E}}'' and vertices ''{{mvar|V}}'' can be expressed as a function of the number of edges, denoted <math>|E|</math>, and the number of vertices, denoted <math>|V|</math>, using [[big-O notation]]. The complexity bound depends mainly on the data structure used to represent the set ''{{mvar|Q}}''. In the following, upper bounds can be simplified because <math>|E|</math> is <math>O(|V|^2)</math> for any simple graph, but that simplification disregards the fact that in some problems, other upper bounds on <math>|E|</math> may hold.
 
For any data structure for the vertex set ''{{mvar|Q}}'', the running time is:{{sfn|Cormen|Leiserson|Rivest|Stein|2001}}
 
: <math>\Theta(|E| \cdot T_\mathrm{dk} + |V| \cdot T_\mathrm{em}),</math>
The simplest version of Dijkstra's algorithm stores the vertex set ''{{mvar|Q}}'' as a linked list or array, and edges as an [[adjacency list]] or [[Adjacency matrix|matrix]]. In this case, extract-minimum is simply a linear search through all vertices in ''{{mvar|Q}}'', so the running time is <math>\Theta(|E| + |V|^2) = \Theta(|V|^2)</math>.
 
For [[sparse graph|sparse graph]]s, that is, graphs with far fewer than <math>|V|^2</math> edges, Dijkstra's algorithm can be implemented more efficiently by storing the graph in the form of adjacency lists and using a [[self-balancing binary search tree]], [[binary heap]], [[pairing heap]], [[Fibonacci heap]] or a priority heap as a [[priority queue]] to implement extracting minimum efficiently. To perform decrease-key steps in a binary heap efficiently, it is necessary to use an auxiliary data structure that maps each vertex to its position in the heap, and to update this structure as the priority queue ''{{mvar|Q}}'' changes. With a self-balancing binary search tree or binary heap, the algorithm requires
 
: <math>\Theta((|E| + |V|) \log |V|)</math>
Moreover, not inserting all nodes in a graph makes it possible to extend the algorithm to find the shortest path from a single source to the closest of a set of target nodes on infinite graphs or those too large to represent in memory. The resulting algorithm is called ''uniform-cost search'' (UCS) in the artificial intelligence literature{{r|felner}}<ref name="aima">{{Cite AIMA|3|pages=75, 81}}</ref><ref>Sometimes also ''least-cost-first search'': {{cite journal |last=Nau |first=Dana S. |year=1983 |title=Expert computer systems |url=https://www.cs.umd.edu/~nau/papers/nau1983expert.pdf |journal=Computer |publisher=IEEE |volume=16 |issue=2 |pages=63–85 |doi=10.1109/mc.1983.1654302 |s2cid=7301753}}</ref> and can be expressed in pseudocode as<!--NOTE TO EDITORS: Don't get confused by "uniform" in the name and remove costs from the pseudocode. UCS includes costs and is different from breadth-first search.-->
'''procedure''' uniform_cost_search(start) '''is'''
node ← start
frontier ← priority queue containing node only
expanded ← empty set
'''do'''
'''if''' frontier is empty '''then'''
'''return''' failure
node ← frontier.pop()
'''if''' node is a goal state '''then'''
'''return''' solution(node)
expanded.add(node)
'''for each''' of node's neighbors ''n'' '''do'''
'''if''' ''n'' is not in expanded and not in frontier '''then'''
frontier.add(''n'')
            '''else if''' ''n'' is in frontier with higher cost '''then'''
replace existing node with ''n''
 
Its complexity can be expressed in an alternative way for very large graphs: when {{math|''C''<sup>*</sup>}} is the length of the shortest path from the start node to any node satisfying the "goal" predicate, each edge has cost at least ''{{mvar|ε}}'', and the number of neighbors per node is bounded by ''{{mvar|b}}'', then the algorithm's worst-case time and space complexity are both in {{math|''O''(''b''<sup>1+⌊''C''<sup>*</sup> {{frac}} ''ε''⌋</sup>)}}.{{r|aima}}
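The uniform-cost search above can be sketched in Python with a heap-based frontier. This is an illustrative adaptation, not the canonical formulation: {{mono|neighbors}} and {{mono|is_goal}} are assumed callables (so the graph may be implicit and arbitrarily large), lazy deletion replaces the in-frontier replacement step, and the function returns the cost of the cheapest goal for brevity:

```python
import heapq

def uniform_cost_search(start, is_goal, neighbors):
    """neighbors(node) yields (successor, edge_cost); is_goal is the goal predicate."""
    frontier = [(0, start)]          # priority queue ordered by path cost
    best = {start: 0}                # cheapest known cost to each seen node
    expanded = set()

    while frontier:
        cost, node = heapq.heappop(frontier)
        if node in expanded:
            continue                 # stale frontier entry
        if is_goal(node):
            return cost              # first goal popped is the cheapest
        expanded.add(node)
        for n, step in neighbors(node):
            new_cost = cost + step
            if n not in best or new_cost < best[n]:
                best[n] = new_cost   # stands in for replacing the costlier entry
                heapq.heappush(frontier, (new_cost, n))
    return None                      # failure: no goal is reachable
```

Because the goal test is applied when a node is ''extracted'' rather than when it is generated, the first goal returned is guaranteed to be reached by a cheapest path, which is exactly what distinguishes UCS from breadth-first search on weighted graphs.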
 
Further optimizations for the single-target case include [[Bidirectional search|bidirectional]] variants, goal-directed variants such as the [[A* algorithm]] (see {{slink||Related problems and algorithms}}), graph pruning to determine which nodes are likely to form the middle segment of shortest paths (reach-based routing), and hierarchical decompositions of the input graph that reduce {{math|''s''–''t''}} routing to connecting ''{{mvar|s}}'' and ''{{mvar|t}}'' to their respective "[[Transit Node Routing|transit nodes]]" followed by shortest-path computation between these transit nodes using a "highway".<ref name="speedup2">{{cite conference |last1=Wagner |first1=Dorothea |last2=Willhalm |first2=Thomas |year=2007 |title=Speed-up techniques for shortest-path computations |conference=STACS |pages=23–36}}</ref> Combinations of such techniques may be needed for optimal practical performance on specific problems.<ref>{{cite journal |last1=Bauer |first1=Reinhard |last2=Delling |first2=Daniel |last3=Sanders |first3=Peter |last4=Schieferdecker |first4=Dennis |last5=Schultes |first5=Dominik |last6=Wagner |first6=Dorothea |year=2010 |title=Combining hierarchical and goal-directed speed-up techniques for Dijkstra's algorithm |url=https://publikationen.bibliothek.kit.edu/1000014952 |journal=ACM Journal of Experimental Algorithmics |volume=15 |pages=2.1 |doi=10.1145/1671970.1671976 |s2cid=1661292}}</ref>
 
=== Optimality for comparison-sorting by distance ===
As well as simply computing distances and paths, Dijkstra's algorithm can be used to sort vertices by their distances from a given starting vertex.
In 2023, Haeupler, Rozhoň, Tětek, Hladík, and [[Robert Tarjan|Tarjan]] (one of the inventors of the 1984 heap), proved that, for this sorting problem on a positively-weighted directed graph, a version of Dijkstra's algorithm with a special heap data structure has a runtime and number of comparisons that is within a constant factor of optimal among [[Comparison sort|comparison-based]] algorithms for the same sorting problem on the same graph and starting vertex but with variable edge weights. To achieve this, they use a comparison-based heap whose cost of returning/removing the minimum element from the heap is logarithmic in the number of elements inserted after it rather than in the number of elements in the heap.<ref>{{cite arXiv |last1=Haeupler |first1=Bernhard |last2=Hladík |first2=Richard |last3=Rozhoň |first3=Václav |last4=Tarjan |first4=Robert |last5=Tětek |first5=Jakub |title=Universal Optimality of Dijkstra via Beyond-Worst-Case Heaps |date=2024-10-28 |eprint=2311.11793 |class=cs.DS}}</ref><ref>{{Cite web |last=Brubaker |first=Ben |date=2024-10-25 |title=Computer Scientists Establish the Best Way to Traverse a Graph |url=https://www.quantamagazine.org/computer-scientists-establish-the-best-way-to-traverse-a-graph-20241025/ |access-date=2024-12-09 |website=Quanta Magazine |language=en}}</ref>
 
===Specialized variants===
When arc weights are small integers (bounded by a parameter <math>C</math>), specialized queues can be used for increased speed. The first algorithm of this type was Dial's algorithm{{Sfn|Dial|1969}} for graphs with positive integer edge weights, which uses a [[bucket queue]] to obtain a running time <math>O(|E|+|V|C)</math>. The use of a [[Van Emde Boas tree]] as the priority queue brings the complexity to <math>O(|E|\log\log C)</math>.{{sfn|Ahuja|Mehlhorn|Orlin|Tarjan|1990}} Another interesting variant based on a combination of a new [[radix heap]] and the well-known Fibonacci heap runs in time <math>O(|E|+|V|\sqrt{\log C})</math>.{{sfn|Ahuja|Mehlhorn|Orlin|Tarjan|1990}} Finally, the best algorithms in this special case run in <math>O(|E|\log\log|V|)</math>{{sfn|Thorup|2000}} time and <math>O(|E| + |V|\min\{(\log|V|)^{1/3+\varepsilon}, (\log C)^{1/4+\varepsilon}\})</math> time.{{sfn|Raman|1997}}
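The idea behind Dial's algorithm can be sketched with a plain list of buckets: since every tentative distance is a small integer, one bucket per possible distance value suffices, and extract-min becomes a forward scan over the buckets. This is a simplified illustration (a generously sized flat array rather than the circular array of an optimized implementation, with stale entries skipped lazily):

```python
import math

def dial(graph, source, C):
    """graph: dict vertex -> dict of neighbor -> positive integer weight <= C."""
    n = len(graph)
    dist = {v: math.inf for v in graph}
    dist[source] = 0
    # Final distances are at most (n - 1) * C; tentative candidates can be
    # one edge longer, so n * C buckets are always enough.
    buckets = [[] for _ in range(n * C + 1)]
    buckets[0].append(source)

    for d, bucket in enumerate(buckets):   # scan buckets in distance order
        for u in bucket:
            if d > dist[u]:
                continue                   # stale entry: u already settled
            for v, w in graph[u].items():
                alt = d + w
                if alt < dist[v]:
                    dist[v] = alt
                    buckets[alt].append(v)  # lazily "move" v to its new bucket
    return dist
```

Since every weight is at least 1, relaxations only ever append to strictly later buckets, so the forward scan visits each vertex at its final distance first; this is what yields the <math>O(|E|+|V|C)</math> bound.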
 
== Related problems and algorithms ==
Dijkstra's original algorithm can be extended with modifications. For example, sometimes it is desirable to present solutions which are less than mathematically optimal. To obtain a ranked list of less-than-optimal solutions, the optimal solution is first calculated. A single edge appearing in the optimal solution is removed from the graph, and the optimum solution to this new graph is calculated. Each edge of the original solution is suppressed in turn and a new shortest-path calculated. The secondary solutions are then ranked and presented after the first optimal solution.
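The edge-suppression scheme just described can be sketched on top of any shortest-path routine. In the illustration below, {{mono|shortest}} is an assumed helper using the simple linear-scan Dijkstra and returning a (cost, path) pair; the names and the ranking format are choices made here:

```python
import math

def shortest(graph, source, target):
    """Plain Dijkstra returning (cost, path); graph: dict u -> dict v -> weight."""
    dist = {v: math.inf for v in graph}
    prev = {v: None for v in graph}
    dist[source] = 0
    Q = set(graph)
    while Q:
        u = min(Q, key=dist.get)
        Q.remove(u)
        for v, w in graph[u].items():
            if v in Q and dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
                prev[v] = u
    if dist[target] == math.inf:
        return math.inf, []
    path, u = [], target
    while u is not None:
        path.insert(0, u)
        u = prev[u]
    return dist[target], path

def ranked_paths(graph, source, target):
    """Rank alternatives by suppressing each edge of the optimal path in turn."""
    best_cost, best_path = shortest(graph, source, target)
    alternatives = []
    for u, v in zip(best_path, best_path[1:]):
        pruned = {x: dict(nbrs) for x, nbrs in graph.items()}
        del pruned[u][v]                      # suppress one optimal edge
        cost, path = shortest(pruned, source, target)
        if path:
            alternatives.append((cost, path))
    alternatives.sort()                       # rank the secondary solutions
    return [(best_cost, best_path)] + alternatives
```

Note that this simple scheme may produce duplicate alternatives and is not equivalent to a full [[k shortest path routing|''k'' shortest paths]] algorithm; it merely ranks the detours obtained by removing one optimal edge at a time.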
 
Dijkstra's algorithm is usually the working principle behind [[link-state routing protocol|link-state routing protocol]]s. [[OSPF]] and [[IS-IS]] are the most common.
 
Unlike Dijkstra's algorithm, the [[Bellman–Ford algorithm]] can be used on graphs with negative edge weights, as long as the graph contains no [[negative cycle]] reachable from the source vertex ''s''. The presence of such cycles means that no shortest path can be found, since the label becomes lower each time the cycle is traversed. (This statement assumes that a "path" is allowed to repeat vertices. In [[graph theory]] that is normally not allowed. In [[theoretical computer science]] it often is allowed.) It is possible to adapt Dijkstra's algorithm to handle negative weights by combining it with the Bellman–Ford algorithm (to remove negative edges and detect negative cycles): [[Johnson's algorithm]].
 
In fact, Dijkstra's explanation of the logic behind the algorithm:{{sfn|Dijkstra|1959|p=270}}
{{blockquote|'''Problem 2.''' Find the path of minimum total length between two given nodes {{mvar|P}} and {{mvar|Q}}.
 
We use the fact that, if {{mvar|R}} is a node on the minimal path from {{mvar|P}} to {{mvar|Q}}, knowledge of the latter implies the knowledge of the minimal path from {{mvar|P}} to {{mvar|R}}.}}
is a paraphrasing of [[Richard Bellman|Bellman's]] [[Bellman equation#Bellman's principle of optimality|principle of optimality]] in the context of the shortest path problem.
 
== See also ==
* [http://blog.cleancoder.com/uncle-bob/2016/10/26/DijkstrasAlg.html Implementation of Dijkstra's algorithm using TDD], [[Robert Cecil Martin]], The Clean Code Blog
{{Edsger Dijkstra}}{{Graph traversal algorithms}}{{Optimization algorithms|combinatorial|state=autocollapse}}
 
[[Category:Edsger W. Dijkstra|Algorithm]]
[[Category:1959 in computing]]
[[Category:Graph algorithms]]
[[Category:Search algorithms]]
[[Category:Greedy algorithms]]
[[Category:Routing algorithms]]
[[Category:Combinatorial optimization]]