[[Image:Shortest path optimal substructure.svg|200px|thumb|'''Figure 1'''. Finding the shortest path using optimal substructure. Numbers represent the length of the path; straight lines indicate single [[Edge (graph theory)|edges]], wavy lines indicate shortest [[Path (graph theory)|paths]], i.e., there might be other vertices that are not shown here.]]
In [[computer science]], a problem is said to have '''optimal substructure''' if an optimal solution can be constructed efficiently from optimal solutions of its subproblems. This property is used to determine whether dynamic programming and greedy algorithms can usefully be applied to a problem.
Typically, a [[greedy algorithm]] is used to solve a problem with optimal substructure if it can be proven by induction that the greedy choice is optimal at each step.
<!-- A special case of optimal substructure is the case where a subproblem S<sub>ab</sub> has an activity P<sub>y</sub>, then it should contain optimal solutions to subproblems S<sub>ay</sub> and S<sub>yb</sub>. --> <!-- *TODO: Add Recursion, misc. -->
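The shortest-path setting of Figure 1 can be made concrete with a minimal dynamic-programming sketch. The graph, vertex names, and function below are hypothetical and chosen only for illustration: a shortest path from a vertex consists of one outgoing edge followed by a shortest path from that edge's endpoint, so the optimal value at each vertex is assembled from optimal values of its subproblems.

<syntaxhighlight lang="python">
from functools import lru_cache

# A tiny weighted directed acyclic graph: GRAPH[u] lists (successor, edge length).
# The graph and vertex names are hypothetical, chosen only to illustrate the idea.
GRAPH = {
    "s": [("a", 2), ("b", 5)],
    "a": [("b", 1), ("t", 7)],
    "b": [("t", 3)],
    "t": [],
}

@lru_cache(maxsize=None)
def shortest(u):
    """Length of a shortest path from u to the target vertex "t".

    Optimal substructure: a shortest path from u is one edge (u, v)
    followed by a shortest path from v, so the optimal value at u is
    built from the optimal values of the subproblems at its successors.
    """
    if u == "t":
        return 0
    return min(w + shortest(v) for v, w in GRAPH[u])

print(shortest("s"))  # 2 + 1 + 3 = 6
</syntaxhighlight>

Memoization (here via <code>lru_cache</code>) is what turns this recursive statement of optimal substructure into an efficient dynamic-programming computation when subproblems overlap.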
In the application of [[dynamic programming]] to [[Optimization (mathematics)|mathematical optimization]], [[Richard Bellman]]'s [[Principle of optimality|Principle of Optimality]] is based on the idea that in order to solve a dynamic optimization problem from some starting period ''t'' to some ending period ''T'', one implicitly has to solve subproblems starting from later dates ''s'', where ''t'' < ''s'' < ''T''. This is an example of optimal substructure. The Principle of Optimality is used to derive the [[Bellman equation]], which shows how the value of the problem starting from ''t'' is related to the value of the problem starting from ''s''.
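As an illustrative sketch of this relationship, for a finite-horizon problem with state <math>x_t</math>, control <math>a_t</math>, per-period payoff <math>F</math>, and transition <math>x_{t+1} = g(x_t, a_t)</math> (generic notation assumed here, not drawn from a particular formulation), the Bellman equation takes the form

<math display="block">V(x_t, t) = \max_{a_t} \bigl\{ F(x_t, a_t) + V\bigl(g(x_t, a_t),\, t+1\bigr) \bigr\},</math>

so the value of the problem at period ''t'' is expressed in terms of the value of the subproblem starting one period later, which is exactly the optimal-substructure property.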