== Algorithm ==
[[Image:Douglas-Peucker animated.gif|thumb|right|Simplifying a piecewise linear curve with the Douglas–Peucker algorithm.]]
The starting curve is an ordered set of points or lines and the distance dimension {{math|''ε'' > 0}}.
<!--different implementation: The algorithm uses an array of boolean flags, initially set to not-kept, one for each point.-->
The algorithm [[recursion|recursively]] divides the line. Initially it is given all the points between the first and last point. It automatically marks the first and last point to be kept. It then finds the point that is farthest from the line segment with the first and last points as end points; this point is obviously farthest on the curve from the approximating line segment between the end points. If the point is closer than {{mvar|ε}} to the line segment, then any points not currently marked to be kept can be discarded without the simplified curve being worse than {{mvar|ε}}.

If the point farthest from the line segment is at a distance greater than {{mvar|ε}} from the approximation, then that point must be kept. The algorithm recursively calls itself with the first point and the farthest point and then with the farthest point and the last point, which includes the farthest point being marked as kept.
When the recursion is completed a new output curve can be generated consisting of all and only those points that have been marked as kept.
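The recursive procedure above can be sketched in Python (a minimal zero-based illustration, not a reference implementation; the helper <code>perpendicular_distance</code> and the function name <code>rdp</code> are this sketch's own conventions):

<syntaxhighlight lang="python">
import math

def perpendicular_distance(pt, start, end):
    """Distance from pt to the infinite line through start and end."""
    (x, y), (x1, y1), (x2, y2) = pt, start, end
    dx, dy = x2 - x1, y2 - y1
    length = math.hypot(dx, dy)
    if length == 0.0:
        # Degenerate segment: fall back to the distance between the points.
        return math.hypot(x - x1, y - y1)
    return abs(dy * x - dx * y + x2 * y1 - y2 * x1) / length

def rdp(points, epsilon):
    """Simplify a polyline (a list of (x, y) tuples) recursively."""
    # Find the point farthest from the segment joining the end points.
    dmax, index = 0.0, 0
    for i in range(1, len(points) - 1):
        d = perpendicular_distance(points[i], points[0], points[-1])
        if d > dmax:
            dmax, index = d, i
    if dmax > epsilon:
        # The farthest point must be kept: simplify both halves around it.
        left = rdp(points[:index + 1], epsilon)
        right = rdp(points[index:], epsilon)
        return left[:-1] + right  # drop the split point duplicated in both halves
    # Every intermediate point is within epsilon: keep only the end points.
    return [points[0], points[-1]]
</syntaxhighlight>

For example, <code>rdp([(0, 0), (2, 2), (4, 0)], 1.0)</code> keeps all three points, while <code>rdp([(0, 0), (2, 2), (4, 0)], 3.0)</code> returns only the two end points.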
=== Non-parametric Ramer–Douglas–Peucker ===
The choice of {{mvar|ε}} is usually user-defined. Like most line fitting, polygonal approximation, and dominant point detection methods, it can be made non-parametric by using the error bound due to digitization and quantization as a termination condition.<ref>{{cite journal |last1 = Prasad |first1 = Dilip K. |first2 = Maylor K.H. |last2 = Leung |first3 = Chai |last3 = Quek |first4 = Siu-Yeung |last4 = Cho |title = A novel framework for making dominant point detection methods non-parametric |journal = Image and Vision Computing |year = 2012 |volume = 30 |issue = 11 |pages = 843–859 |doi = 10.1016/j.imavis.2012.06.010 }}</ref>
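The exact error bound of Prasad et al. is more involved; as a simplified sketch of the idea, the termination threshold can be tied to the digitization error of an integer pixel grid, since rounding each coordinate displaces a point by at most half a pixel per axis:

<syntaxhighlight lang="python">
import math

def quantization_epsilon(pixel_size=1.0):
    # Illustrative bound (not the exact one from Prasad et al.): rounding a
    # coordinate to the digitization grid moves a point by at most
    # pixel_size / 2 along each axis, hence at most pixel_size * sqrt(2) / 2
    # in Euclidean distance. Deviations below this bound may be pure
    # quantization noise, so it is a natural parameter-free threshold.
    return pixel_size * math.sqrt(2) / 2
</syntaxhighlight>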
=== Pseudocode ===
(Assuming the input is a one-based array):
<syntaxhighlight lang="text">
# source: https://karthaus.nl/rdp/
function DouglasPeucker(PointList[], epsilon)
    # Find the point with the maximum distance
    dmax = 0
    index = 0
    end = length(PointList)
    for i = 2 to (end - 1) {
        d = perpendicularDistance(PointList[i], Line(PointList[1], PointList[end]))
        if (d > dmax) {
            index = i
            dmax = d
        }
    }

    ResultList[] = empty;

    # If max distance is greater than epsilon, recursively simplify
    if (dmax > epsilon) {
        # Recursive call
        recResults1[] = DouglasPeucker(PointList[1...index], epsilon)
        recResults2[] = DouglasPeucker(PointList[index...end], epsilon)

        # Build the result list
        ResultList[] = {recResults1[1...length(recResults1) - 1], recResults2[1...length(recResults2)]}
    } else {
        ResultList[] = {PointList[1], PointList[end]}
    }
    # Return the result
    return ResultList[]
end
</syntaxhighlight>
== Complexity ==
The running time of this algorithm when run on a polyline consisting of {{math|''n'' – 1}} segments and {{mvar|n}} vertices is given by the recurrence {{math|''T''(''n'') {{=}} ''T''(''i'' + 1) + ''T''(''n'' − ''i'') + [[Big O notation|''O''(''n'')]]}} where {{math|''i'' {{=}} 1, 2,..., ''n'' − 2}} is the value of <code>index</code> in the pseudocode. In the worst case, {{math|''i'' {{=}} 1}} or {{math|''i'' {{=}} ''n'' − 2}} at each recursive invocation and this algorithm has a running time of {{math|[[Big theta|''Θ''(''n''<sup>2</sup>)]]}}. In the best case, {{math|''i'' {{=}} {{sfrac|''n''|2}}}} (or {{math|''i'' {{=}} {{sfrac|''n'' ± 1|2}}}}) at each recursive invocation, in which case the running time has the well-known solution (via the [[master theorem (analysis of algorithms)|master theorem for divide-and-conquer recurrences]]) of {{math|''O''(''n'' log ''n'')}}.
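The two extremes of this recurrence can be checked numerically; in the sketch below the constant hidden in the {{math|''O''(''n'')}} term is taken to be 1, and the helper names are illustrative:

<syntaxhighlight lang="python">
def t_worst(n):
    # i = 1 at every call: T(n) = T(2) + T(n - 1) + n with T(2) = 0,
    # i.e. the sum 3 + 4 + ... + n, which is quadratic in n.
    total = 0
    while n > 2:
        total += n
        n -= 1
    return total

def t_best(n):
    # i = n // 2 at every call: T(n) = T(n//2 + 1) + T(n - n//2) + n,
    # which the master theorem solves as O(n log n).
    if n <= 2:
        return 0
    half = n // 2
    return t_best(half + 1) + t_best(n - half) + n
</syntaxhighlight>

For {{math|''n'' {{=}} 1024}}, the unbalanced split already does more than twenty times as much work as the balanced one, and the gap widens as {{mvar|n}} grows.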
Using (fully or semi-) [[dynamic convex hull]] data structures, the simplification performed by the algorithm can be accomplished in {{math|''O''(''n'' log ''n'')}} time.<ref>{{cite techreport |last1 = Hershberger |first1 = John |first2 = Jack |last2 = Snoeyink |title = Speeding Up the Douglas-Peucker Line-Simplification Algorithm |date = 1992 | url = http://www.bowdoin.edu/~ltoma/teaching/cs350/spring06/Lecture-Handouts/hershberger92speeding.pdf}}</ref>
==Similar algorithms==
* [[Opheim simplification algorithm|Opheim simplification]]
* [[Lang simplification algorithm|Lang simplification]]
* [[Zhao–Saalfeld algorithm|Zhao–Saalfeld]]
== See also ==
== Further reading ==
{{refbegin}}
* {{cite journal |first=Urs |last=Ramer |title=An iterative procedure for the polygonal approximation of plane curves |journal=Computer Graphics and Image Processing |volume=1 |issue=3 |pages=244–256 |date=1972 |doi=10.1016/S0146-664X(72)80017-0}}
* {{cite journal |first1=David |last1=Douglas |first2=Thomas |last2=Peucker |title=Algorithms for the reduction of the number of points required to represent a digitized line or its caricature |journal=The Canadian Cartographer |volume=10 |issue=2 |pages=112–122 |date=1973 |doi=10.3138/FM57-6770-U75U-7727}}
* {{cite conference |first1=John |last1=Hershberger |first2=Jack |last2=Snoeyink |title=Speeding Up the Douglas–Peucker Line-Simplification Algorithm |conference=Proceedings of the 5th Symposium on Data Handling |pages=134–143 |date=1992}} UBC Tech Report TR-92-07 available at http://www.cs.ubc.ca/cgi-bin/tr/1992/TR-92-07
* {{cite book |first1=R.O. |last1=Duda |first2=P.E. |last2=Hart |title=Pattern Classification and Scene Analysis |date=1973 |publisher=Wiley |___location=New York |url=http://rii.ricoh.com/~stork/DHS.html |archive-url=https://web.archive.org/web/20110715184521/http://rii.ricoh.com/~stork/DHS.html |archive-date=2011-07-15}}
* {{cite techreport |title = Line Generalisation by Repeated Elimination of the Smallest Area |series = Discussion Paper |number = 10 |first1 = M. |last1 = Visvalingam |first2 = J.D. |last2 = Whyatt |year = 1992 |institution = Cartographic Information Systems Research Group (CISRG), The University of Hull |url = https://hydra.hull.ac.uk/assets/hull:8338/content }}
{{refend}}
== References ==
{{reflist}}