Decision tree learning

Some techniques, often called ''ensemble'' methods, construct more than one decision tree:
* '''[[Gradient boosted trees|Boosted trees]]''', which incrementally build an ensemble by training each new tree to emphasize the training instances previously mis-modeled by the ensemble so far. A typical example is [[AdaBoost]]. These can be used for regression-type and classification-type problems (a boosting code sketch follows this list).<ref>Friedman, J. H. (1999). ''[https://astro.temple.edu/~msobel/courses_files/StochasticBoosting(gradient).pdf Stochastic gradient boosting] {{Webarchive|url=https://web.archive.org/web/20181128041212/https://astro.temple.edu/~msobel/courses_files/StochasticBoosting(gradient).pdf |date=2018-11-28 }}.'' Stanford University.</ref><ref>Hastie, T., Tibshirani, R., Friedman, J. H. (2001). ''The elements of statistical learning: Data mining, inference, and prediction.'' New York: Springer Verlag.</ref>
* '''Committees of decision trees''' (also called k-DT<ref>Heath, D., Kasif, S. and Salzberg, S. (1993). ''k-DT: A multi-tree learning method.'' In ''Proceedings of the Second Intl. Workshop on Multistrategy Learning'', pp. 138–149.</ref>), an early method that used randomized decision tree algorithms to generate multiple different trees from the training data and then combined them by majority voting to produce the output.<ref>Heath, D., Kasif, S., and Salzberg, S. L. (1996). ''Committees of decision trees.'' In B. Gorayska and J. Mey (Eds.), ''Cognitive Technology: In Search of a Humane Interface'' (pp. 305–317). Amsterdam: Elsevier Science B.V.</ref>
* '''[[Bootstrap aggregating|Bootstrap aggregated]]''' (or bagged) decision trees, an early ensemble method, builds multiple decision trees by repeatedly [[Bootstrapping (statistics)|resampling the training data with replacement]] and voting across the trees to reach a consensus prediction (a bagging and random-forest code sketch follows this list).<ref>{{cite journal |last=Breiman |first=L. |year=1996 |title=Bagging Predictors |journal=Machine Learning |volume=24 |issue=2 |pages=123–140 |doi=10.1007/BF00058655 |doi-access=free }}</ref>
** A '''[[random forest]]''' classifier is a specific application of [[bootstrap aggregating]] to decision trees, in which each tree additionally considers only a random subset of the features at each candidate split.
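
As an illustration of boosting (not drawn from the cited sources), the following minimal sketch uses the scikit-learn library; the library, dataset, and parameter values are assumed choices for demonstration only. [[AdaBoost]] fits each new tree with larger weights on the training examples that the ensemble so far has misclassified.

<syntaxhighlight lang="python">
# Boosted decision trees with AdaBoost (illustrative sketch; scikit-learn is an
# assumed library choice, and the dataset and parameters are arbitrary examples).
from sklearn.datasets import load_iris
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Each base learner is a shallow decision tree by default; n_estimators controls
# how many trees are added incrementally, each one reweighted toward the
# examples the previous trees misclassified.
boosted = AdaBoostClassifier(n_estimators=50, random_state=0)
boosted.fit(X_train, y_train)
print("boosted trees accuracy:", boosted.score(X_test, y_test))
</syntaxhighlight>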
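
Similarly, a minimal sketch of bagged decision trees and a random forest, again assuming scikit-learn: the practical difference between the two ensembles here is that the random forest also restricts each split to a random subset of the features.

<syntaxhighlight lang="python">
# Bagged decision trees and a random forest (illustrative sketch; scikit-learn
# is an assumed library choice, and the dataset and parameters are examples).
from sklearn.datasets import load_iris
from sklearn.ensemble import BaggingClassifier, RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Bagging: each tree is fit on a bootstrap resample of the training data
# (sampling with replacement); class predictions are combined by voting.
bagged = BaggingClassifier(n_estimators=25, random_state=0)  # base estimator defaults to a decision tree
bagged.fit(X_train, y_train)

# Random forest: bagging plus a random subset of candidate features per split.
forest = RandomForestClassifier(n_estimators=100, random_state=0)
forest.fit(X_train, y_train)

print("bagged trees accuracy:", bagged.score(X_test, y_test))
print("random forest accuracy:", forest.score(X_test, y_test))
</syntaxhighlight>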