Decision tree learning
|isbn=978-0-412-04841-8
}}</ref> Trees used for regression and trees used for classification have some similarities – but also some differences, such as the procedure used to determine where to split.<ref name="bfos"/>
 
*'''Decision stream''' avoids the problems of data exhaustion and the formation of unrepresentative data samples in decision tree nodes by merging leaves from the same and/or different levels of the predictive model structure. By increasing the number of samples per node and reducing the tree width, decision stream preserves statistically representative data and allows an extremely deep graph architecture that can consist of hundreds of levels.<ref>{{cite journal|author1=Ignatov, D.Yu.|author2=Ignatov, A.D.|title=Decision Stream: Cultivating Deep Decision Trees|journal=IEEE ICTAI|pages=905–912|doi=10.1109/ICTAI.2017.00140|date=2017|arxiv=1704.07657|bibcode=2017arXiv170407657I|isbn=978-1-5386-3876-7|s2cid=21864203|url=https://arxiv.org/pdf/1704.07657.pdf}}</ref>
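The merging step above can be illustrated with a minimal sketch. This is not the authors' implementation: it stands in for their statistical similarity test with a hypothetical Welch t-statistic cutoff (`threshold`), and greedily pools leaf sample sets that are statistically indistinguishable instead of splitting them further.

```python
# Hedged sketch of leaf merging in the spirit of decision stream.
# `threshold` is a hypothetical cutoff standing in for the paper's
# significance test; it is not taken from the cited work.
import math
from statistics import mean, variance

def welch_t_statistic(a, b):
    """Welch's t-statistic between two samples (each needs >= 2 points)."""
    se = math.sqrt(variance(a) / len(a) + variance(b) / len(b))
    return abs(mean(a) - mean(b)) / se if se > 0 else 0.0

def merge_similar_leaves(leaves, threshold=2.0):
    """Greedily merge leaf sample sets whose t-statistic is below threshold.

    Merging pools small, unrepresentative samples into larger, statistically
    representative nodes, narrowing the tree at each level.
    """
    merged = [list(leaf) for leaf in leaves]
    changed = True
    while changed:
        changed = False
        for i in range(len(merged)):
            for j in range(i + 1, len(merged)):
                if welch_t_statistic(merged[i], merged[j]) < threshold:
                    merged[i] = merged[i] + merged[j]  # pool the two leaves
                    del merged[j]
                    changed = True
                    break
            if changed:
                break
    return merged
```

For example, two leaves whose target values are centered near 1.0 would be pooled into one node, while a leaf centered near 5.0 would remain separate.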
 
Some techniques, often called ''ensemble'' methods, construct more than one decision tree: