Decision tree learning: Difference between revisions

Line 83:
* CART (Classification And Regression Tree)<ref name="bfos" />
* OC1 (Oblique classifier 1). The first method to create multivariate (oblique) splits at each node.<ref>{{Cite journal | doi = 10.1613/jair.63 | last1 = Murthy | first1 = S. K. | year = 1994 | title = A System for Induction of Oblique Decision Trees | journal = Journal of Artificial Intelligence Research | volume = 2 | issue = 1 | pages = 1–32 | doi-access = free }}</ref>
* [[Chi-square automatic interaction detection]] (CHAID). Performs multi-level splits when computing classification trees.<ref>{{Cite journal | doi = 10.2307/2986296 | last1 = Kass | first1 = G. V. | year = 1980 | title = An exploratory technique for investigating large quantities of categorical data | jstor = 2986296| journal = Applied Statistics | volume = 29 | issue = 2| pages = 119–127 }}</ref><ref>{{Cite journal|last1=Biggs|first1=David|last2=De Ville|first2=Barry|last3=Suen|first3=Ed|date=1991|title=A method of choosing multiway partitions for classification and decision trees|url=https://doi.org/10.1080/02664769100000005|journal=Journal of Applied Statistics|volume=18|issue=1|pages=49–62|doi=10.1080/02664769100000005|bibcode=1991JApSt..18...49B |issn=0266-4763|url-access=subscription}}</ref><ref>Ritschard, G. (2013), "CHAID and Earlier Supervised Tree Methods", in J.J. McArdle and G. Ritschard (eds), ''Contemporary Issues in Exploratory Data Mining in the Behavioral Sciences'', Quantitative Methodology Series, New York: Routledge, pages 48–74. [https://www.researchgate.net/publication/315476407_CHAID_and_Earlier_Supervised_Tree_Methods Preprint]</ref>
* [[Multivariate adaptive regression splines|MARS]]: extends decision trees to handle numerical data better.
* Conditional Inference Trees. Statistics-based approach that uses non-parametric tests as splitting criteria, corrected for multiple testing to avoid overfitting. This approach results in unbiased predictor selection and does not require pruning.<ref name="Hothorn2006">{{Cite journal | doi = 10.1198/106186006X133933 | last1 = Hothorn | first1 = T.| last2 = Hornik | first2 = K.| last3 = Zeileis | first3 = A.| year = 2006 | title = Unbiased Recursive Partitioning: A Conditional Inference Framework | jstor = 27594202| journal = Journal of Computational and Graphical Statistics | volume = 15 | issue = 3| pages = 651–674 | citeseerx = 10.1.1.527.2935 | s2cid = 6074128 }}</ref><ref name="Strobl2009">{{Cite journal | doi = 10.1037/a0016973 | pmid = 19968396 | pmc = 2927982 | last1 = Strobl | first1 = C.| last2 = Malley | first2 = J.| last3 = Tutz | first3 = G.| year = 2009 | title = An Introduction to Recursive Partitioning: Rationale, Application and Characteristics of Classification and Regression Trees, Bagging and Random Forests | journal = Psychological Methods | volume = 14 | issue = 4| pages = 323–348 }}</ref>
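The axis-parallel splits that classification-tree algorithms such as CART search over can be illustrated with a minimal pure-Python sketch: choosing the best threshold on a single feature by weighted Gini impurity (the toy data and function names here are illustrative, not any library's API):

```python
# Minimal sketch: pick the best axis-parallel split "x <= t" on one
# feature by minimizing weighted Gini impurity, as in CART-style
# classification trees. Toy data; not a full tree-growing algorithm.
from collections import Counter

def gini(labels):
    """Gini impurity of a list of class labels."""
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def best_split(xs, ys):
    """Return (threshold, weighted impurity) of the best split x <= t."""
    best = (None, float("inf"))
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        if not left or not right:
            continue  # skip degenerate splits
        w = (len(left) * gini(left) + len(right) * gini(right)) / len(ys)
        if w < best[1]:
            best = (t, w)
    return best

xs = [1.0, 2.0, 3.0, 10.0, 11.0, 12.0]
ys = ["a", "a", "a", "b", "b", "b"]
print(best_split(xs, ys))  # → (3.0, 0.0): a perfect split at x <= 3
```

A full learner applies this search recursively to each feature and each resulting child node; oblique methods such as OC1 instead search over linear combinations of features.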
Line 93:
Boosted ensembles of FDTs have also been investigated recently, showing performance comparable to that of other highly efficient fuzzy classifiers.<ref name="Barsacchi2020">{{Cite journal | url=http://www.sciencedirect.com/science/article/pii/S0957417420302608 | doi=10.1016/j.eswa.2020.113436| title=An analysis of boosted ensembles of binary fuzzy decision trees| journal=Expert Systems with Applications| volume=154| year=2020| last1=Barsacchi| first1=M.|last2=Bechini| first2=A.|last3=Marcelloni| first3=F.| page=113436| s2cid=216369273| url-access=subscription}}</ref>
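The general idea of boosting tree-based classifiers can be sketched with an AdaBoost-style loop over ordinary (crisp) decision stumps; this is only an illustration of boosting itself, not the fuzzy decision trees of the cited work, and all names and data are made up:

```python
# Sketch of AdaBoost with one-feature decision stumps: each round fits
# the stump with the lowest weighted error, then reweights the samples
# so misclassified points get more attention. Labels are +1 / -1.
import math

def stump_predict(x, t, sign):
    return sign if x <= t else -sign

def train_adaboost(xs, ys, rounds=3):
    n = len(xs)
    w = [1.0 / n] * n          # uniform initial sample weights
    ensemble = []
    for _ in range(rounds):
        best = None            # (weighted error, threshold, sign)
        for t in set(xs):
            for sign in (1, -1):
                err = sum(wi for wi, x, y in zip(w, xs, ys)
                          if stump_predict(x, t, sign) != y)
                if best is None or err < best[0]:
                    best = (err, t, sign)
        err, t, sign = best
        err = max(err, 1e-10)  # avoid log(0) on a perfect stump
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, t, sign))
        # Upweight mistakes, downweight correct predictions, renormalize.
        w = [wi * math.exp(-alpha * y * stump_predict(x, t, sign))
             for wi, x, y in zip(w, xs, ys)]
        z = sum(w)
        w = [wi / z for wi in w]
    return ensemble

def predict(ensemble, x):
    s = sum(a * stump_predict(x, t, sign) for a, t, sign in ensemble)
    return 1 if s >= 0 else -1

xs = [1.0, 2.0, 3.0, 4.0]
ys = [1, 1, -1, -1]
ensemble = train_adaboost(xs, ys)
print([predict(ensemble, x) for x in xs])  # → [1, 1, -1, -1]
```

Fuzzy variants replace the hard threshold test with a membership function, so each sample belongs partially to both branches.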