Optimal discriminant analysis and classification tree analysis

{{No footnotes|date=September 2009}}
 
'''Optimal discriminant analysis (ODA)'''<ref>{{cite journal |last1=Yarnold |first1=Paul R. |last2=Soltysik |first2=Robert C. |title=Theoretical Distributions of Optima for Univariate Discrimination of Random Data |journal=Decision Sciences |volume=22 |issue=4 |pages=739–752 |year=1991 |doi=10.1111/j.1540-5915.1991.tb00362.x }}</ref> and the related '''classification tree analysis (CTA)''' are exact statistical methods that maximize predictive accuracy. For any specific sample and exploratory or confirmatory hypothesis, optimal discriminant analysis identifies the statistical model that yields maximum predictive accuracy, assesses the exact [[Type I error]] rate, and evaluates potential cross-generalizability. Optimal discriminant analysis may be applied to >&nbsp;0 dimensions, with the one-dimensional case referred to as UniODA and the multidimensional case referred to as MultiODA. Classification tree analysis is a generalization of optimal discriminant analysis to non-orthogonal trees. Optimal discriminant analysis and classification tree analysis may be used to find the combination of variables and cut points that best separate classes of objects or events. These variables and cut points may then be used to reduce dimensions and to build a statistical model that optimally describes the data.

Optimal discriminant analysis may be thought of as a generalization of Fisher's [[linear discriminant analysis]]. Optimal discriminant analysis is an alternative to [[analysis of variance|ANOVA]] (analysis of variance) and [[regression analysis]], which attempt to express one [[dependent variable]] as a linear combination of other features or measurements. However, ANOVA and regression analysis yield a dependent variable that is a numerical variable, while optimal discriminant analysis yields a dependent variable that is a class variable.
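The idea of choosing the cut point that best separates two classes can be sketched in a few lines of code. The following is an illustrative sketch only, not taken from the cited sources: a brute-force search, in the spirit of the one-dimensional (UniODA) case, for the single cut point on one attribute that maximizes overall classification accuracy for two classes. The function name and interface are hypothetical.

```python
def uni_oda_cut(values, labels):
    """Return (best_cut, best_accuracy) for a two-class, one-dimensional sample.

    Objects with value <= cut are assigned one class and the rest the other;
    both assignment directions are tried for every candidate cut point.
    Illustrative sketch only -- real ODA software also computes exact
    Type I error rates, which this toy search omits.
    """
    classes = sorted(set(labels))
    assert len(classes) == 2, "this sketch handles the two-class case only"
    # Candidate cuts: midpoints between consecutive distinct observed values.
    pts = sorted(set(values))
    cuts = [(a + b) / 2 for a, b in zip(pts, pts[1:])]
    best = (None, 0.0)
    for cut in cuts:
        for low, high in (classes, classes[::-1]):
            pred = [low if v <= cut else high for v in values]
            acc = sum(p == y for p, y in zip(pred, labels)) / len(labels)
            if acc > best[1]:
                best = (cut, acc)
    return best
```

For example, for values [1, 2, 3, 10, 11, 12] with labels ['a', 'a', 'a', 'b', 'b', 'b'], the search returns the cut 6.5 with accuracy 1.0, since that cut separates the two classes perfectly.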
 
==See also==
== References ==
<references/>
 
== Notes ==
* {{cite book
 |title=Optimal Data Analysis
 |first1=Paul R. |last1=Yarnold
 |first2=Robert C. |last2=Soltysik
 |publisher=American Psychological Association
 |isbn=978-1-55798-981-9
 |year=2004
 |url=http://books.apa.org/books.cfm?id=4316000
 |access-date=2009-09-11
 |archive-url=https://web.archive.org/web/20081123105843/http://books.apa.org/books.cfm?id=4316000
 |archive-date=2008-11-23
 |url-status=dead
 }}
* {{cite journal
 |last=Fisher |first=R. A. |authorlink=Ronald Fisher
 |title=The Use of Multiple Measurements in Taxonomic Problems
 |journal=Annals of Eugenics
 |volume=7
 |issue=2
 |pages=179–188
 |year=1936
 |doi=10.1111/j.1469-1809.1936.tb02137.x
 |hdl=2440/15227
 |hdl-access=free
 }}
* {{cite journal |last1=Martinez |first1=A. M. |last2=Kak |first2=A. C. |title=PCA versus LDA |journal=[[IEEE Transactions on Pattern Analysis and Machine Intelligence]] |volume=23 |issue=2 |pages=228&ndash;233 |year=2001 |url=http://www.ece.osu.edu/~aleix/pami01f.pdf |doi=10.1109/34.908974 }}{{Dead link|date=April 2020 |bot=InternetArchiveBot |fix-attempted=yes }}
* {{cite book
 |author=Mika, S.
 |chapter=Fisher discriminant analysis with kernels
 |title=Neural Networks for Signal Processing IX: Proceedings of the 1999 IEEE Signal Processing Society Workshop (Cat. No.98TH8468)
 |year=1999
 |pages=41&ndash;48
 |doi=10.1109/NNSP.1999.788121
 |display-authors=etal
 |isbn=978-0-7803-5673-3
 |citeseerx=10.1.1.35.9904
 |s2cid=8473401
 }}
 
== External links ==
*[http://people.revoledu.com/kardi/tutorial/LDA/index.html LDA tutorial using MS Excel]
*[https://web.archive.org/web/20140526130544/http://www.roguewave.com/portals/0/products/imsl-numerical-libraries/documentation/fortran-library/docs/7.0/stat/NetHelp/default.htm?turl=dscrm.htm IMSL discriminant analysis function DSCRM], which includes many useful mathematical definitions.
 
[[Category:Multivariate statistics]]
[[Category:Statistical classification]]
[[Category:Classification algorithms]]
[[Category:Psychometrics]]
[[Category:Market research]]
[[Category:Marketing]]
[[Category:Consumer behaviour]]
 
[[de:Diskriminanzanalyse]]
[[eo:Vikipedio:Projekto matematiko/Lineara diskriminanta analitiko]]
[[fa:تحلیل تفکیک خطی]]
[[fr:Analyse discriminante linéaire]]
[[hr:Linearna analiza različitih]]