{{No footnotes|date=September 2009}}
 
'''Optimal discriminant analysis''' ('''ODA''')<ref>John Wiley & Sons, 1991.</ref> and the related '''classification tree analysis''' ('''CTA''') are exact statistical methods that maximize predictive accuracy. For any specific sample and exploratory or confirmatory hypothesis, optimal discriminant analysis (ODA) identifies the statistical model that yields maximum predictive accuracy, assesses the exact [[Type I error]] rate, and evaluates potential cross-generalizability. Optimal discriminant analysis may be applied to >&nbsp;0 dimensions, with the one-dimensional case being referred to as UniODA and the multidimensional case being referred to as MultiODA. Classification tree analysis is a generalization of optimal discriminant analysis to non-orthogonal trees, and has more recently been called "hierarchical optimal discriminant analysis". Optimal discriminant analysis and classification tree analysis may be used to find the combination of variables and cut points that best separate classes of objects or events. These variables and cut points may then be used to reduce dimensions and to build a statistical model that optimally describes the data.
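The one-dimensional case (UniODA) described above can be illustrated with a short sketch. Assuming a two-class sample measured on a single attribute, every midpoint between adjacent sorted values is a candidate cut point, and the model keeps the cut point (and class-to-side assignment) that maximizes classification accuracy. The function name and data below are hypothetical illustrations, not part of any ODA software:

```python
def uni_oda(values, labels):
    """Exhaustive search for the accuracy-maximizing cut point
    in a two-class, one-dimensional sample (UniODA-style sketch)."""
    xs = sorted(set(values))
    # Candidate cut points: midpoints between adjacent distinct values.
    candidates = [(a + b) / 2 for a, b in zip(xs, xs[1:])]
    best_cut, best_acc = None, 0.0
    for cut in candidates:
        # Try both assignments of the classes to the two sides of the cut.
        for low_class in (0, 1):
            preds = [low_class if v <= cut else 1 - low_class for v in values]
            acc = sum(p == y for p, y in zip(preds, labels)) / len(labels)
            if acc > best_acc:
                best_cut, best_acc = cut, acc
    return best_cut, best_acc

# Hypothetical sample: attribute values with overlapping class distributions.
values = [1.0, 1.5, 2.0, 2.2, 3.1, 3.5, 4.0, 4.4]
labels = [0, 0, 0, 1, 0, 1, 1, 1]
cut, acc = uni_oda(values, labels)
```

Because the search enumerates every candidate cut point, the accuracy it reports is exact for the sample, which mirrors the "exact statistical method" character of ODA; a permutation test over the same statistic would then give the exact Type I error rate.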
 
Optimal discriminant analysis may be thought of as a generalization of Fisher's [[linear discriminant analysis]]. Optimal discriminant analysis is an alternative to [[analysis of variance|ANOVA]] (analysis of variance) and [[regression analysis]], which attempt to express one [[dependent variable]] as a linear combination of other features or measurements. However, ANOVA and regression analysis predict a dependent variable that is a numerical variable, while optimal discriminant analysis predicts a dependent variable that is a class variable.
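The distinction drawn above can be made concrete with a minimal sketch on hypothetical data: ordinary least squares returns a number for the dependent variable, while a discriminant-style model returns a class label. The cut point used here is an assumed illustration, not output from any ODA procedure:

```python
# Hypothetical data: one predictor with both a numeric and a class response.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.2, 7.8]   # numeric response (regression target)
classes = [0, 0, 1, 1]      # class response (classification target)

# Ordinary least squares: dependent variable is numerical.
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
intercept = my - slope * mx

def predict_numeric(x):
    """Regression-style prediction: returns a number."""
    return intercept + slope * x

def predict_class(x, cut=2.5):
    """Discriminant-style prediction: returns a class label."""
    return 0 if x <= cut else 1
```

The two models consume the same predictor but answer different questions: `predict_numeric` estimates *how much*, `predict_class` decides *which group*.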
 
==See also==
== References ==
<references/>
 
== Notes ==
* {{cite book
| title=Optimal Data Analysis
|first1=Paul R.
|last1=Yarnold
|first2=Robert C.
|last2=Soltysik
|publisher=American Psychological Association
|isbn=978-1-55798-981-9
|year=2004
|url=http://books.apa.org/books.cfm?id=4316000
|access-date=2009-09-11
|archive-url=https://web.archive.org/web/20081123105843/http://books.apa.org/books.cfm?id=4316000
|archive-date=2008-11-23
|url-status=dead
}}
* {{cite journal
|last=Fisher |first=R. A. |authorlink=Ronald Fisher
|title=The Use of Multiple Measurements in Taxonomic Problems
|journal=[[Annals of Eugenics]]
|year=1936
|volume=7 |issue=2 |pages=179&ndash;188
|hdl=2440/15227 |doi=10.1111/j.1469-1809.1936.tb02137.x
|hdl-access=free
}}
* {{cite journal |last1=Martinez |first1=A. M. |last2=Kak |first2=A. C. |title=PCA versus LDA |journal=[[IEEE Transactions on Pattern Analysis and Machine Intelligence]] |volume=23 |issue=2 |pages=228&ndash;233 |year=2001 |url=http://www.ece.osu.edu/~aleix/pami01f.pdf |doi=10.1109/34.908974 }}{{Dead link|date=April 2020 |bot=InternetArchiveBot |fix-attempted=yes }}
* {{cite book
|author=Mika, S.
|chapter=Fisher discriminant analysis with kernels
|title=Neural Networks for Signal Processing IX: Proceedings of the 1999 IEEE Signal Processing Society Workshop (Cat. No.98TH8468)
|year=1999
|pages=41&ndash;48
|doi=10.1109/NNSP.1999.788121
|citeseerx=10.1.1.35.9904
|s2cid=8473401
|display-authors=etal
|isbn=978-0-7803-5673-3
}}
 
== External links ==
*[http://people.revoledu.com/kardi/tutorial/LDA/index.html LDA tutorial using MS Excel]
*[https://web.archive.org/web/20140526130544/http://www.roguewave.com/portals/0/products/imsl-numerical-libraries/fortran-library/docs/7.0/stat/stat.htm IMSL discriminant analysis function DSCRM], which has many useful mathematical definitions.
 
[[Category:Statistical classification]]
[[Category:Classification algorithms]]
[[Category:Psychometrics]]
[[Category:Quantitative marketing research]]
 
[[de:Diskriminanzanalyse]]