Optimal discriminant analysis and classification tree analysis
{{No footnotes|date=September 2009}}
'''Optimal discriminant analysis''' ('''ODA''')<ref>{{cite journal |last1=Yarnold |first1=Paul R. |last2=Soltysik |first2=Robert C. |title=Theoretical Distributions of Optima for Univariate Discrimination of Random Data |journal=Decision Sciences |volume=22 |issue=4 |pages=739–752 |year=1991 |doi=10.1111/j.1540-5915.1991.tb00362.x }}</ref> and the related '''classification tree analysis''' ('''CTA''') are exact statistical methods that maximize predictive accuracy. For any specific sample and exploratory or confirmatory hypothesis, ODA identifies the statistical model that yields maximum predictive accuracy, assesses the exact [[Type I error]] rate, and evaluates potential cross-generalizability. ODA can be applied in one or more dimensions; the one-dimensional case is referred to as UniODA and the multidimensional case as MultiODA. CTA is a generalization of ODA to non-orthogonal trees, and has more recently been called "hierarchical optimal discriminant analysis". ODA and CTA may be used to find the combination of variables and cut points that best separate classes of objects or events. These variables and cut points may then be used to reduce dimensionality and to build a statistical model that optimally describes the data.
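The one-dimensional case (UniODA) can be illustrated with a short sketch: an exhaustive search over candidate cut points for the one that maximizes classification accuracy on a two-class sample. This is an illustrative outline under stated assumptions, not code from any published ODA software; the function name is hypothetical, and the plain-accuracy criterion omits the permutation-based significance testing that ODA software also performs.

```python
def uni_oda_cutpoint(values, labels):
    """Exhaustively search for the cut point that maximizes classification
    accuracy for a two-class, one-dimensional sample.

    values -- list of numeric observations on the single attribute
    labels -- list of class labels (0 or 1), parallel to values
    Returns (best_cut, best_accuracy).
    """
    pts = sorted(set(values))
    # Candidate cut points: midpoints between adjacent observed values.
    candidates = [(a + b) / 2 for a, b in zip(pts, pts[1:])]
    n = len(values)
    best_cut, best_acc = None, -1.0
    for cut in candidates:
        # Try both orientations: class 1 above the cut, or class 1 below it.
        for above in (1, 0):
            correct = sum(
                1 for v, y in zip(values, labels)
                if (y == 1) == ((v > cut) == bool(above))
            )
            acc = correct / n
            if acc > best_acc:
                best_cut, best_acc = cut, acc
    return best_cut, best_acc
```

Because the search is exhaustive over all midpoints and both orientations, the returned cut point is exactly optimal for the sample, which is the defining property of the method; heuristic classifiers make no such guarantee.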
==See also==
== References ==
<references/>
== Notes ==
* {{cite book
  |last1=Yarnold |first1=Paul R.
  |last2=Soltysik |first2=Robert C.
  |title=Optimal Data Analysis: A Guidebook with Software for Windows
  |publisher=American Psychological Association
  |isbn=978-1-55798-981-
  |year=2004
  |url=http://books.apa.org/books.cfm?id=4316000
  |access-date=2009-09-11
  |archive-url=https://web.archive.org/web/20081123105843/http://books.apa.org/books.cfm?id=4316000
  |archive-date=2008-11-23
  |url-status=dead
  }}
* {{cite journal
  |last=Fisher |first=R. A. |authorlink=Ronald Fisher
  |title=The Use of Multiple Measurements in Taxonomic Problems
  |journal=Annals of Eugenics
  |volume=7
  |issue=2
  |pages=179–188
  |year=1936
  |doi=10.1111/j.1469-1809.1936.tb02137.x
  |hdl=2440/15227
  |hdl-access=free
  }}
* {{cite journal |last1=Martinez |first1=A. M. |last2=Kak |first2=A. C. |title=PCA versus LDA |journal=[[IEEE Transactions on Pattern Analysis and Machine Intelligence]] |volume=23 |issue=2 |pages=228–233 |year=2001 |url=http://www.ece.osu.edu/~aleix/pami01f.pdf |doi=10.1109/34.908974 }}{{Dead link|date=April 2020 |bot=InternetArchiveBot |fix-attempted=yes }}
* {{cite book
  |author=Mika, S. |display-authors=etal
  |chapter=Fisher discriminant analysis with kernels
  |title=Neural Networks for Signal Processing IX: Proceedings of the 1999 IEEE Signal Processing Society Workshop (Cat. No.98TH8468)
  |year=1999
  |pages=41–48
  |doi=10.1109/NNSP.1999.788121
  |citeseerx=10.1.1.35.9904
  |s2cid=8473401
  }}
== External links ==
*[http://people.revoledu.com/kardi/tutorial/LDA/index.html LDA tutorial using MS Excel]
[[Category:Classification algorithms]]
[[de:Diskriminanzanalyse]]