Optimal discriminant analysis and classification tree analysis
'''Optimal discriminant analysis''' ('''ODA''') and the related '''classification tree analysis''' ('''CTA''') are statistical methods that maximize predictive accuracy. For any specific sample and exploratory or confirmatory hypothesis, ODA identifies the statistical model that yields maximum predictive accuracy, assesses the exact Type I error rate, and evaluates potential cross-generalizability.<ref>{{cite journal |last1=Yarnold |first1=Paul R. |last2=Soltysik |first2=Robert C. |title=Theoretical Distributions of Optima for Univariate Discrimination of Random Data |journal=Decision Sciences |volume=22 |issue=4 |pages=739–752 |year=1991 |doi=10.1111/j.1540-5915.1991.tb00362.x }}</ref> ODA may be applied to one or more dimensions: the one-dimensional case is referred to as UniODA and the multidimensional case as MultiODA. CTA is a generalization of ODA to non-orthogonal trees. ODA and CTA may be used to find the combination of variables and cut points that best separates classes of objects or events; these variables and cut points may then be used to reduce dimensionality and build a statistical model that optimally describes the data.
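As an illustration of the idea behind the one-dimensional (UniODA-style) case, the following sketch searches every candidate cut point on a single predictor and keeps the one that maximizes classification accuracy. This is a minimal toy, not the ODA software or its exact algorithm; all names are ours.

```python
# Toy UniODA-style search: for one predictor and two classes, try every
# candidate cut point (midpoints between sorted distinct values) and keep
# the one maximizing classification accuracy. Illustrative only.

def best_cutpoint(values, labels):
    """Return (cutpoint, accuracy) for the best rule of form x > c -> class 1."""
    n = len(values)
    candidates = sorted(set(values))
    best_c, best_acc = None, 0.0
    for i in range(len(candidates) - 1):
        c = (candidates[i] + candidates[i + 1]) / 2  # midpoint between neighbors
        # accuracy of the rule "predict class 1 if x > c"
        acc = sum((v > c) == (y == 1) for v, y in zip(values, labels)) / n
        acc = max(acc, 1 - acc)  # the reversed rule is also allowed
        if acc > best_acc:
            best_c, best_acc = c, acc
    return best_c, best_acc

x = [1.0, 1.5, 2.0, 5.0, 5.5, 6.0]
y = [0, 0, 0, 1, 1, 1]
c, acc = best_cutpoint(x, y)  # perfect separation at the midpoint 3.5
```

A full ODA analysis would additionally compute the exact Type I error rate of the selected cut point by permutation, and CTA would apply such searches recursively to grow a tree.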
==See also==
== References ==
<references/>
== Notes ==
* {{cite book
 |last1=Yarnold |first1=Paul R.
 |last2=Soltysik |first2=Robert C.
 |title=Optimal Data Analysis: A Guidebook with Software for Windows
 |publisher=American Psychological Association
 |isbn=978-1-55798-981-
 |year=2004
 |url=http://books.apa.org/books.cfm?id=4316000
 |access-date=2009-09-11
 |archive-url=https://web.archive.org/web/20081123105843/http://books.apa.org/books.cfm?id=4316000
 |archive-date=2008-11-23
 |url-status=dead
 }}
* {{cite journal
 |last=Fisher |first=R. A. |authorlink=Ronald Fisher
 |title=The Use of Multiple Measurements in Taxonomic Problems
 |journal=Annals of Eugenics
 |volume=7
 |issue=2
 |pages=179–188
 |year=1936
 |doi=10.1111/j.1469-1809.1936.tb02137.x
 |hdl=2440/15227
 |hdl-access=free
 }}
* {{cite journal |last1=Martinez |first1=A. M. |last2=Kak |first2=A. C. |title=PCA versus LDA |journal=[[IEEE Transactions on Pattern Analysis and Machine Intelligence]] |volume=23 |issue=2 |pages=228–233 |year=2001 |url=http://www.ece.osu.edu/~aleix/pami01f.pdf |doi=10.1109/34.908974 }}{{Dead link|date=April 2020 |bot=InternetArchiveBot |fix-attempted=yes }}
* {{cite book
 |last1=Mika |first1=S. |display-authors=etal
 |chapter=Fisher discriminant analysis with kernels
 |title=Neural Networks for Signal Processing IX: Proceedings of the 1999 IEEE Signal Processing Society Workshop (Cat. No.98TH8468)
 |year=1999
 |pages=41–48
 |doi=10.1109/NNSP.1999.788121
 |isbn=978-0-7803-5673-3
 |citeseerx=10.1.1.35.9904
 |s2cid=8473401
 }}
== External links ==
*[http://people.revoledu.com/kardi/tutorial/LDA/index.html LDA tutorial using MS Excel]
[[Category:Classification algorithms]]
[[de:Diskriminanzanalyse]]
[[eo:Vikipedio:Projekto matematiko/Lineara diskriminanta analitiko]]
[[fr:Analyse discriminante linéaire]]
[[hr:Linearna analiza različitih]]