==Assumptions==
The assumptions of discriminant analysis are the same as those for MANOVA. The analysis is quite sensitive to outliers and the size of the smallest group must be larger than the number of predictor variables.<ref name="buy"/>
*[[Multivariate normal distribution|Multivariate normality]]: Independent variables are normal for each level of the grouping variable.<ref name="green"/><ref name="buy"/>
==Discriminant functions==
Discriminant analysis works by creating one or more linear combinations of predictors, creating a new [[latent variable]] for each function. These functions are called discriminant functions. The number of functions possible is either ''N<sub>g</sub>'' − 1, where ''N<sub>g</sub>'' is the number of groups, or ''p'', the number of predictors, whichever is smaller. The first function maximizes the differences between groups on that function. The second function also maximizes differences between groups, but must be uncorrelated with the first function. Subsequent functions follow the same pattern, each required to be uncorrelated with all previous functions.
Given group <math>j</math>, with <math>\mathbb{R}_j</math> denoting its region of the sample space, the discriminant rule assigns <math>x</math> to group <math>j</math> if <math>x \in\mathbb{R}_j</math>. Discriminant analysis then finds “good” regions <math>\mathbb{R}_j</math> that minimize classification error, leading to a high percentage of correct classifications in the classification table.<ref name="har"/>
Each function is given a discriminant score to determine how well it predicts group placement.
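The construction of discriminant functions described above can be sketched numerically. The following is a minimal illustration, assuming three invented Gaussian groups in two dimensions; all data and variable names are hypothetical, and only NumPy is used:

```python
import numpy as np

rng = np.random.default_rng(0)
# Three groups (Ng = 3) of p = 2 predictors, so at most
# min(Ng - 1, p) = 2 discriminant functions exist.
groups = [rng.normal(loc=m, scale=1.0, size=(20, 2))
          for m in ([0, 0], [3, 0], [0, 3])]
X = np.vstack(groups)
grand_mean = X.mean(axis=0)

# Within-group and between-group scatter matrices.
Sw = sum((g - g.mean(0)).T @ (g - g.mean(0)) for g in groups)
Sb = sum(len(g) * np.outer(g.mean(0) - grand_mean, g.mean(0) - grand_mean)
         for g in groups)

# Each eigenvector of Sw^{-1} Sb defines one discriminant function;
# the eigenvalues order them by how much group separation they achieve,
# so the first function maximizes between-group differences.
eigvals, eigvecs = np.linalg.eig(np.linalg.inv(Sw) @ Sb)
order = np.argsort(eigvals.real)[::-1]
functions = eigvecs.real[:, order]

# A discriminant score is the projection of an observation onto a function.
scores = X @ functions[:, 0]
```

Projecting each observation onto a function yields its discriminant score, which is what determines predicted group placement.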
*[[Maximum likelihood]]: Assigns x to the group that maximizes population (group) density.<ref name="har">Hardle, W., Simar, L. (2007). ''Applied Multivariate Statistical Analysis''. Springer Berlin Heidelberg. pp. 289-303.</ref>
*Bayes Discriminant Rule: Assigns x to the group that maximizes <math>\pi_i f_i(x)</math>, where ''π<sub>i</sub>'' represents the [[prior probability]] of that classification, and <math>f_i(x)</math> represents the population density.<ref name="har"/>
*[[Linear Discriminant Analysis|Fisher’s linear discriminant rule]]: Maximizes the ratio between ''SS''<sub>between</sub> and ''SS''<sub>within</sub>, and finds a linear combination of the predictors to predict group.<ref name="har"/>
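The difference between the maximum likelihood rule and the Bayes rule can be sketched with a toy example, assuming two invented univariate normal groups; the group labels, parameters, and priors are hypothetical:

```python
import math

# Hypothetical group parameters (mean, standard deviation) and priors pi_i.
params = {"A": (0.0, 1.0), "B": (2.0, 1.0)}
priors = {"A": 0.9, "B": 0.1}

def density(x, mean, sd):
    """Normal density f_i(x) for a group with the given mean and sd."""
    return math.exp(-((x - mean) ** 2) / (2 * sd ** 2)) / (sd * math.sqrt(2 * math.pi))

def maximum_likelihood(x):
    # Assign x to the group maximizing the group density f_i(x).
    return max(params, key=lambda g: density(x, *params[g]))

def bayes_rule(x):
    # Assign x to the group maximizing pi_i * f_i(x).
    return max(params, key=lambda g: priors[g] * density(x, *params[g]))
```

With these made-up numbers the two rules can disagree: at x = 1.2 the density of group B is larger, so maximum likelihood assigns B, but the heavy prior on group A makes the Bayes rule assign A. When the priors are equal, the two rules coincide.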
==Eigenvalues==
==See also==
*[[Statistical classification]]
*[[Linear discriminant analysis]]
==External links==
{{wikiversity}}
* [http://www2.chass.ncsu.edu/garson/pa765/discrim.htm Course notes, Discriminant function analysis by G. David Garson, NC State University]
* [http://people.revoledu.com/kardi/tutorial/LDA/ Discriminant analysis tutorial in Microsoft Excel by Kardi Teknomo]