===The Kabsch algorithm===
The [[Kabsch algorithm]] (called [[Wahba's problem]] in other fields) uses the SVD to compute the optimal rotation, in the least-squares sense, that aligns one set of points with a corresponding set of points. Among other applications, it is used to compare the structures of molecules.
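As a sketch of how the SVD enters, the following NumPy snippet computes the Kabsch rotation between two already mean-centered point sets (the function name and the test points are illustrative, not part of any standard library):

```python
import numpy as np

def kabsch(P, Q):
    """Rotation R minimizing sum_i ||R p_i - q_i||^2 for
    mean-centered N x 3 point sets P and Q (rows are points)."""
    H = P.T @ Q                              # 3x3 cross-covariance matrix
    U, S, Vt = np.linalg.svd(H)
    # Correct for a possible reflection so that det(R) = +1:
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R

# Example: recover a known 90-degree rotation about the z-axis.
P = np.array([[1., 0, 0], [0, 1, 0], [0, 0, 1], [-1, -1, -1]])
P = P - P.mean(axis=0)                       # center the points
Rz = np.array([[0., -1, 0], [1, 0, 0], [0, 0, 1]])
Q = P @ Rz.T                                 # q_i = Rz @ p_i
R = kabsch(P, Q)
```

The sign correction in the middle factor is what distinguishes the Kabsch solution from a naive orthogonal Procrustes fit: it guarantees a proper rotation rather than a rotation combined with a reflection.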
===Principal component analysis===
The SVD can be used to construct the principal components<ref>{{cite book |last=Hastie |first=Trevor |author2=Robert Tibshirani |author3=Jerome Friedman |title=The Elements of Statistical Learning |edition=2nd |year=2009 |publisher=Springer |___location=New York |pages=535–536 |isbn=978-0-387-84857-0}}</ref> in [[Principal component analysis|principal component analysis]] as follows:
Let <math>\mathbf{X} \in \mathbb{R}^{N \times p}</math> be a data matrix whose <math>N</math> rows are observations of dimension <math>p</math>, each mean-centered feature-wise.
The SVD of <math>\mathbf{X}</math> is:
<math display="block">
\mathbf{X} = \mathbf{V} \boldsymbol{\Sigma} \mathbf{U}^\ast
</math>
The product <math>\mathbf{V} \boldsymbol{\Sigma}</math> then contains the principal component scores of the rows of <math>\mathbf{X}</math> (i.e. of each observation), and the columns of <math>\mathbf{U}</math> are the principal component loading vectors.<ref>{{cite book |last=Hastie |first=Trevor |author2=Robert Tibshirani |author3=Jerome Friedman |title=The Elements of Statistical Learning |edition=2nd |year=2009 |publisher=Springer |___location=New York |pages=535–536 |isbn=978-0-387-84857-0}}</ref>
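This construction can be illustrated with NumPy on synthetic data (an illustrative sketch only). Note that <code>numpy.linalg.svd</code> names its factors in the opposite order to the convention above: it returns <math>\mathbf{X} = \mathbf{U}_\mathrm{np} \boldsymbol{\Sigma} \mathbf{V}_\mathrm{np}^\ast</math>, so its left factor plays the role of <math>\mathbf{V}</math> here and its right factor the role of <math>\mathbf{U}^\ast</math>:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 4))       # synthetic data: 100 observations, 4 features
Xc = X - X.mean(axis=0)             # mean-center each feature (column)

# NumPy factorization Xc = U_np @ diag(s) @ Vt; in this section's notation
# V = U_np, Sigma = diag(s), and the loading matrix U = Vt.T.
U_np, s, Vt = np.linalg.svd(Xc, full_matrices=False)

scores = U_np * s                   # V Sigma: one row of scores per observation
loadings = Vt.T                     # columns are principal component loading vectors
```

The scores are exactly the centered data projected onto the loading vectors, i.e. <code>scores == Xc @ loadings</code> up to floating-point error, which is the defining property of principal component scores.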
===Signal processing===