Finally, a [[linear combination]] of the eigenvectors is used to define a new shape <math>\mathbf{X}'</math>, mathematically defined as:
:<math>\mathbf{X}' = \overline{\mathbf{X}} + \mathbf{P} \mathbf{b}</math>
where <math>\overline{\mathbf{X}}</math> is the mean shape across all training images, <math>\mathbf{P}</math> is the matrix whose columns are the eigenvectors, and <math>\mathbf{b}</math> is a vector of scaling values, one per principal component. By varying <math>\mathbf{b}</math>, an infinite number of shapes can be generated. To ensure that new shapes remain within the variation seen in the training set, it is common to restrict each element of <math>\mathbf{b}</math> to within <math>\pm</math>3 standard deviations, where the standard deviation of a given principal component is defined as the square root of its corresponding eigenvalue.
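The shape-generation step above can be sketched in NumPy. This is a minimal illustration, not a full shape-model implementation: the training data here is random, and the landmark counts, component count <code>k</code>, and function names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training set: 10 shapes, each with 5 (x, y) landmarks
# flattened into 10-dimensional vectors.
shapes = rng.normal(size=(10, 10))

# Mean shape across all training shapes.
mean_shape = shapes.mean(axis=0)

# PCA via eigendecomposition of the covariance matrix.
cov = np.cov(shapes, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)

# Sort components by descending eigenvalue and keep the top k.
order = np.argsort(eigvals)[::-1]
k = 3
P = eigvecs[:, order[:k]]              # columns are eigenvectors
stddevs = np.sqrt(eigvals[order[:k]])  # std. dev. = sqrt(eigenvalue)

def new_shape(b):
    """Generate X' = mean + P @ b, clamping each element of b to
    within +/- 3 standard deviations of its principal component."""
    b = np.clip(b, -3.0 * stddevs, 3.0 * stddevs)
    return mean_shape + P @ b

# Setting b = 0 recovers the mean shape; out-of-range values are clamped.
x_new = new_shape(np.array([1.0, -0.5, 100.0]))
```

Note that the clamping to <math>\pm</math>3 standard deviations is what keeps generated shapes plausible: components with small eigenvalues are allowed only small deviations from the mean.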