The '''point distribution model''' is a model for representing the mean geometry of a shape and some statistical modes of geometric variation inferred from a training set of shapes.

==Background==
The point distribution model concept was developed by Cootes,<ref>{{citation
|author = T. F. Cootes
|title = Statistical models of appearance for computer vision
|url = http://www.isbe.man.ac.uk/~bim/Models/app_models.pdf
}}</ref> Taylor ''et al.''<ref name=taylor>{{citation
|authors = D.H. Cooper, T.F. Cootes, C.J. Taylor and J. Graham
|title = Active shape models—their training and application
|journal = Computer Vision and Image Understanding
|pages = 38–59
|year = 1995
}}</ref> and became a standard in [[computer vision]] for the [[statistical shape analysis|statistical study of shape]]<ref>{{citation
|title = Shape discrimination in the Hippocampus using an MDL Model
|authors = Rhodri H. Davies and Carole J. Twining and P. Daniel Allen and Tim F. Cootes and Chris J. Taylor
|year = 2003
|conference = IPMI
|url = http://www2.wiau.man.ac.uk/caws/Conferences/10/proceedings/8/papers/133/rhhd_ipmi03%2Epdf
|access-date = 2007-07-27
|archive-url = https://web.archive.org/web/20081008194350/http://www2.wiau.man.ac.uk/caws/Conferences/10/proceedings/8/papers/133/rhhd_ipmi03%2Epdf
|archive-date = 2008-10-08
|url-status = dead
}}</ref> and for [[image segmentation|segmentation]] of [[medical imaging|medical images]],<ref name=taylor/> where shape priors help the interpretation of noisy and low-contrast pixels/voxels. The latter point leads to [[Active shape model]]s (ASM) and [[Active Appearance Model]]s (AAM).
Point distribution models rely on [[landmark point]]s. A landmark is an annotating point placed by an anatomist onto a given locus for every shape instance across the training set population. For instance, the same landmark designates the tip of the index finger in a training set of 2D hand outlines. [[Principal component analysis]] (PCA) is a relevant tool for studying correlations of movement between groups of landmarks across the training set population. Typically, it might detect that all the landmarks located along the same finger move together across training examples that show different finger spacing in a collection of flat-posed hands.
==Details==
First, a set of training images is manually landmarked with enough corresponding landmarks to sufficiently approximate the geometry of the original shapes. These landmarks are aligned using [[generalized Procrustes analysis]], which minimizes the least squared error between the points.

<math>k</math> aligned landmarks in two dimensions are given as
:<math>\mathbf{X} = (x_1, y_1, \ldots, x_k, y_k)</math>.
It is important to note that each landmark <math>i \in \lbrace 1, \ldots, k \rbrace </math> should represent the same anatomical ___location. For example, landmark #3, <math>(x_3, y_3)</math> might represent the tip of the ring finger across all training images.
Now the shape outlines are reduced to sequences of <math>k</math> landmarks, so that a given training shape is defined as the vector <math>\mathbf{X} \in \mathbb{R}^{2k}</math>. Assuming the scattering is [[gaussian distribution|gaussian]] in this space, PCA is used to compute normalized [[eigenvectors]] and [[eigenvalues]] of the [[covariance matrix]] across all training shapes. The matrix of the top <math>d</math> eigenvectors is given as <math>\mathbf{P} \in \mathbb{R}^{2k \times d}</math>, and each eigenvector describes a principal mode of variation along the set.
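The following minimal Python sketch illustrates this step (it is not taken from the cited sources; the names <code>X_train</code>, <code>build_pdm</code>, <code>P</code> and <code>lambdas</code> are assumptions for the example, and [[NumPy]] is assumed to be available):

<syntaxhighlight lang="python">
import numpy as np

def build_pdm(X_train, d):
    """Compute a point distribution model from aligned training shapes.

    X_train : (N, 2k) array, one aligned shape (x1, y1, ..., xk, yk) per row.
    d       : number of modes of variation to keep.
    """
    mean_shape = X_train.mean(axis=0)            # mean shape, \bar{X}
    cov = np.cov(X_train, rowvar=False)          # (2k, 2k) covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)       # eigenvalues in ascending order
    order = np.argsort(eigvals)[::-1][:d]        # indices of the top-d modes
    P = eigvecs[:, order]                        # (2k, d) matrix of eigenvectors
    lambdas = eigvals[order]                     # corresponding eigenvalues
    return mean_shape, P, lambdas
</syntaxhighlight>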
Finally, a [[linear combination]] of the eigenvectors is used to define a new shape <math>\mathbf{X}'</math>, mathematically defined as:
:<math>\mathbf{X}' = \overline{\mathbf{X}} + \mathbf{P} \mathbf{b}</math>
where <math>\overline{\mathbf{X}}</math> is defined as the mean shape across all training images, and <math>\mathbf{b}</math> is a vector of scaling values for each principal component. Therefore, by modifying the variable <math>\mathbf{b}</math> an infinite number of shapes can be defined. To ensure that the new shapes are all within the variation seen in the training set, it is common to only allow each element of <math>\mathbf{b}</math> to be within <math>\pm</math>3 standard deviations, where the standard deviation of a given principal component is defined as the square root of its corresponding eigenvalue.
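As a sketch of this step (illustrative only; it reuses the hypothetical <code>mean_shape</code>, <code>P</code> and <code>lambdas</code> from the previous example), a new shape can be synthesised while keeping each coefficient within ±3 standard deviations of its mode:

<syntaxhighlight lang="python">
import numpy as np

def synthesize_shape(mean_shape, P, lambdas, b):
    """Return X' = mean_shape + P b, clipping each b_i to +/- 3 standard deviations."""
    std = np.sqrt(lambdas)                   # standard deviation of each mode
    b = np.clip(b, -3.0 * std, 3.0 * std)    # stay within the variation seen in training
    return mean_shape + P @ b

# Example: deform the mean shape along the first (largest-variance) mode only.
# b = np.zeros(P.shape[1]); b[0] = 2.0 * np.sqrt(lambdas[0])
# new_shape = synthesize_shape(mean_shape, P, lambdas, b)
</syntaxhighlight>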
PDMs can be extended to any arbitrary number of dimensions, but are typically used in 2D image and 3D volume applications (where each landmark point lies in <math>\mathbb{R}^2</math> or <math>\mathbb{R}^3</math>).
==Discussion==
An eigenvector, interpreted in [[euclidean space]], can be seen as a sequence of <math>k</math> euclidean vectors, each associated with a corresponding landmark and together designating a compound move for the whole shape. Global nonlinear variation is usually well handled provided it is kept to a reasonable level. Typically, a twisting nematode worm is used as an example in the teaching of [[kernel PCA]]-based methods.
As a consequence of the PCA properties, the eigenvectors are mutually [[orthogonal]], form a basis of the training set cloud in shape space, and all pass through the origin of this space, which represents the mean shape. Also, PCA is a traditional way of fitting a closed ellipsoid to a Gaussian cloud of points (whatever their dimension): this suggests the concept of bounded variation.
The key idea of PDMs is that the eigenvectors can be linearly combined to create an infinity of new shape instances that will 'look like' the ones in the training set. The coefficients are bounded according to the corresponding eigenvalues, so as to ensure that the generated <math>2k</math>- or <math>3k</math>-dimensional point remains inside the hyper-ellipsoidal allowed ___domain, the [[Allowable Shape Domain]] (ASD).<ref name=taylor/>
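A simple way to enforce such a constraint is sketched below (illustrative only; it approximates the hyper-ellipsoid by a per-mode box of ±3 standard deviations and reuses the hypothetical variables from the examples above):

<syntaxhighlight lang="python">
import numpy as np

def constrain_to_asd(shape, mean_shape, P, lambdas, n_std=3.0):
    """Project a shape onto the model and clamp its coefficients.

    The allowable shape ___domain is approximated here by a box of
    +/- n_std standard deviations per mode, a common simplification
    of the hyper-ellipsoidal bound.
    """
    b = P.T @ (shape - mean_shape)        # model coordinates of the shape
    limit = n_std * np.sqrt(lambdas)
    b = np.clip(b, -limit, limit)         # enforce the per-mode bound
    return mean_shape + P @ b             # closest allowable shape in the model subspace
</syntaxhighlight>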
==See also==▼
* [[Procrustes analysis]]▼
==References==
<references/>
<!-- Not referenced {{citation
|authors = Stegmann, M. B. and Gomez, D. D.
|title = A Brief Introduction to Statistical Shape Analysis
|year = 2002
|url=http://www2.imm.dtu.dk/pubdb/views/publication_details.php?id=403
|quote=Images, annotations and data reports are placed in the enclosed zip-file.
|name-list-style=amp }} -->
==External links==
* [https://web.archive.org/web/20080509041813/http://www.isbe.man.ac.uk/~bim/Models/index.html Flexible Models for Computer Vision], Tim Cootes, Manchester University.
* [http://www.icaen.uiowa.edu/~dip/LECTURE/Understanding3.html A practical introduction to PDM and ASMs].