A related algorithm is the '''[[Meila–Shi algorithm]]''',<ref>Marina Meilă & Jianbo Shi, "[http://www.citeulike.org/user/mpotamias/article/498897 Learning Segmentation by Random Walks]", Neural Information Processing Systems 13 (NIPS 2000), 2001, pp. 873–879.</ref> which takes the [[eigenvector]]s corresponding to the ''k'' largest [[eigenvalue]]s of the matrix <math>P = D^{-1}S</math> for some ''k'', and then invokes another algorithm (e.g. [[k-means clustering]]) to cluster points by their respective ''k'' components in these eigenvectors.
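A minimal illustration of this procedure (assuming a dense symmetric similarity matrix <math>S</math> as input; the function name and toy data below are hypothetical and not taken from the original paper) might look as follows:

<syntaxhighlight lang="python">
# Minimal sketch (not the authors' code) of random-walk spectral clustering,
# assuming a dense symmetric similarity matrix S as input.
import numpy as np
from sklearn.cluster import KMeans

def meila_shi_clustering(S, k):
    """Embed points by the k leading eigenvectors of P = D^{-1} S, then run k-means."""
    d = S.sum(axis=1)                       # node degrees (diagonal of D)
    P = S / d[:, None]                      # row-normalised random-walk matrix P = D^{-1} S
    eigvals, eigvecs = np.linalg.eig(P)     # P is generally non-symmetric
    order = np.argsort(-eigvals.real)       # indices of the k largest eigenvalues
    X = eigvecs[:, order[:k]].real          # each row: a point's k eigenvector components
    return KMeans(n_clusters=k, n_init=10).fit_predict(X)

# Toy usage: two obvious blocks of mutually similar points.
S = np.array([[1.0, 0.9, 0.1, 0.0],
              [0.9, 1.0, 0.0, 0.1],
              [0.1, 0.0, 1.0, 0.8],
              [0.0, 0.1, 0.8, 1.0]])
print(meila_shi_clustering(S, k=2))         # e.g. [0 0 1 1] (label order may differ)
</syntaxhighlight>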
For large graphs, the second eigenvalue of the (normalized) graph [[Laplacian matrix]] is often [[ill-conditioned]], leading to slow convergence of iterative eigenvalue solvers. [[Preconditioner#Preconditioning_for_eigenvalue_problems|Preconditioning]] is a key technique for accelerating convergence, e.g., in the matrix-free [[LOBPCG]] method.
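As a sketch of how such a preconditioned, matrix-free solver can be used, the following assumes a sparse symmetric similarity matrix and applies SciPy's <code>lobpcg</code> to the normalized Laplacian; the diagonal preconditioner is only a placeholder showing where a stronger preconditioner (e.g., algebraic multigrid) would be supplied:

<syntaxhighlight lang="python">
# Illustrative sketch (assumed setup, not a reference implementation): computing the
# lowest eigenvectors of the normalized Laplacian with SciPy's matrix-free LOBPCG.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import LinearOperator, lobpcg

def laplacian_eigenvectors(A, k):
    """Eigenvectors of L = I - D^{-1/2} A D^{-1/2} for the k smallest eigenvalues."""
    n = A.shape[0]
    d = np.asarray(A.sum(axis=1)).ravel()                 # node degrees
    D_inv_sqrt = sp.diags(1.0 / np.sqrt(d))
    L = sp.identity(n) - D_inv_sqrt @ A @ D_inv_sqrt      # normalized graph Laplacian
    # Diagonal (Jacobi) preconditioner; in practice a multigrid or incomplete-factorization
    # preconditioner would be passed here instead.
    diag = L.diagonal()
    M = LinearOperator((n, n), matvec=lambda x: x.ravel() / diag)
    X = np.random.default_rng(0).standard_normal((n, k))  # random initial block of vectors
    eigvals, eigvecs = lobpcg(L, X, M=M, largest=False, tol=1e-8, maxiter=200)
    return eigvals, eigvecs

# Usage on a path graph with 100 nodes (adjacency from the two off-diagonals).
A = sp.diags([np.ones(99), np.ones(99)], [-1, 1], format="csr")
vals, vecs = laplacian_eigenvectors(A, k=3)
print(vals)   # smallest eigenvalues; the first is close to 0
</syntaxhighlight>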
Spectral clustering is closely related to [[nonlinear dimensionality reduction]], and dimension reduction techniques such as locally linear embedding can be used to reduce errors from noise or outliers.<ref>{{Citation
|