Denoting the number of data points by <math>n</math>, it is important to estimate the memory footprint and compute time, i.e., the number of arithmetic operations (AO) performed, as a function of <math>n</math>. Regardless of the specific spectral clustering algorithm, the two most costly steps are the construction of the graph Laplacian and the computation of its <math>k</math> eigenvectors for the spectral embedding. The last step, determining the labels from the <math>n</math>-by-<math>k</math> matrix of eigenvectors, is typically the least expensive, requiring only <math>kn</math> AO and creating just an <math>n</math>-by-<math>1</math> vector of labels in memory.
The need to construct the graph Laplacian is common to all distance- or correlation-based clustering methods, whereas computing the eigenvectors is specific to spectral clustering.
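A minimal dense-matrix sketch in Python can make these costs concrete; it is an illustration only, assuming a Gaussian affinity with a hypothetical width parameter <code>sigma</code> and the unnormalized Laplacian, not a prescription of any particular algorithm. The comments mark where the memory footprint and the AO count grow with <math>n</math>.

<syntaxhighlight lang="python">
import numpy as np
from scipy.linalg import eigh
from sklearn.cluster import KMeans

def spectral_clustering_dense(X, k, sigma=1.0):
    """Naive dense spectral clustering sketch (illustrative only):
    shows where memory and arithmetic costs scale with the number
    of points n. `sigma` is an assumed Gaussian-kernel width."""
    n = X.shape[0]

    # 1. Graph Laplacian: the pairwise affinity matrix alone takes
    #    O(n^2) memory and O(n^2 d) AO for d-dimensional points.
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    W = np.exp(-sq_dists / (2.0 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    degrees = W.sum(axis=1)
    L = np.diag(degrees) - W          # unnormalized graph Laplacian

    # 2. Spectral embedding: a full dense eigensolver costs O(n^3) AO;
    #    only the k eigenvectors of the smallest eigenvalues are kept,
    #    giving an n-by-k matrix.
    _, vecs = eigh(L, subset_by_index=[0, k - 1])

    # 3. Label assignment on the n-by-k embedding is the cheapest step,
    #    linear in n, and returns just an n-by-1 vector of labels.
    return KMeans(n_clusters=k, n_init=10).fit_predict(vecs)
</syntaxhighlight>

In practice, sparse nearest-neighbor affinities and iterative eigensolvers are used precisely to avoid the <math>n^2</math> memory and <math>n^3</math> compute scaling shown by this dense sketch.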