===Non-metric multidimensional scaling (NMDS)===
In contrast to metric MDS, non-metric MDS finds both a [[non-parametric]] [[monotonic]] relationship between the dissimilarities in the item-item matrix and the Euclidean distances between items, and the ___location of each item in the low-dimensional space.
Let <math>d_{ij}</math> denote the dissimilarity between items <math>i</math> and <math>j</math>, and let <math>\hat d_{ij} = \|x_i - x_j\|</math> be the Euclidean distance between the corresponding embedded points. Now, for each choice of the embedded points <math>x_i</math> and a monotonically increasing function <math>f</math>, define the "stress" function:
:<blockquote><math>S(x_1, \ldots, x_n; f)=\sqrt{\frac{\sum_{i<j}\bigl(f(d_{ij})-\hat d_{ij}\bigr)^2}{\sum_{i<j} \hat d_{ij}^2}}.</math></blockquote>
The factor of <math>\sum_{i<j} \hat d_{ij}^2</math> in the denominator is necessary to prevent a "collapse". If we instead defined <math>S=\sqrt{\sum_{i<j}\bigl(f(d_{ij})-\hat d_{ij}\bigr)^2}</math>, the stress could be trivially minimized by setting <math>f = 0</math> and collapsing every point to the same location.
A few variants of this cost function exist. MDS programs automatically minimize stress in order to obtain the MDS solution.
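As a concrete illustration of the formula above, the following sketch computes the stress of a small embedding with NumPy, taking <math>f</math> to be the identity (the data and variable names are illustrative, not from any particular program):

```python
import numpy as np

# Dissimilarities d_ij between three items (symmetric, zero diagonal)
d = np.array([[0.0, 1.0, 2.0],
              [1.0, 0.0, 1.0],
              [2.0, 1.0, 0.0]])
# A candidate 2-D embedding x_i of the three items
x = np.array([[0.0, 0.0],
              [1.0, 0.0],
              [2.0, 0.0]])

# Euclidean distances \hat d_ij between the embedded points
dhat = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)

# Sum over pairs with i < j only, as in the formula
i, j = np.triu_indices(len(x), k=1)
stress = np.sqrt(np.sum((d[i, j] - dhat[i, j]) ** 2) / np.sum(dhat[i, j] ** 2))
print(stress)  # 0.0: this embedding reproduces the dissimilarities exactly
```

Here the embedding reproduces every dissimilarity, so the stress is zero; perturbing any point makes it positive.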
The core of a non-metric MDS algorithm is a twofold optimization process. First the optimal monotonic transformation of the proximities has to be found. Secondly, the points of a configuration have to be optimally arranged, so that their distances match the scaled proximities as closely as possible.
NMDS needs to optimize two objectives simultaneously. This is usually done iteratively:
:# Find a random configuration of points, e.g. by sampling from a normal distribution.
:# Do until a stopping criterion is met (for example, <math>S < \epsilon</math>):
:## Solve for <math>f = \arg\min_f S(x_1, ..., x_n ; f)</math> by [[isotonic regression]].
:## Solve for <math>x_1, ..., x_n = \arg\min_{x_1, ..., x_n} S(x_1, ..., x_n ; f)</math> by gradient descent or other methods.
:# Return <math>x_i</math> and <math>f</math>.
[[Louis Guttman]]'s smallest space analysis (SSA) is an example of a non-metric MDS procedure.