=== Generalized multidimensional scaling (GMDS) ===
{{main|Generalized multidimensional scaling}}
An extension of metric multidimensional scaling, in which the target space is an arbitrary smooth non-Euclidean space. In cases where the dissimilarities are distances on a surface and the target space is another surface, GMDS allows finding the minimum-distortion embedding of one surface into another.<ref name="bron">{{cite journal |vauthors=Bronstein AM, Bronstein MM, Kimmel R |title=Generalized multidimensional scaling: a framework for isometry-invariant partial surface matching |journal=Proc. Natl. Acad. Sci. U.S.A. |volume=103 |issue=5 |pages=1168–72 |date=January 2006 |pmid=16432211 |pmc=1360551 |doi=10.1073/pnas.0508601103 |bibcode=2006PNAS..103.1168B |doi-access=free }}</ref>
 
=== Super multidimensional scaling (SMDS) ===
 
An extension of MDS, known as Super MDS, incorporates angle as well as distance information for improved source localization. Unlike traditional MDS, which uses only distance measurements, Super MDS processes distance and angle-of-arrival (AOA) data jointly and algebraically (without iteration) to achieve better accuracy.<ref>{{cite conference |last1=de Abreu |first1=G. T. F. |last2=Destino |first2=G. |title=Super MDS: Source Location from Distance and Angle Information |conference=2007 IEEE Wireless Communications and Networking Conference |___location=Hong Kong, China |pages=4430–4434 |year=2007 |doi=10.1109/WCNC.2007.807}}</ref>
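If <math>d_i = \|v_i\|</math> and <math>d_j = \|v_j\|</math> denote the measured lengths of two edges and <math>\theta_{i,j}</math> the angle between them (obtained, for example, from the AOA measurements), then the kernel entry used in the construction below is simply <math>k_{i,j} = \langle v_i, v_j \rangle = d_i\, d_j \cos\theta_{i,j}</math>; this identity is how the angle information enters the problem algebraically.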
 
The method proceeds in the following steps:
 
# '''Construct the Reduced Edge Gram Kernel:''' For a network of <math>N</math> sources in an <math>\eta</math>-dimensional space, define an edge vector for each measured pair of nodes <math>(m, n)</math>, <math>v_{i} = x_{m} - x_{n}</math>. The kernel entries are the inner products <math>k_{i,j} = \langle v_i, v_j \rangle</math>. Assemble these into the full kernel <math>K = VV^T</math>, where the rows of <math>V</math> are the edge vectors, and then form the reduced kernel from the <math>N-1</math> linearly independent edge vectors: <math>\bar{K} = [V]_{(N-1)\times\eta}\, [V]_{(N-1)\times\eta}^T</math>,
# '''Eigen-Decomposition:''' Compute the eigen-decomposition of <math>\bar{K}</math>,
# '''Estimate Edge Vectors:''' Recover the edge vectors, up to an orthogonal transformation, as <math> \hat{V} = U_{(N-1) \times \eta}\, \Lambda^{\odot \frac{1}{2}}_{\eta \times \eta} </math>,
# '''Procrustes Alignment:''' Resolve the remaining orthogonal ambiguity by aligning <math>\hat{V}</math> with the edge vectors known from the anchor positions via a Procrustes transformation,
# '''Compute Coordinates:''' Solve the following linear equations to compute the coordinate estimates: <math>
\begin{pmatrix}
\begin{matrix} 1 & \mathbf{0}_{1 \times (N-1)} \end{matrix} \\
[\mathbf{C}]_{(N-1) \times N}
\end{pmatrix} \cdot \begin{pmatrix}
\mathbf{x}_{1} \\
[\mathbf{X}]_{(N-1) \times \eta}
\end{pmatrix} = \begin{pmatrix}
\mathbf{x}_{1} \\
[\mathbf{V}]_{(N-1) \times \eta}
\end{pmatrix}.
</math>
 
By leveraging angle constraints in addition to distances, this closed-form approach reduces the number of anchors required and improves localization accuracy.
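Below is a minimal numerical sketch of the steps above, assuming a noise-free two-dimensional network (<math>\eta = 2</math>) in which the <math>N-1</math> independent edge vectors all emanate from the first node and the first three nodes act as anchors. The kernel is assembled directly from the edge vectors rather than from raw distance/AOA measurements, so the sketch illustrates the algebra of the method rather than its measurement model; all variable names are illustrative.

<syntaxhighlight lang="python">
import numpy as np

# Noise-free illustration of the Super MDS steps in 2-D (eta = 2).
# Nodes 0, 1 and 2 are treated as anchors with known positions.
rng = np.random.default_rng(0)
eta, N = 2, 6
X_true = rng.uniform(0.0, 10.0, size=(N, eta))           # true node coordinates

# Step 1: reduced edge Gram kernel. The N-1 independent edge vectors all start at
# node 0; in practice each entry <v_i, v_j> would be assembled from measured edge
# lengths and inter-edge angles as k_ij = d_i * d_j * cos(theta_ij).
V_true = X_true[1:] - X_true[0]                           # (N-1) x eta edge vectors
K_bar = V_true @ V_true.T                                 # (N-1) x (N-1) reduced kernel

# Step 2: eigen-decomposition of the reduced kernel.
w, U = np.linalg.eigh(K_bar)
order = np.argsort(w)[::-1][:eta]                         # keep the eta largest eigenvalues
U, w = U[:, order], w[order]

# Step 3: edge-vector estimates, known only up to an orthogonal transformation.
V_hat = U * np.sqrt(w)                                    # (N-1) x eta

# Step 4: Procrustes alignment using the two edges joining the anchors
# (x_1 - x_0 and x_2 - x_0), which are known from the anchor positions.
A, B = V_hat[:2], V_true[:2]
Uo, _, Vt = np.linalg.svd(A.T @ B)
V_aligned = V_hat @ (Uo @ Vt)                             # rotation/reflection fix

# Step 5: stack the anchor constraint on x_0 with C @ X = V and solve for all coordinates.
C = np.hstack([-np.ones((N - 1, 1)), np.eye(N - 1)])      # C @ X_full = V
A_sys = np.vstack([np.eye(1, N), C])                      # first row pins node 0
b_sys = np.vstack([X_true[:1], V_aligned])
X_est = np.linalg.solve(A_sys, b_sys)

print(np.allclose(X_est, X_true))                         # True in the noise-free case
</syntaxhighlight>

In this noise-free setting the recovered coordinates match the true ones exactly; with noisy range and angle measurements, the Procrustes step and the final linear solve would typically be carried out as least-squares fits.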
 
==Details==