=== Canonical-correlation analysis based methods ===
[[Canonical correlation|Canonical-correlation analysis]] (CCA) was first introduced in 1936 by [[Harold Hotelling]]<ref>{{Cite journal |last=Hotelling |first=H. |date=1936-12-01 |title=Relations Between Two Sets of Variates |url=https://academic.oup.com/biomet/article-lookup/doi/10.1093/biomet/28.3-4.321 |journal=Biometrika |language=en |volume=28 |issue=3–4 |pages=321–377 |doi=10.1093/biomet/28.3-4.321 |issn=0006-3444|url-access=subscription }}</ref> and is a fundamental approach for multimodal learning. CCA aims to find linear relationships between two sets of variables. Given two data [[matrices]] <math>X \in \mathbb{R}^{n \times p} </math> and <math>Y \in \mathbb{R}^{n \times q}</math> representing different modalities, CCA finds projection vectors <math>w_x\in\mathbb{R}^p
</math> and <math>w_y\in\mathbb{R}^q </math> that maximize the correlation between the projected variables:
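<math display="block">(w_x^*, w_y^*) = \underset{w_x,\, w_y}{\operatorname{arg\,max}}\; \frac{w_x^\top \Sigma_{XY} w_y}{\sqrt{w_x^\top \Sigma_{XX} w_x}\, \sqrt{w_y^\top \Sigma_{YY} w_y}},</math>

where <math>\Sigma_{XX}</math> and <math>\Sigma_{YY}</math> denote the covariance matrices of the two views and <math>\Sigma_{XY}</math> their cross-covariance. The projections <math>Xw_x^*</math> and <math>Yw_y^*</math> form the first pair of canonical variables; subsequent pairs are obtained under orthogonality constraints with respect to the earlier ones.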
</math> memory requirement for storing kernel matrices.
KCCA was proposed independently by several researchers.<ref>{{Cite journal |last=Lai |first=P |date=October 2000 |title=Kernel and Nonlinear Canonical Correlation Analysis |url=http://linkinghub.elsevier.com/retrieve/pii/S012906570000034X |journal=International Journal of Neural Systems |volume=10 |issue=5 |pages=365–377 |doi=10.1016/S0129-0657(00)00034-X|pmid=11195936 |url-access=subscription }}</ref><ref>{{Cite web |title=Kernel Independent Component Analysis {{!}} EECS at UC Berkeley |url=https://www2.eecs.berkeley.edu/Pubs/TechRpts/2001/5721.html |access-date=2025-04-16 |website=www2.eecs.berkeley.edu}}</ref><ref>{{Cite book |last1=Dorffner |first1=Georg |title=Artificial Neural Networks -- ICANN 2001: International Conference Vienna, Austria, August 21-25, 2001 Proceedings |last2=Bischof |first2=Horst |last3=Hornik |first3=Kurt |date=2001 |publisher=Springer-Verlag Berlin Heidelberg Springer e-books |isbn=978-3-540-44668-2 |series=Lecture Notes in Computer Science |___location=Berlin, Heidelberg}}</ref><ref>{{Citation |last=Akaho |first=Shotaro |title=A kernel method for canonical correlation analysis |date=2007-02-14 |arxiv=cs/0609071 |id=arXiv:cs/0609071}}</ref>
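KCCA replaces the linear projections with weight vectors in kernel-induced feature spaces, which reduces to a regularized eigenproblem over the two kernel matrices. The following NumPy sketch illustrates one common regularized formulation; the RBF kernel, the bandwidth <code>gamma</code>, and the ridge term <code>reg</code> are illustrative choices, not taken from the cited papers:

```python
import numpy as np

def rbf_kernel(X, gamma):
    """Gaussian (RBF) kernel matrix for the rows of X."""
    sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq_dists)

def center(K):
    """Center a kernel matrix in feature space."""
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return H @ K @ H

def kcca(X, Y, gamma=1.0, reg=0.1):
    """Leading canonical pair of a regularized KCCA eigenproblem
    (illustrative sketch, not an optimized implementation)."""
    Kx = center(rbf_kernel(X, gamma))
    Ky = center(rbf_kernel(Y, gamma))
    n = Kx.shape[0]
    Rx = Kx + reg * np.eye(n)   # ridge regularization keeps the
    Ry = Ky + reg * np.eye(n)   # eigenproblem well-posed
    # Solve (Kx + reg I)^-1 Ky (Ky + reg I)^-1 Kx alpha = rho^2 alpha
    M = np.linalg.solve(Rx, Ky) @ np.linalg.solve(Ry, Kx)
    vals, vecs = np.linalg.eig(M)
    alpha = vecs.real[:, np.argmax(vals.real)]
    beta = np.linalg.solve(Ry, Kx @ alpha)
    return alpha, beta
```

The canonical projections are then <code>Kx @ alpha</code> and <code>Ky @ beta</code>; note that the full <math>n \times n</math> kernel matrices are what make the memory cost quadratic in the number of samples.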
==== Deep CCA ====
==== Alternating diffusion ====
Alternating diffusion based methods provide another strategy for multimodal representation learning by focusing on extracting the common underlying sources of variability present across multiple views or sensors. These methods aim to filter out sensor-specific or nuisance components, assuming that the phenomenon of interest is captured by two or more sensors. The core idea involves constructing an alternating diffusion operator by sequentially applying diffusion processes derived from each modality, typically through their product or intersection. This process allows the method to capture the structure related to common hidden variables that drive the observed multimodal data.<ref>{{Cite journal |last1=Katz |first1=Ori |last2=Talmon |first2=Ronen |last3=Lo |first3=Yu-Lun |last4=Wu |first4=Hau-Tieng |date=January 2019 |title=Alternating diffusion maps for multimodal data fusion |url=https://linkinghub.elsevier.com/retrieve/pii/S1566253517300192 |journal=Information Fusion |language=en |volume=45 |pages=346–360 |doi=10.1016/j.inffus.2018.01.007|url-access=subscription }}</ref>
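The construction can be sketched as follows; the Gaussian affinity with bandwidth <code>eps</code> and the eigenvector embedding are common but illustrative choices, not specific to the cited paper:

```python
import numpy as np

def diffusion_operator(X, eps):
    """Row-stochastic diffusion operator from a Gaussian affinity kernel."""
    sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    W = np.exp(-sq_dists / eps)
    return W / W.sum(axis=1, keepdims=True)

def alternating_diffusion(X1, X2, eps=1.0, k=2):
    """Compose the per-modality diffusion operators and embed the samples
    with the leading nontrivial eigenvectors of the product."""
    K = diffusion_operator(X1, eps) @ diffusion_operator(X2, eps)
    vals, vecs = np.linalg.eig(K)
    order = np.argsort(-vals.real)
    # order[0] is the trivial constant eigenvector (eigenvalue 1);
    # the next k eigenvectors parameterize the common structure
    return vecs.real[:, order[1:k + 1]]
```

Because each factor is row-stochastic, their product is again a Markov (diffusion) operator, but one whose transitions must be plausible under both modalities, which is what suppresses variability observed by only one sensor.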
== See also ==