A number of variants of PLS exist for estimating the factor and loading matrices {{mvar|T, U, P}} and {{mvar|Q}}. Most of them construct estimates of the linear regression between {{mvar|X}} and {{mvar|Y}} as <math>Y = X \tilde{B} + \tilde{B}_0</math>. Some PLS algorithms are only appropriate for the case where {{mvar|Y}} is a column vector, while others deal with the general case of a matrix {{mvar|Y}}. Algorithms also differ on whether they estimate the factor matrix {{mvar|T}} as an orthogonal (that is, [[orthonormal matrix|orthonormal]]) matrix or not.<ref>
{{cite journal |last1=Lindgren |first1=F |last2=Geladi |first2=P |last3=Wold |first3=S |title=The kernel algorithm for PLS |journal=J. Chemometrics |volume=7 |pages=45–59 |year=1993 |doi=10.1002/cem.1180070104 |s2cid=122950427 }}</ref><ref>{{cite journal |last1=de Jong |first1=S. |last2=ter Braak |first2=C.J.F. |title=Comments on the PLS kernel algorithm |journal=J. Chemometrics |volume=8 |issue=2 |pages=169–174 |year=1994 |doi=10.1002/cem.1180080208 }}</ref><ref>{{cite journal |last1=Dayal |first1=B.S. |last2=MacGregor |first2=J.F. |title=Improved PLS algorithms |journal=J. Chemometrics |volume=11 |issue=1 |pages=73–85 |year=1997 |doi=10.1002/(SICI)1099-128X(199701)11:1<73::AID-CEM435>3.0.CO;2-# }}</ref><ref>{{cite journal |last=de Jong |first=S. |title=SIMPLS: an alternative approach to partial least squares regression |journal=Chemometrics and Intelligent Laboratory Systems |volume=18 |pages=251–263 |year=1993 |doi=10.1016/0169-7439(93)85002-X |issue=3 }}</ref><ref>{{cite journal |last1=Rannar |first1=S. |last2=Lindgren |first2=F. |last3=Geladi |first3=P. |last4=Wold |first4=S. |title=A PLS Kernel Algorithm for Data Sets with Many Variables and Fewer Objects. Part 1: Theory and Algorithm |journal=J. Chemometrics |volume=8 |issue=2 |pages=111–125 |year=1994 |doi=10.1002/cem.1180080204 |s2cid=121613293 }}</ref><ref>{{cite journal |last=Abdi |first=H. |title=Partial least squares regression and projection on latent structure regression (PLS-Regression) |journal=Wiley Interdisciplinary Reviews: Computational Statistics |volume=2 |pages=97–106 |year=2010 |doi=10.1002/wics.51 }}</ref>
The final prediction will be the same for all these varieties of PLS, but the components will differ.
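For illustration, a fit with one such variety, scikit-learn's NIPALS-based <code>PLSRegression</code> (the data and the choice of library here are incidental; any PLS implementation estimates <math>\tilde{B}</math> and <math>\tilde{B}_0</math> in the same role):

<syntaxhighlight lang="python">
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))                                      # predictors
Y = X @ rng.normal(size=(10, 2)) + 0.1 * rng.normal(size=(100, 2))  # responses

pls = PLSRegression(n_components=3)  # one (NIPALS-based) variety of PLS
pls.fit(X, Y)
Y_hat = pls.predict(X)               # predictions from the fitted model Y = X B~ + B~0
</syntaxhighlight>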
 
 
This form of the algorithm does not require centering of the input {{mvar|X}} and {{mvar|y}}, as this is performed implicitly by the algorithm.
This algorithm features 'deflation' of the matrix {{mvar|X}} (subtraction of <math>t_k t^{(k)} {p^{(k)}}^\mathrm{T}</math>), but deflation of the vector {{mvar|y}} is not performed, as it is not necessary (it can be proved that deflating {{mvar|y}} yields the same results as not deflating<ref>{{cite journal |last1=Höskuldsson |first1=Agnar |title=PLS Regression Methods |journal=Journal of Chemometrics |date=1988 |volume=2 |issue=3 |page=219 |doi=10.1002/cem.1180020306 |s2cid=120052390 }}</ref>). The user-supplied variable {{mvar|l}} is the limit on the number of latent factors in the regression; if it equals the rank of the matrix {{mvar|X}}, the algorithm will yield the least squares regression estimates for {{mvar|B}} and <math>B_0</math>.
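A minimal NumPy sketch of this deflation scheme (illustrative only: it assumes column-centred {{mvar|X}} and {{mvar|y}}, so the implicit centering discussed above is omitted and <math>B_0 = 0</math>):

<syntaxhighlight lang="python">
import numpy as np

def pls1(X, y, l):
    """PLS1 sketch: at most l latent factors; X is deflated, y is not."""
    Xk = X.copy()
    W, P, q = [], [], []
    w = Xk.T @ y
    w = w / np.linalg.norm(w)
    for k in range(l):
        t = Xk @ w
        tk = float(t @ t)                  # scalar normaliser t_k
        t = t / tk
        p = Xk.T @ t                       # loading vector p^(k)
        qk = float(y @ t)
        if qk == 0:                        # no covariance with y remains
            break
        W.append(w); P.append(p); q.append(qk)
        if k < l - 1:
            Xk = Xk - tk * np.outer(t, p)  # deflate X; y is left intact
            w = Xk.T @ y
            norm_w = np.linalg.norm(w)
            if norm_w == 0:                # X fully deflated (rank reached)
                break
            w = w / norm_w
    W, P = np.column_stack(W), np.column_stack(P)
    return W @ np.linalg.solve(P.T @ W, np.array(q))  # coefficient vector B
</syntaxhighlight>

If <code>l</code> equals the rank of {{mvar|X}}, the returned coefficients coincide with the ordinary least squares solution, as noted above.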
 
==Extensions==
In 2002, a new method called orthogonal projections to latent structures (OPLS) was published. In OPLS, continuous variable data is separated into predictive and uncorrelated (orthogonal) information. This leads to improved diagnostics, as well as more easily interpreted visualization. However, these changes only improve the interpretability, not the predictivity, of the PLS models.<ref>{{Cite journal
| last1 = Trygg
| first1 = J
| last2 = Wold
| first2 = S
| title = Orthogonal projections to latent structures (O-PLS)
| journal = Journal of Chemometrics
| volume = 16
| issue = 3
| pages = 119–128
| year = 2002
| doi = 10.1002/cem.695
| s2cid = 122699039
}}
</ref> L-PLS extends PLS regression to 3 connected data blocks.<ref>{{cite journal |last1=Sæbø |first1=S. |last2=Almøy |first2=T. |last3=Flatberg |first3=A. |last4=Aastveit |first4=A.H. |last5=Martens |first5=H. |title=LPLS-regression: a method for prediction and classification under the influence of background information on predictor variables |journal=Chemometrics and Intelligent Laboratory Systems |volume=91 |issue=2 |pages=121–132 |year=2008 |doi=10.1016/j.chemolab.2007.10.006 }}</ref> Similarly, OPLS-DA (Discriminant Analysis) may be applied when working with discrete variables, as in classification and biomarker studies.
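The filtering step at the heart of OPLS can be sketched for a single response and a single orthogonal component (a simplified illustration assuming centred data, not the published implementation):

<syntaxhighlight lang="python">
import numpy as np

def opls_filter(X, y):
    """Remove one y-orthogonal component from X (sketch; X, y centred)."""
    w = X.T @ y
    w = w / np.linalg.norm(w)            # predictive weight vector
    t = X @ w                            # predictive scores
    p = X.T @ t / (t @ t)                # loading vector
    w_o = p - (w @ p) * w                # part of p orthogonal to w
    w_o = w_o / np.linalg.norm(w_o)
    t_o = X @ w_o                        # scores exactly orthogonal to y
    p_o = X.T @ t_o / (t_o @ t_o)
    return X - np.outer(t_o, p_o), t_o   # filtered X and orthogonal scores
</syntaxhighlight>

Ordinary PLS is then run on the filtered matrix; consistent with the remark above, this changes the interpretation of the components but not the prediction.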
 
In 2015 partial least squares was related to a procedure called the three-pass regression filter (3PRF).<ref>{{Cite journal|last1=Kelly|first1=Bryan|last2=Pruitt|first2=Seth|date=2015-06-01|title=The three-pass regression filter: A new approach to forecasting using many predictors|journal=Journal of Econometrics|series=High Dimensional Problems in Econometrics|volume=186|issue=2|pages=294–316|doi=10.1016/j.jeconom.2015.02.011}}</ref> When the number of observations and the number of variables are both large, the 3PRF (and hence PLS) is asymptotically normal for the "best" forecast implied by a linear latent factor model. In stock market data, PLS has been shown to provide accurate out-of-sample forecasts of returns and cash-flow growth.<ref>{{Cite journal|last1=Kelly|first1=Bryan|last2=Pruitt|first2=Seth|date=2013-10-01|title=Market Expectations in the Cross-Section of Present Values|journal=The Journal of Finance|volume=68|issue=5|pages=1721–1756|doi=10.1111/jofi.12060|issn=1540-6261|citeseerx=10.1.1.498.5973}}</ref>
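A sketch of the three passes (the proxies <code>Z</code> are user-supplied; in the simplest case the target itself serves as the proxy, and all fits here are in-sample OLS for brevity):

<syntaxhighlight lang="python">
import numpy as np

def three_prf(X, y, Z):
    """Three-pass regression filter sketch.
    X: (T, N) predictors; y: (T,) target; Z: (T, L) proxies."""
    T, N = X.shape
    ones_T, ones_N = np.ones((T, 1)), np.ones((N, 1))
    # Pass 1: time-series regression of each predictor on the proxies.
    phi = np.linalg.lstsq(np.hstack([ones_T, Z]), X, rcond=None)[0][1:].T
    # Pass 2: cross-section regression of the predictors on the loadings.
    F = np.linalg.lstsq(np.hstack([ones_N, phi]), X.T, rcond=None)[0][1:].T
    # Pass 3: forecasting regression of the target on the estimated factors.
    F1 = np.hstack([ones_T, F])
    beta = np.linalg.lstsq(F1, y, rcond=None)[0]
    return F1 @ beta                     # fitted forecasts of y
</syntaxhighlight>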
 
A PLS version based on [[Singular value decomposition|singular value decomposition (SVD)]] provides a memory-efficient implementation that can be used to address high-dimensional problems, such as relating millions of genetic markers to thousands of imaging features in imaging genetics, on consumer-grade hardware.<ref>{{Cite journal|last1=Lorenzi|first1=Marco|last2=Altmann|first2=Andre|last3=Gutman|first3=Boris|last4=Wray|first4=Selina|last5=Arber|first5=Charles|last6=Hibar|first6=Derrek P.|last7=Jahanshad|first7=Neda|last8=Schott|first8=Jonathan M.|last9=Alexander|first9=Daniel C.|date=2018-03-20|title=Susceptibility of brain atrophy to TRIB3 in Alzheimer's disease, evidence from functional prioritization in imaging genetics|journal=Proceedings of the National Academy of Sciences|volume=115|issue=12|pages=3162–3167|doi=10.1073/pnas.1706100115|issn=0027-8424|pmc=5866534|pmid=29511103|doi-access=free}}</ref>
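The computational idea can be sketched with a matrix-free truncated SVD of the cross-product matrix (an illustration of the general approach, not the authors' implementation; assumes column-centred blocks):

<syntaxhighlight lang="python">
import numpy as np
from scipy.sparse.linalg import LinearOperator, svds

def pls_svd_weights(X, Y, k):
    """Leading k singular triplets of C = X^T Y without ever forming C."""
    n, p = X.shape
    q = Y.shape[1]
    C = LinearOperator((p, q),
                       matvec=lambda v: X.T @ (Y @ v),   # C v
                       rmatvec=lambda u: Y.T @ (X @ u),  # C^T u
                       dtype=X.dtype)
    U, s, Vt = svds(C, k=k)          # truncated SVD; k << min(p, q)
    order = np.argsort(s)[::-1]      # svds returns singular values ascending
    return U[:, order], s[order], Vt[order].T
</syntaxhighlight>

Because the {{mvar|p}} × {{mvar|q}} cross-product matrix is never stored, {{mvar|p}} can run into the millions, as in the imaging genetics application above.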
 
PLS correlation (PLSC) is another methodology related to PLS regression,<ref name=":0">{{Cite journal|last1=Krishnan|first1=Anjali|last2=Williams|first2=Lynne J.|last3=McIntosh|first3=Anthony Randal|last4=Abdi|first4=Hervé|date=May 2011|title=Partial Least Squares (PLS) methods for neuroimaging: A tutorial and review|journal=NeuroImage|volume=56|issue=2|pages=455–475|doi=10.1016/j.neuroimage.2010.07.034|pmid=20656037|s2cid=8796113}}</ref> which has been used in neuroimaging<ref name=":0" /><ref>{{Cite journal|last1=McIntosh|first1=Anthony R.|last2=Mišić|first2=Bratislav|date=2013-01-03|title=Multivariate Statistical Analyses for Neuroimaging Data|journal=Annual Review of Psychology|volume=64|issue=1|pages=499–525|doi=10.1146/annurev-psych-113011-143804|pmid=22804773|issn=0066-4308}}</ref><ref>{{Cite journal|last1=Beggs|first1=Clive B.|last2=Magnano|first2=Christopher|last3=Belov|first3=Pavel|last4=Krawiecki|first4=Jacqueline|last5=Ramasamy|first5=Deepa P.|last6=Hagemeier|first6=Jesper|last7=Zivadinov|first7=Robert|date=2016-05-02|editor-last=de Castro|editor-first=Fernando|title=Internal Jugular Vein Cross-Sectional Area and Cerebrospinal Fluid Pulsatility in the Aqueduct of Sylvius: A Comparative Study between Healthy Subjects and Multiple Sclerosis Patients|journal=PLOS ONE|volume=11|issue=5|pages=e0153960|doi=10.1371/journal.pone.0153960|issn=1932-6203|pmc=4852898|pmid=27135831|bibcode=2016PLoSO..1153960B|doi-access=free}}</ref> and more recently in sport science,<ref>{{Cite journal|last1=Weaving|first1=Dan|last2=Jones|first2=Ben|last3=Ireton|first3=Matt|last4=Whitehead|first4=Sarah|last5=Till|first5=Kevin|last6=Beggs|first6=Clive B.|date=2019-02-14|editor-last=Connaboy|editor-first=Chris|title=Overcoming the problem of multicollinearity in sports performance data: A novel application of partial least squares correlation analysis|journal=PLOS ONE|volume=14|issue=2|pages=e0211776|doi=10.1371/journal.pone.0211776|pmid=30763328|issn=1932-6203|pmc=6375576|bibcode=2019PLoSO..1411776W|doi-access=free}}</ref> to quantify the strength of the relationship between data sets. Typically, PLSC divides the data into two blocks (sub-groups), each containing one or more variables, and then uses [[Singular value decomposition|singular value decomposition (SVD)]] to establish the strength of any relationship (i.e. the amount of shared information) that might exist between the two component sub-groups.<ref name=":1">{{Citation|last1=Abdi|first1=Hervé|last2=Williams|first2=Lynne J.|title=Partial Least Squares Methods: Partial Least Squares Correlation and Partial Least Square Regression|date=2013|work=Computational Toxicology|volume=930|pages=549–579|editor-last=Reisfeld|editor-first=Brad|editor2-last=Mayeno|editor2-first=Arthur N.|publisher=Humana Press|doi=10.1007/978-1-62703-059-5_23|isbn=9781627030588|pmid=23086857}}</ref> It does this by using SVD to determine the inertia (i.e. the sum of the singular values) of the covariance matrix of the sub-groups under consideration.<ref name=":1" /><ref name=":0" />
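A compact sketch of this procedure for two data blocks (variables z-scored, as is typical in PLSC; the names are illustrative):

<syntaxhighlight lang="python">
import numpy as np

def plsc(A, B):
    """PLSC sketch: SVD of the cross-block correlation matrix."""
    Az = (A - A.mean(axis=0)) / A.std(axis=0, ddof=1)  # z-score block 1
    Bz = (B - B.mean(axis=0)) / B.std(axis=0, ddof=1)  # z-score block 2
    R = Az.T @ Bz / (A.shape[0] - 1)                   # cross-block correlations
    U, s, Vt = np.linalg.svd(R, full_matrices=False)
    inertia = s.sum()           # sum of singular values, as described above
    return U, s, Vt.T, inertia  # saliences for each block, and the inertia
</syntaxhighlight>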
 
==See also==