Functional regression
where <math>X^c(\cdot) = X(\cdot) - \mathbb{E}(X(\cdot))</math> is still the centered functional covariate, <math>\beta_0(\cdot)</math> and <math>\beta(\cdot,\cdot)</math> are coefficient functions, and <math>\varepsilon(\cdot)</math> is usually assumed to be a random process with mean zero and finite variance. In this case, at any given time <math>t\in\mathcal{T}</math>, the value of <math>Y</math>, i.e., <math>Y(t)</math>, depends on the entire trajectory of <math>X</math>. Model ({{EquationNote|4}}), for any given time <math>t</math>, is an extension of [[multivariate linear regression]] with the inner product in Euclidean space replaced by that in <math>L^2</math>. An estimating equation motivated by multivariate linear regression is
<math display="block">r_{XY} = R_{XX}\beta, \text{ for } \beta\in L^2(\mathcal{S}\times\mathcal{S}),</math>
where <math>r_{XY}(s,t) = \text{cov}(X(s),Y(t))</math>, <math>R_{XX}: L^2(\mathcal{S}\times\mathcal{S}) \rightarrow L^2(\mathcal{S}\times\mathcal{T})</math> is defined as <math>(R_{XX}\beta)(s,t) = \int_\mathcal{S} r_{XX}(s,w)\beta(w,t)dw</math> with <math>r_{XX}(s,w) = \text{cov}(X(s),X(w))</math> for <math>s,w\in\mathcal{S}</math>.<ref name=wang:16/> Regularization is needed and can be done through truncation, <math>L^2</math> penalization or <math>L^1</math> penalization.<ref name=morr:15/> Various estimation methods for model ({{EquationNote|4}}) are available.<ref>{{Cite journal |last=Ramsay |first=J. O. |last2=Dalzell |first2=C. J. |date=1991 |title=Some Tools for Functional Data Analysis |url=https://www.jstor.org/stable/2345586 |journal=Journal of the Royal Statistical Society. Series B (Methodological) |volume=53 |issue=3 |pages=539–572 |issn=0035-9246}}</ref><ref>{{Cite journal |last=Yao |first=F. |last2=Müller |first2=H.-G. |last3=Wang |first3=J.-L. |date=2005 |title=Functional linear regression analysis for longitudinal data |journal=The Annals of Statistics |volume=33 |issue=6 |pages=2873–2903 |doi=10.1214/009053605000000660}}</ref>

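As a concrete illustration of truncation-based regularization, the estimating equation can be solved numerically by discretizing both covariances on a grid and inverting the empirical covariance operator only on its leading eigen-directions. The following is a minimal sketch under assumed settings; the grid sizes, sample size, synthetic covariate process, and true coefficient surface are all hypothetical choices made for this example, not part of the cited estimation literature.

```python
# Hypothetical numerical sketch of truncation-based estimation of beta(s, t)
# from the discretized estimating equation r_XY = R_XX beta.
import numpy as np

rng = np.random.default_rng(0)
n, p, q = 300, 50, 40                 # samples; grid sizes for s and t (assumed)
s = np.linspace(0, 1, p)
t = np.linspace(0, 1, q)
ds = s[1] - s[0]

# Synthetic smooth covariate curves X_i(s): a 4-term random sine expansion.
X = sum(rng.standard_normal((n, 1)) / (k + 1) * np.sin((k + 1) * np.pi * s)
        for k in range(4))
beta_true = np.outer(np.sin(np.pi * s), np.cos(np.pi * t))  # assumed surface

# Y_i(t) = \int X_i(s) beta(s, t) ds + noise  (Riemann-sum approximation).
Y = X @ beta_true * ds + 0.05 * rng.standard_normal((n, q))

Xc, Yc = X - X.mean(0), Y - Y.mean(0)
r_XX = Xc.T @ Xc / n                  # discretized cov(X(s), X(w))
r_XY = Xc.T @ Yc / n                  # discretized cov(X(s), Y(t))

# Truncation: invert r_XX only on its leading eigen-directions, since the
# inverse problem is ill-posed without regularization.
vals, vecs = np.linalg.eigh(r_XX)
keep = vals > 1e-3 * vals.max()
beta_hat = vecs[:, keep] @ np.diag(1 / vals[keep]) @ vecs[:, keep].T @ r_XY / ds
```

Here `beta_hat` recovers the component of the coefficient surface lying in the retained eigenspace of the empirical covariance; the truncation threshold trades bias against the variance amplification caused by small eigenvalues.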
When <math>X</math> and <math>Y</math> are concurrently observed, i.e., <math>\mathcal{S}=\mathcal{T}</math>,<ref>{{Cite journal |last=Grenander |first=U. |date=1950 |title=Stochastic processes and statistical inference |journal=Arkiv för Matematik |volume=1 |issue=3 |pages=195–277 |doi=10.1007/BF02590638}}</ref> it is reasonable to consider a historical functional linear model, in which the current value of <math>Y</math> depends only on the history of <math>X</math>, i.e., <math>\beta(s,t)=0</math> for <math>s>t</math> in model ({{EquationNote|4}}).<ref name=wang:16/><ref>{{Cite journal |last=Malfait |first=N. |last2=Ramsay |first2=J. O. |date=2003 |title=The historical functional linear model |journal=Canadian Journal of Statistics |volume=31 |issue=2 |pages=115–128 |doi=10.2307/3316063}}</ref> A simpler version of the historical functional linear model is the functional concurrent model (see below).

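Under this constraint, taking <math>\mathcal{S}=\mathcal{T}=[0,T]</math>, model ({{EquationNote|4}}) reduces to
<math display="block">Y(t) = \beta_0(t) + \int_0^t \beta(s,t)X^c(s)\,ds + \varepsilon(t), \text{ for } t\in[0,T],</math>
so that only past and current values of <math>X</math> enter the prediction of <math>Y(t)</math>.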
With multiple functional covariates, model ({{EquationNote|4}}) can be extended to