{{Short description|Type of regression analysis}}
'''Functional regression''' is a version of [[regression analysis]] when [[Dependent and independent variables|responses]] or [[Dependent and independent variables|covariates]] include [[Functional data analysis|functional data]]. Functional regression models can be classified into four types depending on whether the responses or covariates are functional or scalar: (i) scalar responses with functional covariates, (ii) functional responses with scalar covariates, (iii) functional responses with functional covariates, and (iv) scalar or functional responses with functional and scalar covariates. In addition, functional regression models can be [[Linear regression|linear]], partially linear, or [[Nonlinear regression|nonlinear]]. In particular, functional polynomial models, functional [[Semiparametric regression#Index models|single and multiple index models]] and functional [[Additive model|additive model]]s are three special cases of functional nonlinear models.
 
== Functional linear models (FLMs) ==
Functional linear models (FLMs) are an extension of [[Linear regression|traditional multivariate linear models]] (LMs). A linear model with scalar response <math>Y\in\mathbb{R}</math> and scalar covariates <math>X\in\mathbb{R}^p</math> can be written as
{{NumBlk|::|<math display="block">Y = \beta_0 + \langle X,\beta\rangle + \varepsilon,</math>|{{EquationRef|1}}}}
where <math>\langle\cdot,\cdot\rangle</math> denotes the [[Inner product space|inner product]] in [[Euclidean space]], <math>\beta_0\in\mathbb{R}</math> and <math>\beta\in\mathbb{R}^p</math> denote the regression coefficients, and <math>\varepsilon</math> is a random error with [[Expected value|mean]] zero and finite [[variance]]. FLMs can be divided into two types based on the responses.
 
=== Functional linear models with scalar responses ===
Functional linear models with scalar responses can be obtained by replacing the scalar covariates <math>X</math> and the coefficient vector <math>\beta</math> in model ({{EquationNote|1}}) by a centered functional covariate <math>X^c(\cdot) = X(\cdot) - \mathbb{E}(X(\cdot))</math> and a coefficient function <math>\beta = \beta(\cdot)</math> with [[Domain of a function|___domain]] <math>\mathcal{T}</math>, respectively, and replacing the inner product in Euclidean space by that in [[Hilbert space]] [[Lp space|<math>L^2</math>]],
{{NumBlk|::|<math display="block">Y = \beta_0 + \langle X^c, \beta\rangle +\varepsilon = \beta_0 + \int_\mathcal{T} X^c(t)\beta(t)\,dt + \varepsilon,</math>|{{EquationRef|2}}}}
where <math>\langle \cdot, \cdot \rangle</math> here denotes the inner product in <math>L^2</math>. One approach to estimating <math>\beta_0</math> and <math>\beta(\cdot)</math> is to expand the centered covariate <math>X^c(\cdot)</math> and the coefficient function <math>\beta(\cdot)</math> in the same [[Basis function|functional basis]], for example, [[B-spline]] basis or the eigenbasis used in the [[Karhunen&ndash;Loève theorem|Karhunen&ndash;Loève expansion]]. Suppose <math>\{\phi_k\}_{k=1}^\infty</math> is an [[orthonormal basis]] of <math>L^2</math>. Expanding <math>X^c</math> and <math>\beta</math> in this basis, <math>X^c(\cdot) = \sum_{k=1}^\infty x_k \phi_k(\cdot)</math>, <math>\beta(\cdot) = \sum_{k=1}^\infty \beta_k \phi_k(\cdot)</math>, model ({{EquationNote|2}}) becomes
<math display="block">Y = \beta_0 + \sum_{k=1}^\infty \beta_k x_k +\varepsilon.</math>
For implementation, regularization is needed and can be done through truncation, <math>L^2</math> penalization or <math>L^1</math> penalization.<ref name=morr:15>{{cite journal|doi=10.1146/annurev-statistics-010814-020413|title=Functional Regression|year=2015|last1=Morris|first1=Jeffrey S.|journal=[[Annual Review of Statistics and Its Application]]|volume=2|issue=1|pages=321–359|arxiv=1406.4068|bibcode=2015AnRSA...2..321M|s2cid=18637009}}</ref> In addition, a [[reproducing kernel Hilbert space]] (RKHS) approach can also be used to estimate <math>\beta_0</math> and <math>\beta(\cdot)</math> in model ({{EquationNote|2}}).<ref>{{Cite journal |last=Yuan |first=Ming |last2=Cai |first2=T. Tony |date=2010 |title=A reproducing kernel Hilbert space approach to functional linear regression |journal=The Annals of Statistics |volume=38 |issue=6 |pages=3412–3444 |doi=10.1214/09-AOS772}}</ref>
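A minimal numerical sketch of the truncation approach, written in Python with NumPy, is shown below; the Fourier basis, the function names, and the toy data are illustrative assumptions rather than the procedure of any particular reference.
<syntaxhighlight lang="python">
import numpy as np

def fourier_basis(t, K):
    """Orthonormal Fourier basis on [0, 1], evaluated on the grid t (illustrative choice)."""
    B = [np.ones_like(t)]
    for j in range(1, K):
        freq = (j + 1) // 2
        if j % 2 == 1:
            B.append(np.sqrt(2) * np.sin(2 * np.pi * freq * t))
        else:
            B.append(np.sqrt(2) * np.cos(2 * np.pi * freq * t))
    return np.column_stack(B)                       # shape (len(t), K)

def fit_flm_scalar_response(X, Y, t, K=5):
    """Truncated-basis least-squares estimate of (beta_0, beta) in model (2).
    X: (n, len(t)) densely observed covariate curves, Y: (n,) scalar responses."""
    dt = t[1] - t[0]
    Xc = X - X.mean(axis=0)                         # centred covariate X^c
    Phi = fourier_basis(t, K)
    scores = Xc @ Phi * dt                          # x_k = <X^c, phi_k> by numerical integration
    design = np.column_stack([np.ones(len(Y)), scores])
    coef, *_ = np.linalg.lstsq(design, Y, rcond=None)
    beta0_hat, beta_k_hat = coef[0], coef[1:]
    beta_hat = Phi @ beta_k_hat                     # estimated coefficient function on the grid
    return beta0_hat, beta_hat

# toy data: smooth random curves and responses generated from a known beta(t)
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 101)
Phi = fourier_basis(t, 5)
X = rng.normal(size=(200, 5)) @ Phi.T
beta_true = np.sin(2 * np.pi * t)
Y = (X - X.mean(axis=0)) @ beta_true * (t[1] - t[0]) + rng.normal(scale=0.1, size=200)
beta0_hat, beta_hat = fit_flm_scalar_response(X, Y, t, K=5)
</syntaxhighlight>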
 
Adding multiple functional and scalar covariates, model ({{EquationNote|2}}) can be extended to
{{NumBlk|::|<math display="block">Y = \sum_{k=1}^q Z_k\alpha_k + \sum_{j=1}^p \int_{\mathcal{T}_j} X_j^c(t) \beta_j(t) \,dt + \varepsilon,</math>|{{EquationRef|3}}}}
where <math>Z_1,\ldots,Z_q</math> are scalar covariates with <math>Z_1=1</math>, <math>\alpha_1,\ldots,\alpha_q</math> are regression coefficients for <math>Z_1,\ldots,Z_q</math>, respectively, <math>X^c_j</math> is a centered functional covariate given by <math>X_j^c(\cdot) = X_j(\cdot) - \mathbb{E}(X_j(\cdot))</math>, <math>\beta_j</math> is the regression coefficient function for <math>X_j^c(\cdot)</math>, and <math>\mathcal{T}_j</math> is the ___domain of <math>X_j</math> and <math>\beta_j</math>, for <math>j=1,\ldots,p</math>. However, due to the parametric component <math>\alpha</math>, the estimation methods for model ({{EquationNote|2}}) cannot be used in this case<ref name=wang:16>{{cite journal|doi=10.1146/annurev-statistics-041715-033624|title=Functional Data Analysis|year=2016|last1=Wang|first1=Jane-Ling|last2=Chiou|first2=Jeng-Min|last3=Müller|first3=Hans-Georg|journal=[[Annual Review of Statistics and Its Application]]|volume=3|issue=1|pages=257–295|bibcode=2016AnRSA...3..257W|url=https://zenodo.org/record/895750|doi-access=free}}</ref> and alternative estimation methods for model ({{EquationNote|3}}) are available.<ref>{{Cite journal |last=Kong |first=Dehan |last2=Xue |first2=Kaijie |last3=Yao |first3=Fang |last4=Zhang |first4=Hao H. |date= |title=Partially functional linear regression in high dimensions |url=https://academic.oup.com/biomet/article-lookup/doi/10.1093/biomet/asv062 |journal=Biometrika |language=en |volume=103 |issue=1 |pages=147–159 |doi=10.1093/biomet/asv062 |issn=0006-3444|url-access=subscription }}</ref><ref>{{Cite journal |last=Hu |first=Z. |date=2004-06-01 |title=Profile-kernel versus backfitting in the partially linear models for longitudinal/clustered data |url=https://academic.oup.com/biomet/article-lookup/doi/10.1093/biomet/91.2.251 |journal=Biometrika |language=en |volume=91 |issue=2 |pages=251–262 |doi=10.1093/biomet/91.2.251 |issn=0006-3444|url-access=subscription }}</ref>
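Under the same truncation idea, one simple option is to append the scalar covariates to the basis scores of each centred functional covariate and solve a single least-squares problem; the sketch below is illustrative only (it assumes the scores have already been computed and uses hypothetical names), not the estimators of the references above.
<syntaxhighlight lang="python">
import numpy as np

def fit_partially_functional_lm(Z, score_list, Y):
    """Joint least-squares fit of model (3) after truncating each functional covariate.
    Z: (n, q) scalar covariates with a leading column of ones,
    score_list: list of (n, K_j) arrays of basis scores of the centred X_j,
    Y: (n,) scalar responses."""
    design = np.column_stack([Z] + score_list)
    coef, *_ = np.linalg.lstsq(design, Y, rcond=None)
    q = Z.shape[1]
    alpha_hat = coef[:q]                       # estimates of alpha_1, ..., alpha_q
    beta_score_hats, pos = [], q
    for S in score_list:                       # basis coefficients of each beta_j
        beta_score_hats.append(coef[pos:pos + S.shape[1]])
        pos += S.shape[1]
    return alpha_hat, beta_score_hats
</syntaxhighlight>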
 
=== Functional linear models with functional responses ===
For a functional response <math>Y(\cdot)</math> with ___domain <math>\mathcal{T}</math> and a functional covariate <math>X(\cdot)</math> with ___domain <math>\mathcal{S}</math>, two FLMs regressing <math>Y(\cdot)</math> on <math>X(\cdot)</math> have been considered.<ref name=wang:16/><ref>{{Cite book |last1=Ramsay |first1=J. O. |last2=Silverman |first2=B. W. |author-link2=Bernard Silverman |year=2005 |title=Functional Data Analysis |edition=2nd |___location=New York |publisher=Springer |isbn=0-387-40080-X}}</ref> One of these two models is of the form
{{NumBlk|::|<math display="block">Y(t) = \beta_0(t) + \int_{\mathcal{S}} \beta(s,t) X^c(s)\,ds + \varepsilon(t),\ \text{for}\ t\in\mathcal{T},</math>|{{EquationRef|4}}}}
where <math>X^c(\cdot) = X(\cdot) - \mathbb{E}(X(\cdot))</math> is still the centered functional covariate, <math>\beta_0(\cdot)</math> and <math>\beta(\cdot,\cdot)</math> are coefficient functions, and <math>\varepsilon(\cdot)</math> is usually assumed to be a random process with mean zero and finite variance. In this case, at any given time <math>t\in\mathcal{T}</math>, the value of <math>Y</math>, i.e., <math>Y(t)</math>, depends on the entire trajectory of <math>X</math>. Model ({{EquationNote|4}}), for any given time <math>t</math>, is an extension of [[multivariate linear regression]] with the inner product in Euclidean space replaced by that in <math>L^2</math>. An estimating equation motivated by multivariate linear regression is
<math display="block">r_{XY} = R_{XX}\beta, \text{ for } \beta\in L^2(\mathcal{S}\times\mathcal{S}),</math>
where <math>r_{XY}(s,t) = \text{cov}(X(s),Y(t))</math>, <math>R_{XX}: L^2(\mathcal{S}\times\mathcal{T}) \rightarrow L^2(\mathcal{S}\times\mathcal{T})</math> is defined as <math>(R_{XX}\beta)(s,t) = \int_\mathcal{S} r_{XX}(s,w)\beta(w,t)\,dw</math> with <math>r_{XX}(s,w) = \text{cov}(X(s),X(w))</math> for <math>s,w\in\mathcal{S}</math>.<ref name=wang:16/> Regularization is needed and can be done through truncation, <math>L^2</math> penalization or <math>L^1</math> penalization.<ref name=morr:15/> Various estimation methods for model ({{EquationNote|4}}) are available.<ref>{{Cite journal |last=Ramsay |first=J. O. |last2=Dalzell |first2=C. J. |date=1991 |title=Some Tools for Functional Data Analysis |url=https://www.jstor.org/stable/2345586 |journal=Journal of the Royal Statistical Society. Series B (Methodological) |volume=53 |issue=3 |pages=539–572 |issn=0035-9246}}</ref><ref>{{Cite journal |last=Yao |first=Fang |last2=Müller |first2=Hans-Georg |last3=Wang |first3=Jane-Ling |date= |title=Functional linear regression analysis for longitudinal data |url=https://projecteuclid.org/journals/annals-of-statistics/volume-33/issue-6/Functional-linear-regression-analysis-for-longitudinal-data/10.1214/009053605000000660.full |journal=The Annals of Statistics |volume=33 |issue=6 |pages=2873–2903 |doi=10.1214/009053605000000660 |issn=0090-5364|arxiv=math/0603132 }}</ref><br />
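On a dense common grid, the estimating equation above becomes a matrix equation in the sample covariance surfaces, and truncating the spectrum of <math>R_{XX}</math> gives one simple regularised solution; the following Python sketch is illustrative only (the grids, names, and truncation level are assumptions, not a method from the cited references).
<syntaxhighlight lang="python">
import numpy as np

def fit_function_on_function(X, Y, s_grid, n_components=4):
    """Discretised, truncated solution of r_XY = R_XX beta for model (4).
    X: (n, len(s_grid)) covariate curves, Y: (n, m) response curves on a dense grid.
    Returns beta0_hat (length m) and beta_hat (len(s_grid) x m)."""
    n = X.shape[0]
    ds = s_grid[1] - s_grid[0]
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    r_XX = Xc.T @ Xc / n                       # sample covariance surface of X
    r_XY = Xc.T @ Yc / n                       # sample cross-covariance surface
    # regularise by inverting R_XX only on its leading eigen-directions (truncation)
    w, V = np.linalg.eigh(r_XX)
    order = np.argsort(w)[::-1][:n_components]
    w, V = w[order], V[:, order]
    beta_hat = V @ ((V.T @ r_XY) / (w[:, None] * ds))
    beta0_hat = Y.mean(axis=0)                 # since E[Y(t)] = beta_0(t) in model (4)
    return beta0_hat, beta_hat
</syntaxhighlight>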
When <math>X</math> and <math>Y</math> are concurrently observed, i.e., <math>\mathcal{S}=\mathcal{T}</math>,<ref>{{Cite journal |last=Grenander |first=Ulf |date= |title=Stochastic processes and statistical inference |url=https://projecteuclid.org/journals/arkiv-for-matematik/volume-1/issue-3/Stochastic-processes-and-statistical-inference/10.1007/BF02590638.full |journal=Arkiv för Matematik |volume=1 |issue=3 |pages=195–277 |doi=10.1007/BF02590638 |issn=0004-2080}}</ref> it is reasonable to consider a historical functional linear model, where the current value of <math>Y</math> only depends on the history of <math>X</math>, i.e., <math>\beta(s,t)=0</math> for <math>s>t</math> in model ({{EquationNote|4}}).<ref name=wang:16/><ref>{{Cite journal |last=Malfait |first=Nicole |last2=Ramsay |first2=James O. |date=2003 |title=The historical functional linear model |url=https://onlinelibrary.wiley.com/doi/10.2307/3316063 |journal=Canadian Journal of Statistics |language=en |volume=31 |issue=2 |pages=115–128 |doi=10.2307/3316063 |issn=1708-945X|url-access=subscription }}</ref> A simpler version of the historical functional linear model is the functional concurrent model (see below).<br />
Adding multiple functional covariates, model ({{EquationNote|4}}) can be extended to
{{NumBlk|::|<math display="block">Y(t) = \beta_0(t) + \sum_{j=1}^p\int_{\mathcal{S}_j} \beta_j(s,t) X^c_j(s)\,ds + \varepsilon(t),\ \text{for}\ t\in\mathcal{T},</math>|{{EquationRef|5}}}}
where for <math>j=1,\ldots,p</math>, <math>X_j^c(\cdot)=X_j(\cdot) - \mathbb{E}(X_j(\cdot))</math> is a centered functional covariate with ___domain <math>\mathcal{S}_j</math>, and <math>\beta_j(\cdot,\cdot)</math> is the corresponding coefficient function with ___domain <math>\mathcal{S}_j\times\mathcal{T}</math>.<ref name=wang:16/> In particular, taking <math>X_j(\cdot)</math> as a constant function yields a special case of model ({{EquationNote|5}}):
<math display="block">Y(t) = \sum_{j=1}^p X_j \beta_j(t) + \varepsilon(t),\ \text{for}\ t\in\mathcal{T},</math>
which is an FLM with functional responses and scalar covariates.
 
==== Functional concurrent models ====
Assuming that <math>\mathcal{S} = \mathcal{T}</math>, another model, known as the functional concurrent model, sometimes also referred to as the varying-coefficient model, is of the form
{{NumBlk|::|<math display="block">Y(t) = \alpha_0(t) + \alpha(t)X(t)+\varepsilon(t),\ \text{for}\ t\in\mathcal{T},</math>|{{EquationRef|6}}}}
where <math>\alpha_0</math> and <math>\alpha</math> are coefficient functions. Note that model ({{EquationNote|6}}) assumes the value of <math>Y</math> at time <math>t</math>, i.e., <math>Y(t)</math>, only depends on that of <math>X</math> at the same time, i.e., <math>X(t)</math>. Various estimation methods can be applied to model ({{EquationNote|6}}).<ref>{{Cite journal |last=Fan |first=Jianqing |last2=Zhang |first2=Wenyang |date= |title=Statistical estimation in varying coefficient models |url=https://projecteuclid.org/journals/annals-of-statistics/volume-27/issue-5/Statistical-estimation-in-varying-coefficient-models/10.1214/aos/1017939139.full |journal=The Annals of Statistics |volume=27 |issue=5 |pages=1491–1518 |doi=10.1214/aos/1017939139 |issn=0090-5364}}</ref><ref>{{Cite journal |last=Huang |first=Jianhua Z. |last2=Wu |first2=Colin O. |last3=Zhou |first3=Lan |date=2004 |title=Polynomial Spline Estimation and Inference for Varying Coefficient Models with Longitudinal Data |url=https://www.jstor.org/stable/24307415 |journal=Statistica Sinica |volume=14 |issue=3 |pages=763–788 |issn=1017-0405}}</ref><ref>{{Cite journal |last=Şentürk |first=Damla |last2=Müller |first2=Hans-Georg |date=2010-09-01 |title=Functional Varying Coefficient Models for Longitudinal Data |url=https://www.tandfonline.com/doi/abs/10.1198/jasa.2010.tm09228 |journal=Journal of the American Statistical Association |doi=10.1198/jasa.2010.tm09228 |issn=0162-1459|url-access=subscription }}</ref><br />
Adding multiple functional covariates, model ({{EquationNote|6}}) can also be extended to
<math display="block">Y(t) = \alpha_0(t) + \sum_{j=1}^p\alpha_j(t)X_j(t)+\varepsilon(t),\ \text{for}\ t\in\mathcal{T},</math>
where <math>X_1,\ldots,X_p</math> are multiple functional covariates with ___domain <math>\mathcal{T}</math> and <math>\alpha_0,\alpha_1,\ldots,\alpha_p</math> are the coefficient functions with the same ___domain.<ref name=wang:16/>
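Since model ({{EquationNote|6}}) links <math>Y(t)</math> to <math>X(t)</math> at the same time point only, a basic estimator fits a separate simple regression at each grid point, after which the fitted coefficient curves may be smoothed across <math>t</math>; the sketch below implements only the pointwise step and uses illustrative names (it is not the method of any particular cited reference).
<syntaxhighlight lang="python">
import numpy as np

def fit_concurrent_model(X, Y):
    """Pointwise least-squares fit of the functional concurrent model (6).
    X, Y: (n, m) curves observed on a common grid of m time points.
    Returns alpha0_hat and alpha_hat, each of length m."""
    n, m = X.shape
    alpha0_hat = np.empty(m)
    alpha_hat = np.empty(m)
    for j in range(m):                          # separate simple regression at each time point
        design = np.column_stack([np.ones(n), X[:, j]])
        coef, *_ = np.linalg.lstsq(design, Y[:, j], rcond=None)
        alpha0_hat[j], alpha_hat[j] = coef
    return alpha0_hat, alpha_hat
</syntaxhighlight>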
 
== Functional nonlinear models ==
=== Functional polynomial models ===
Functional polynomial models are an extension of the FLMs with scalar responses, analogous to extending linear regression to [[polynomial regression]]. For a scalar response <math>Y</math> and a functional covariate <math>X(\cdot)</math> with ___domain <math>\mathcal{T}</math>, the simplest example of functional polynomial models is functional quadratic regression<ref name="yao:10">{{Cite journal |last=Yao |first=F. |last2=Muller |first2=H.-G. |date=2010-03-01 |title=Functional quadratic regression |url=https://academic.oup.com/biomet/article-lookup/doi/10.1093/biomet/asp069 |journal=Biometrika |language=en |volume=97 |issue=1 |pages=49–64 |doi=10.1093/biomet/asp069 |issn=0006-3444|url-access=subscription }}</ref>
<math display="block">Y = \alpha + \int_\mathcal{T}\beta(t)X^c(t)\,dt + \int_\mathcal{T} \int_\mathcal{T} \gamma(s,t) X^c(s)X^c(t) \,ds\,dt + \varepsilon,</math>
where <math>X^c(\cdot) = X(\cdot) - \mathbb{E}(X(\cdot))</math> is the centered functional covariate, <math>\alpha</math> is a scalar coefficient, <math>\beta(\cdot)</math> and <math>\gamma(\cdot,\cdot)</math> are coefficient functions with domains <math>\mathcal{T}</math> and <math>\mathcal{T}\times\mathcal{T}</math>, respectively, and <math>\varepsilon</math> is a random error with mean zero and finite variance. By analogy to FLMs with scalar responses, estimation of functional polynomial models can be obtained through expanding both the centered covariate <math>X^c</math> and the coefficient functions <math>\beta</math> and <math>\gamma</math> in an orthonormal basis.<ref name=yao:10/>
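After truncation, the double integral reduces to a quadratic form in the basis scores, so the model can be fitted by ordinary least squares on a design containing the scores and their pairwise products; the sketch below illustrates this reduction (names are hypothetical, and it is not the estimator of the cited paper).
<syntaxhighlight lang="python">
import numpy as np
from itertools import combinations_with_replacement

def quadratic_design(scores):
    """Design matrix for functional quadratic regression after truncation.
    scores: (n, K) basis scores x_k of the centred covariate X^c.
    Columns: intercept, linear scores x_k, and products x_k * x_l for k <= l."""
    n, K = scores.shape
    cols = [np.ones(n)]
    cols += [scores[:, k] for k in range(K)]
    cols += [scores[:, k] * scores[:, l]
             for k, l in combinations_with_replacement(range(K), 2)]
    return np.column_stack(cols)

def fit_functional_quadratic(scores, Y):
    """Least-squares fit of the quadratic model in the truncated scores."""
    D = quadratic_design(scores)
    coef, *_ = np.linalg.lstsq(D, Y, rcond=None)
    return coef            # coefficients of the intercept, beta_k and gamma_{kl} terms
</syntaxhighlight>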
 
=== Functional single and multiple index models ===
A functional multiple index model is given by
<math display="block">Y = g\left(\int_{\mathcal{T}} X^c(t) \beta_1(t)\,dt, \ldots, \int_{\mathcal{T}} X^c(t) \beta_p(t)\,dt \right) + \varepsilon.</math>
Taking <math>p=1</math> yields a functional single index model. However, for <math>p>1</math>, this model is problematic due to the [[curse of dimensionality]]. With <math>p>1</math> and relatively small sample sizes, the estimator given by this model often has large variance.<ref name="chen:11">{{Cite journal |last=Chen |first=Dong |last2=Hall |first2=Peter |last3=Müller |first3=Hans-Georg |date= |title=Single and multiple index functional regression models with nonparametric link |url=https://projecteuclid.org/journals/annals-of-statistics/volume-39/issue-3/Single-and-multiple-index-functional-regression-models-with-nonparametric-link/10.1214/11-AOS882.full |journal=The Annals of Statistics |volume=39 |issue=3 |pages=1720–1747 |doi=10.1214/11-AOS882 |issn=0090-5364|arxiv=1211.5018 }}</ref> An alternative <math>p</math>-component functional multiple index model can be expressed as
<math display="block">Y = g_1\left(\int_{\mathcal{T}} X^c(t) \beta_1(t)\,dt\right)+ \cdots+ g_p\left(\int_{\mathcal{T}} X^c(t) \beta_p(t)\,dt \right) + \varepsilon.</math>
Estimation methods for functional single and multiple index models are available.<ref name=chen:11/><ref>{{Cite journal |last=Jiang |first=Ci-Ren |last2=Wang |first2=Jane-Ling |date= |title=Functional single index models for longitudinal data |url=https://projecteuclid.org/journals/annals-of-statistics/volume-39/issue-1/Functional-single-index-models-for-longitudinal-data/10.1214/10-AOS845.full |journal=The Annals of Statistics |volume=39 |issue=1 |pages=362–388 |doi=10.1214/10-AOS845 |issn=0090-5364|arxiv=1103.1726 }}</ref>
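One crude way to fit a functional single index model after truncation is to profile out the link <math>g</math> with a simple smoother and minimise the residual sum of squares over the basis coefficients of <math>\beta_1</math>; the Python sketch below uses a polynomial stand-in for <math>g</math> and a generic optimiser, purely as an illustration (it is not the procedure of the cited papers).
<syntaxhighlight lang="python">
import numpy as np
from scipy.optimize import minimize

def fit_functional_single_index(scores, Y, link_degree=3):
    """Crude profile estimator for a functional single index model (p = 1).
    scores: (n, K) truncated basis scores of X^c, so that the index
    int X^c(t) beta_1(t) dt is approximated by scores @ b.
    The unknown link g is profiled out by a low-degree polynomial fit."""
    K = scores.shape[1]

    def profile_rss(b):
        b = b / np.linalg.norm(b)                      # normalise for identifiability
        index = scores @ b
        link_coefs = np.polyfit(index, Y, link_degree) # polynomial stand-in for g
        return np.sum((Y - np.polyval(link_coefs, index)) ** 2)

    b0 = np.ones(K) / np.sqrt(K)
    res = minimize(profile_rss, b0, method="Nelder-Mead")
    b_hat = res.x / np.linalg.norm(res.x)              # basis coefficients of beta_1(t)
    return b_hat
</syntaxhighlight>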
 
=== Functional additive models (FAMs) ===
Given an expansion of a functional covariate <math>X</math> with ___domain <math>\mathcal{T}</math> in an orthonormal basis <math>\{\phi_k\}_{k=1}^\infty</math>: <math>X(t) = \sum_{k=1}^\infty x_k \phi_k(t)</math>, a functional linear model with scalar responses shown in model ({{EquationNote|2}}) can be written as
<math display="block">\mathbb{E}(Y|X)=\mathbb{E}(Y) + \sum_{k=1}^\infty \beta_k x_k.</math>
One form of FAMs is obtained by replacing the linear function of <math>x_k</math>, i.e., <math>\beta_k x_k</math>, by a general smooth function <math>f_k</math>,
<math display="block">\mathbb{E}(Y|X)=\mathbb{E}(Y) + \sum_{k=1}^\infty f_k(x_k),</math>
where <math>f_k</math> satisfies <math>\mathbb{E}(f_k(x_k))=0</math> for <math>k\in\mathbb{N}</math>.<ref name=wang:16/><ref>{{Cite journal |last=Müller |first=Hans-Georg |last2=Yao |first2=Fang |date=2008-12-01 |title=Functional Additive Models |url=https://www.tandfonline.com/doi/abs/10.1198/016214508000000751 |journal=Journal of the American Statistical Association |doi=10.1198/016214508000000751 |issn=0162-1459|url-access=subscription }}</ref> Another form of FAMs consists of a sequence of time-additive models:
<math display="block">\mathbb{E}(Y|X(t_1),\ldots,X(t_p))=\sum_{j=1}^p f_j(X(t_j)),</math>
where <math>\{t_1,\ldots,t_p\}</math> is a dense grid on <math>\mathcal{T}</math> with increasing size <math>p\in\mathbb{N}</math>, and <math>f_j(x) = g(t_j,x)</math> with <math>g</math> a smooth function, for <math>j=1,\ldots,p</math>.<ref name=wang:16/><ref>{{Cite journal |last=Fan |first=Yingying |last2=James |first2=Gareth M. |last3=Radchenko |first3=Peter |date= |title=Functional additive regression |url=https://projecteuclid.org/journals/annals-of-statistics/volume-43/issue-5/Functional-additive-regression/10.1214/15-AOS1346.full |journal=The Annals of Statistics |volume=43 |issue=5 |pages=2296–2325 |doi=10.1214/15-AOS1346 |issn=0090-5364|arxiv=1510.04064 }}</ref>
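A simple illustrative fit of the first form of FAMs truncates the expansion at <math>K</math> components and represents each <math>f_k</math> by a centred low-degree polynomial in the score <math>x_k</math>, which again reduces to least squares; the names and the polynomial choice below are assumptions made only for this sketch.
<syntaxhighlight lang="python">
import numpy as np

def fit_fam(scores, Y, degree=3):
    """Least-squares fit of a truncated functional additive model.
    Each component f_k is represented by an (assumed) low-degree polynomial in the
    k-th score, centred so that the fitted f_k(x_k) average to zero."""
    n, K = scores.shape
    blocks = []
    for k in range(K):
        powers = np.column_stack([scores[:, k] ** d for d in range(1, degree + 1)])
        blocks.append(powers - powers.mean(axis=0))      # centre each additive component
    design = np.column_stack([np.ones(n)] + blocks)
    coef, *_ = np.linalg.lstsq(design, Y, rcond=None)
    intercept = coef[0]                                  # estimate of E(Y)
    comp_coefs = np.split(coef[1:], K)                   # polynomial coefficients of each f_k
    return intercept, comp_coefs
</syntaxhighlight>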
 
== Extensions ==
A direct extension of FLMs with scalar responses shown in model ({{EquationNote|2}}) is to add a link function to create a [[generalized functional linear model]] (GFLM) by analogy to extending [[linear regression]] to [[Generalized linear model|generalized linear regression]] (GLM), of which the three components are:
# Linear predictor <math>\eta = \beta_0 + \int_{\mathcal{T}} X^c(t)\beta(t)\,dt</math>;
# [[Variance function]] <math>\text{Var}(Y|X) = V(\mu)</math>, where <math>\mu = \mathbb{E}(Y|X)</math> is the [[Conditional expectation|conditional mean]];
# Link function <math>g</math> connecting the conditional mean and the linear predictor through <math>\mu=g(\eta)</math>.
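For a binary response with a logistic mean function, a GFLM can be fitted on the truncated basis scores by iteratively reweighted least squares, as in an ordinary GLM; the sketch below is an illustrative NumPy implementation under those assumptions (names are hypothetical, and it is not drawn from a cited reference).
<syntaxhighlight lang="python">
import numpy as np

def fit_gflm_logistic(scores, Y, n_iter=25):
    """IRLS fit of a generalized functional linear model with a logistic mean function,
    using truncated basis scores of the centred functional covariate.
    scores: (n, K), Y: (n,) binary responses.  Returns (beta0_hat, beta_k_hat)."""
    n, K = scores.shape
    D = np.column_stack([np.ones(n), scores])     # linear predictor eta = D @ coef
    coef = np.zeros(K + 1)
    for _ in range(n_iter):                       # iteratively reweighted least squares
        eta = D @ coef
        mu = 1.0 / (1.0 + np.exp(-eta))           # conditional mean mu = g(eta)
        W = np.clip(mu * (1.0 - mu), 1e-8, None)  # variance function V(mu) for the Bernoulli case
        z = eta + (Y - mu) / W                    # working response
        WD = D * W[:, None]
        coef = np.linalg.solve(D.T @ WD, D.T @ (W * z))
    return coef[0], coef[1:]
</syntaxhighlight>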
 
== See also ==
* [[Functional data analysis]]
* [[Functional principal component analysis]]
* [[Karhunen&ndash;Loève theorem]]
* [[Generalized functional linear model]]
* [[Stochastic processes]]
 
== References ==
<references/>
 
[[Category:Regression analysis]]