{{Short description|Type of regression analysis}}
'''Functional regression''' is a version of [[regression analysis]] when [[Dependent and independent variables|responses]] or [[Dependent and independent variables|covariates]] include [[Functional data analysis|functional data]]. Functional regression models can be classified into four types depending on whether the responses or covariates are functional or scalar: (i) scalar responses with functional covariates, (ii) functional responses with scalar covariates, (iii) functional responses with functional covariates, and (iv) scalar or functional responses with functional and scalar covariates. In addition, functional regression models can be [[Linear regression|linear]], partially linear, or [[Nonlinear regression|nonlinear]]. In particular, functional polynomial models, functional [[Semiparametric regression#Index models|single and multiple index models]] and functional [[Additive model|additive model]]s are three special cases of functional nonlinear models.
 
__TOC__
 
== Functional linear models (FLMs) ==
Functional linear models (FLMs) are an extension of [[Linear regression|traditional multivariate linear models]] (LMs). A linear model with scalar response <math>Y\in\mathbb{R}</math> and scalar covariates <math>\mathbf{X}\in\mathbb{R}^p</math> can be written as
{{NumBlk|::|<math display="block">Y = \beta_0 + \langle\mathbf{X},\beta\rangle + \varepsilon,</math>|{{EquationRef|1}}}}
where <math>\langle\cdot,\cdot\rangle</math> denotes the [[Inner product space|inner product]] in [[Euclidean space]], <math>\beta_0\in\mathbb{R}</math> and <math>\beta\in\mathbb{R}^p</math> denote the regression coefficients, and <math>\varepsilon</math> is a random error with [[Expected value|mean]] zero and finite [[Variance|variance]]. FLMs can be divided into two types based on the responses and covariates.
 
=== Functional linear models with scalar responses ===
Functional linear models with scalar responses can be obtained by replacing the scalar covariates <math>\mathbf{X}</math> and the coefficient vector <math>\beta</math> in the traditional multivariate linear model ({{EquationNote|1}}) by a centered functional covariate <math>X^c(\cdot) = X(\cdot) - \mathbb{E}(X(\cdot))</math> and a coefficient function <math>\beta = \beta(\cdot)</math> with [[Domain of a function|___domain]] <math>\mathcal{T}</math>, respectively, and replacing the inner product in Euclidean space by that in [[Hilbert space]] [[Lp space|<math>L^2</math>]],
{{NumBlk|::|<math display="block">Y = \beta_0 + \langle X^c, \beta\rangle + \varepsilon = \beta_0 + \int_\mathcal{T} X^c(t)\beta(t)\,dt + \varepsilon,</math>|{{EquationRef|2}}}}
where <math>\langle \cdot, \cdot \rangle</math> here denotes the inner product in <math>L^2</math>. One approach to estimating <math>\beta_0</math> and <math>\beta(\cdot)</math> is to expand the centered covariate <math>X^c(\cdot)</math> and the coefficient function <math>\beta(\cdot)</math> in the same [[Basis function|functional basis]], for example, a [[B-spline]] basis or the eigenbasis used in the [[Karhunen–Loève theorem|Karhunen–Loève expansion]]. Suppose <math>\{\phi_k\}_{k=1}^\infty</math> is an [[orthonormal basis]] of <math>L^2</math>. Expanding <math>X^c</math> and <math>\beta</math> in this basis, <math>X^c(\cdot) = \sum_{k=1}^\infty x_k \phi_k(\cdot)</math>, <math>\beta(\cdot) = \sum_{k=1}^\infty \beta_k \phi_k(\cdot)</math>, model ({{EquationNote|2}}) becomes
<math display="block">Y = \beta_0 + \sum_{k=1}^\infty \beta_k x_k + \varepsilon.</math>
For implementation, regularization is needed and can be done through truncation, <math>L^2</math> penalization or <math>L^1</math> penalization.<ref name=morr:15>{{cite journal|doi=10.1146/annurev-statistics-010814-020413|title=Functional Regression|year=2015|last1=Morris|first1=Jeffrey S.|journal=[[Annual Review of Statistics and Its Application]]|volume=2|issue=1|pages=321–359|arxiv=1406.4068|bibcode=2015AnRSA...2..321M|s2cid=18637009}}</ref> In addition, a [[reproducing kernel Hilbert space]] (RKHS) approach can also be used to estimate <math>\beta_0</math> and <math>\beta(\cdot)</math> in model ({{EquationNote|2}})<ref>Yuan and Cai (2010). "A reproducing kernel Hilbert space approach to functional linear regression". ''The Annals of Statistics''. '''38''' (6):3412&ndash;3444. [[Digital object identifier|doi]]:[http://doi.org/10.1214/09-AOS772 10.1214/09-AOS772].</ref>
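As a concrete illustration of the truncation approach, the following sketch regresses <math>Y</math> on the first <math>K</math> basis scores of the centered covariate (the simulated data, the cosine basis, and the truncation level are assumptions made for illustration only, not part of any cited method):
<syntaxhighlight lang="python">
import numpy as np

# Illustrative simulation: n curves observed on a dense grid of the ___domain T = [0, 1].
rng = np.random.default_rng(0)
n, m, K = 200, 101, 5                      # sample size, grid size, truncation level (assumed)
t = np.linspace(0, 1, m)

# Cosine orthonormal basis on [0, 1] (an assumption; any orthonormal basis can be used).
phi = np.array([np.sqrt(2) * np.cos((k + 1) * np.pi * t) for k in range(K)])   # K x m

# Generate curves X_i and scalar responses Y_i from a known coefficient function beta(t).
scores_true = rng.normal(0, 1 / (1 + np.arange(K)), size=(n, K))
X = scores_true @ phi                      # n x m matrix of discretized curves
beta_true = 2 * phi[0] - phi[1]
Y = 1.0 + np.trapz(X * beta_true, t, axis=1) + rng.normal(0, 0.1, n)

# Estimation: center X, project onto the truncated basis (numerical integration), then OLS.
Xc = X - X.mean(axis=0)
x_scores = np.trapz(Xc[:, None, :] * phi[None, :, :], t, axis=2)   # n x K estimated scores
design = np.column_stack([np.ones(n), x_scores])
coef, *_ = np.linalg.lstsq(design, Y, rcond=None)

beta0_hat = coef[0]                        # estimate of beta_0
beta_hat = coef[1:] @ phi                  # estimate of beta(t) on the grid
</syntaxhighlight>
Choosing <math>K</math> (for example by cross-validation) plays the role of the regularization step described above.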
 
Adding multiple functional and scalar covariates, model ({{EquationNote|2}}) can be extended to
{{NumBlk|::|<math display="block">Y = \sum_{k=1}^q Z_k\alpha_k + \sum_{j=1}^p \int_{\mathcal{T}_j} X_j^c(t) \beta_j(t) \,dt + \varepsilon,</math>|{{EquationRef|3}}}}
where <math>Z_1,\ldots,Z_q</math> are scalar covariates with <math>Z_1=1</math>, <math>\alpha_1,\ldots,\alpha_q</math> are regression coefficients for <math>Z_1,\ldots,Z_q</math>, respectively, <math>X^c_j</math> is a centered functional covariate given by <math>X_j^c(\cdot) = X_j(\cdot) - \mathbb{E}(X_j(\cdot))</math>, <math>\beta_j</math> is the regression coefficient function for <math>X_j^c(\cdot)</math>, and <math>\mathcal{T}_j</math> is the ___domain of <math>X_j</math> and <math>\beta_j</math>, for <math>j=1,\ldots,p</math>. However, due to the parametric components <math>\alpha_1,\ldots,\alpha_q</math>, the estimation methods for model ({{EquationNote|2}}) cannot be used directly in this case<ref name=wang:16>{{cite journal|doi=10.1146/annurev-statistics-041715-033624|title=Functional Data Analysis|year=2016|last1=Wang|first1=Jane-Ling|last2=Chiou|first2=Jeng-Min|last3=Müller|first3=Hans-Georg|journal=[[Annual Review of Statistics and Its Application]]|volume=3|issue=1|pages=257–295|bibcode=2016AnRSA...3..257W|url=https://zenodo.org/record/895750|doi-access=free}}</ref> and alternative estimation methods for model ({{EquationNote|3}}) are available.<ref>{{Cite journal |last=Kong |first=Dehan |last2=Xue |first2=Kaijie |last3=Yao |first3=Fang |last4=Zhang |first4=Hao H. |date= |title=Partially functional linear regression in high dimensions |url=https://academic.oup.com/biomet/article-lookup/doi/10.1093/biomet/asv062 |journal=Biometrika |language=en |volume=103 |issue=1 |pages=147–159 |doi=10.1093/biomet/asv062 |issn=0006-3444|url-access=subscription }}</ref><ref>{{Cite journal |last=Hu |first=Z. |date=2004-06-01 |title=Profile-kernel versus backfitting in the partially linear models for longitudinal/clustered data |url=https://academic.oup.com/biomet/article-lookup/doi/10.1093/biomet/91.2.251 |journal=Biometrika |language=en |volume=91 |issue=2 |pages=251–262 |doi=10.1093/biomet/91.2.251 |issn=0006-3444|url-access=subscription }}</ref>
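A simple way to fit model ({{EquationNote|3}}) after truncation is to replace each centered functional covariate by its first <math>K</math> basis scores and run a single least-squares fit on the scalar covariates and the scores together. The sketch below illustrates this idea only; the function name, the common grid, and the shared basis are assumptions, and the cited references describe more refined estimators:
<syntaxhighlight lang="python">
import numpy as np

# Assumed setup: Z is an n x q matrix of scalar covariates (first column all ones),
# X_list holds p functional covariates, each an n x m matrix observed on a common grid t_grid,
# and phi is a K x m orthonormal basis on the shared ___domain (all names are illustrative).
def fit_partially_functional_lm(Y, Z, X_list, phi, t_grid):
    """Least-squares fit of model (3) after truncating each beta_j to K basis functions."""
    score_blocks = []
    for X in X_list:
        Xc = X - X.mean(axis=0)                                             # center the covariate
        scores = np.trapz(Xc[:, None, :] * phi[None, :, :], t_grid, axis=2) # n x K scores
        score_blocks.append(scores)
    design = np.column_stack([Z] + score_blocks)
    coef, *_ = np.linalg.lstsq(design, Y, rcond=None)
    q, K = Z.shape[1], phi.shape[0]
    alpha_hat = coef[:q]                                                    # scalar coefficients
    beta_hats = [coef[q + j * K : q + (j + 1) * K] @ phi for j in range(len(X_list))]
    return alpha_hat, beta_hats
</syntaxhighlight>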
 
=== Functional linear models with functional responses ===
For a functional response <math>Y(\cdot)</math> with ___domain <math>\mathcal{T}</math> and a functional covariate <math>X(\cdot)</math> with ___domain <math>\mathcal{S}</math>, two FLMs regressing <math>Y(\cdot)</math> on <math>X(\cdot)</math> have been considered.<ref name=wang:16/><ref>Ramsay and [[Bernard Silverman|Silverman]] (2005). ''Functional data analysis'', 2nd ed., New York: Springer, {{ISBN|0-387-40080-X}}.</ref> One of these two models is of the form
{{NumBlk|::|<math display="block">Y(t) = \beta_0(t) + \int_{\mathcal{S}} \beta(s,t) X^c(s)\,ds + \varepsilon(t),\ \text{for}\ t\in\mathcal{T},</math>|{{EquationRef|4}}}}
where <math>X^c(\cdot) = X(\cdot) - \mathbb{E}(X(\cdot))</math> is still the centered functional covariate, <math>\beta_0(\cdot)</math> and <math>\beta(\cdot,\cdot)</math> are coefficient functions, and <math>\varepsilon(\cdot)</math> is usually assumed to be a random process with mean zero and finite variance. In this case, at any given time <math>t\in\mathcal{T}</math>, the value of <math>Y</math>, i.e., <math>Y(t)</math>, depends on the entire trajectory of <math>X</math>. Model ({{EquationNote|4}}), for any given time <math>t</math>, is an extension of [[multivariate linear regression]] with the inner product in Euclidean space replaced by that in <math>L^2</math>. An estimating equation motivated by multivariate linear regression is
<math display="block">r_{XY} = R_{XX}\beta, \text{ for } \beta\in L^2(\mathcal{S}\times\mathcal{T}),</math>
where <math>r_{XY}(s,t) = \text{cov}(X(s),Y(t))</math>, <math>R_{XX}: L^2(\mathcal{S}\times\mathcal{T}) \rightarrow L^2(\mathcal{S}\times\mathcal{T})</math> is defined as <math>(R_{XX}\beta)(s,t) = \int_\mathcal{S} r_{XX}(s,w)\beta(w,t)\,dw</math> with <math>r_{XX}(s,w) = \text{cov}(X(s),X(w))</math> for <math>s,w\in\mathcal{S}</math>.<ref name=wang:16/> Regularization is needed and can be done through truncation, <math>L^2</math> penalization or <math>L^1</math> penalization.<ref name=morr:15/> Various estimation methods for model ({{EquationNote|4}}) are available.<ref>{{Cite journal |last=Ramsay |first=J. O. |last2=Dalzell |first2=C. J. |date=1991 |title=Some Tools for Functional Data Analysis |url=https://www.jstor.org/stable/2345586 |journal=Journal of the Royal Statistical Society. Series B (Methodological) |volume=53 |issue=3 |pages=539–572 |issn=0035-9246}}</ref><ref>{{Cite journal |last=Yao |first=Fang |last2=Müller |first2=Hans-Georg |last3=Wang |first3=Jane-Ling |date= |title=Functional linear regression analysis for longitudinal data |url=https://projecteuclid.org/journals/annals-of-statistics/volume-33/issue-6/Functional-linear-regression-analysis-for-longitudinal-data/10.1214/009053605000000660.full |journal=The Annals of Statistics |volume=33 |issue=6 |pages=2873–2903 |doi=10.1214/009053605000000660 |issn=0090-5364|arxiv=math/0603132 }}</ref><br />
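The following sketch illustrates a truncation-based estimate of model ({{EquationNote|4}}): the covariate and response curves are projected onto finite bases on <math>\mathcal{S}</math> and <math>\mathcal{T}</math>, and the basis coefficients of <math>\beta(\cdot,\cdot)</math> are obtained by least squares (an illustrative sketch with assumed inputs; regularization here amounts to choosing small numbers of basis functions):
<syntaxhighlight lang="python">
import numpy as np

# Assumed setup: X (n x mS) on grid s_grid over S, Y (n x mT) on grid t_grid over T,
# phi (K x mS) an orthonormal basis on S, psi (L x mT) an orthonormal basis on T.
def fit_function_on_function(X, Y, phi, psi, s_grid, t_grid):
    """Truncated-basis estimate of beta_0(t) and beta(s, t) in model (4)."""
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    x = np.trapz(Xc[:, None, :] * phi[None, :, :], s_grid, axis=2)   # n x K covariate scores
    y = np.trapz(Yc[:, None, :] * psi[None, :, :], t_grid, axis=2)   # n x L response scores
    # Least squares of each response score on the covariate scores;
    # B[k, l] approximates the coefficient of phi_k(s) psi_l(t) in beta(s, t).
    B, *_ = np.linalg.lstsq(x, y, rcond=None)                        # K x L
    beta_surface = phi.T @ B @ psi                                    # mS x mT values of beta(s, t)
    beta0_hat = Y.mean(axis=0)                                        # since X^c has mean zero
    return beta0_hat, beta_surface
</syntaxhighlight>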
When <math>X</math> and <math>Y</math> are concurrently observed, i.e., <math>\mathcal{S}=\mathcal{T}</math>,<ref>{{Cite journal |last=Grenander |first=Ulf |date= |title=Stochastic processes and statistical inference |url=https://projecteuclid.org/journals/arkiv-for-matematik/volume-1/issue-3/Stochastic-processes-and-statistical-inference/10.1007/BF02590638.full |journal=Arkiv för Matematik |volume=1 |issue=3 |pages=195–277 |doi=10.1007/BF02590638 |issn=0004-2080}}</ref> it is reasonable to consider a historical functional linear model, where the current value of <math>Y</math> only depends on the history of <math>X</math>, i.e., <math>\beta(s,t)=0</math> for <math>s>t</math> in model ({{EquationNote|4}}).<ref name=wang:16/><ref>{{Cite journal |last=Malfait |first=Nicole |last2=Ramsay |first2=James O. |date=2003 |title=The historical functional linear model |url=https://onlinelibrary.wiley.com/doi/10.2307/3316063 |journal=Canadian Journal of Statistics |language=en |volume=31 |issue=2 |pages=115–128 |doi=10.2307/3316063 |issn=1708-945X|url-access=subscription }}</ref> A simpler version of the historical functional linear model is the functional concurrent model (see below).<br />
Adding multiple functional covariates, model ({{EquationNote|4}}) can be extended to
{{NumBlk|::|<math display="block">Y(t) = \beta_0(t) + \sum_{j=1}^p\int_{\mathcal{S}_j} \beta_j(s,t) X^c_j(s)\,ds + \varepsilon(t),\ \text{for}\ t\in\mathcal{T},</math>|{{EquationRef|5}}}}
where for <math>j=1,\ldots,p</math>, <math>X_j^c(\cdot)=X_j(\cdot) - \mathbb{E}(X_j(\cdot))</math> is a centered functional covariate with ___domain <math>\mathcal{S}_j</math>, and <math>\beta_j(\cdot,\cdot)</math> is the corresponding coefficient function with ___domain <math>\mathcal{S}_j\times\mathcal{T}</math>.<ref name=wang:16/> In particular, taking <math>X_j(\cdot)</math> as a constant function yields a special case of model ({{EquationNote|5}})
<math display="block">Y(t) = \sum_{j=1}^p X_j \beta_j(t) + \varepsilon(t),\ \text{for}\ t\in\mathcal{T},</math>
which is an FLM with functional responses and scalar covariates.
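For this special case, the coefficient functions can be estimated pointwise on the observation grid by ordinary least squares and then smoothed over <math>t</math>; a minimal sketch (with assumed inputs) is:
<syntaxhighlight lang="python">
import numpy as np

# Assumed setup: Xs is an n x p matrix of scalar covariates and Y is an n x mT matrix of
# response curves observed on a common grid.
def fit_functional_response_scalar_covariates(Xs, Y):
    """Pointwise least squares for Y(t) = sum_j X_j beta_j(t) + error(t)."""
    B, *_ = np.linalg.lstsq(Xs, Y, rcond=None)   # p x mT: row j is beta_j evaluated on the grid
    return B                                      # in practice each row is then smoothed over t
</syntaxhighlight>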
 
==== Functional concurrent models ====
Assuming that <math>\mathcal{S} = \mathcal{T}</math>, another model, known as the functional concurrent model, sometimes also referred to as the varying-coefficient model, is of the form
{{NumBlk|::|<math display="block">Y(t) = \alpha_0(t) + \alpha(t)X(t)+\varepsilon(t),\ \text{for}\ t\in\mathcal{T},</math>|{{EquationRef|6}}}}
where <math>\alpha_0</math> and <math>\alpha</math> are coefficient functions. Note that model ({{EquationNote|6}}) assumes the value of <math>Y</math> at time <math>t</math>, i.e., <math>Y(t)</math>, only depends on that of <math>X</math> at the same time, i.e., <math>X(t)</math>. Various estimation methods can be applied to model ({{EquationNote|6}}).<ref>{{Cite journal |last=Fan |first=Jianqing |last2=Zhang |first2=Wenyang |date= |title=Statistical estimation in varying coefficient models |url=https://projecteuclid.org/journals/annals-of-statistics/volume-27/issue-5/Statistical-estimation-in-varying-coefficient-models/10.1214/aos/1017939139.full |journal=The Annals of Statistics |volume=27 |issue=5 |pages=1491–1518 |doi=10.1214/aos/1017939139 |issn=0090-5364}}</ref><ref>{{Cite journal |last=Huang |first=Jianhua Z. |last2=Wu |first2=Colin O. |last3=Zhou |first3=Lan |date=2004 |title=Polynomial Spline Estimation and Inference for Varying Coefficient Models with Longitudinal Data |url=https://www.jstor.org/stable/24307415 |journal=Statistica Sinica |volume=14 |issue=3 |pages=763–788 |issn=1017-0405}}</ref><ref>{{Cite journal |last=Şentürk |first=Damla |last2=Müller |first2=Hans-Georg |date=2010-09-01 |title=Functional Varying Coefficient Models for Longitudinal Data |url=https://www.tandfonline.com/doi/abs/10.1198/jasa.2010.tm09228 |journal=Journal of the American Statistical Association |doi=10.1198/jasa.2010.tm09228 |issn=0162-1459|url-access=subscription }}</ref><br />
Adding multiple functional covariates, model ({{EquationNote|6}}) can also be extended to
<math display="block">Y(t) = \alpha_0(t) + \sum_{j=1}^p\alpha_j(t)X_j(t)+\varepsilon(t),\ \text{for}\ t\in\mathcal{T},</math>
where <math>X_1,\ldots,X_p</math> are multiple functional covariates with ___domain <math>\mathcal{T}</math> and <math>\alpha_0,\alpha_1,\ldots,\alpha_p</math> are the coefficient functions with the same ___domain.<ref name=wang:16/>
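A simple estimate of model ({{EquationNote|6}}) can be obtained by kernel-weighted least squares in the time direction: at each time point, observations from nearby time points are pooled with kernel weights and <math>\alpha_0(t)</math>, <math>\alpha(t)</math> are estimated by weighted least squares. The sketch below is illustrative only; the data layout and the bandwidth are assumptions:
<syntaxhighlight lang="python">
import numpy as np

# Assumed setup: X and Y are n x m matrices of covariate and response curves on grid t_grid;
# the bandwidth h of the Gaussian kernel in the time direction is an assumed tuning parameter.
def fit_concurrent_model(X, Y, t_grid, h=0.1):
    """Kernel-weighted least squares estimate of alpha_0(t) and alpha(t) in model (6)."""
    n, m = X.shape
    alpha0_hat, alpha1_hat = np.empty(m), np.empty(m)
    x, y = X.ravel(), Y.ravel()
    design = np.column_stack([np.ones(x.size), x])
    for j, t0 in enumerate(t_grid):
        w = np.exp(-0.5 * ((t_grid - t0) / h) ** 2)        # kernel weights over time points
        sw = np.sqrt(np.tile(w, n))                         # same weight for every subject
        coef, *_ = np.linalg.lstsq(design * sw[:, None], y * sw, rcond=None)
        alpha0_hat[j], alpha1_hat[j] = coef
    return alpha0_hat, alpha1_hat
</syntaxhighlight>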
 
== Functional nonlinear models ==
=== Functional polynomial models ===
Functional polynomial models are an extension of the FLMs with scalar responses, analogous to extending multivariate linear regression to [[polynomial regression]]. For a scalar response <math>Y</math> and a functional covariate <math>X(\cdot)</math> with ___domain <math>\mathcal{T}</math>, the simplest example of functional polynomial models is functional quadratic regression<ref name="yao:10">{{Cite journal |last=Yao |first=F. |last2=Muller |first2=H.-G. |date=2010-03-01 |title=Functional quadratic regression |url=https://academic.oup.com/biomet/article-lookup/doi/10.1093/biomet/asp069 |journal=Biometrika |language=en |volume=97 |issue=1 |pages=49–64 |doi=10.1093/biomet/asp069 |issn=0006-3444|url-access=subscription }}</ref>
<math display="block">Y = \alpha + \int_\mathcal{T}\beta(t)X^c(t)\,dt + \int_\mathcal{T} \int_\mathcal{T} \gamma(s,t) X^c(s)X^c(t) \,ds\,dt + \varepsilon,</math>
where <math>X^c(\cdot) = X(\cdot) - \mathbb{E}(X(\cdot))</math> is the centered functional covariate, <math>\alpha</math> is a scalar coefficient, <math>\beta(\cdot)</math> and <math>\gamma(\cdot,\cdot)</math> are coefficient functions with domains <math>\mathcal{T}</math> and <math>\mathcal{T}\times\mathcal{T}</math>, respectively, and <math>\varepsilon</math> is a random error with mean zero and finite variance. By analogy to FLMs with scalar responses, estimation of functional polynomial models can be obtained through expanding both the centered covariate <math>X^c</math> and the coefficient functions <math>\beta</math> and <math>\gamma</math> in an orthonormal basis.<ref name=yao:10/>
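A sketch of this truncation approach: the centered covariate is reduced to <math>K</math> basis scores, the design matrix contains the scores and their pairwise products, and the fitted coefficients are mapped back to <math>\beta(\cdot)</math> and a symmetric <math>\gamma(\cdot,\cdot)</math> (illustrative only; the inputs and truncation level are assumptions):
<syntaxhighlight lang="python">
import numpy as np
from itertools import combinations_with_replacement

# Assumed setup: X is an n x m matrix of curves on grid t_grid, phi is a K x m orthonormal
# basis, Y is the scalar response vector, and K is an assumed truncation level.
def fit_functional_quadratic(Y, X, phi, t_grid):
    """Truncated-basis least squares for functional quadratic regression."""
    K, m = phi.shape
    Xc = X - X.mean(axis=0)
    x = np.trapz(Xc[:, None, :] * phi[None, :, :], t_grid, axis=2)        # n x K linear scores
    pairs = list(combinations_with_replacement(range(K), 2))              # index pairs (k, l), k <= l
    quad = np.column_stack([x[:, k] * x[:, l] for k, l in pairs])         # quadratic terms
    design = np.column_stack([np.ones(len(Y)), x, quad])
    coef, *_ = np.linalg.lstsq(design, Y, rcond=None)
    alpha_hat = coef[0]
    beta_hat = coef[1:1 + K] @ phi                                        # beta(t) on the grid
    gamma_hat = np.zeros((m, m))                                          # gamma(s, t) on the grid
    for c, (k, l) in zip(coef[1 + K:], pairs):
        term = np.outer(phi[k], phi[l])
        gamma_hat += 0.5 * c * (term + term.T)                            # symmetrized surface
    return alpha_hat, beta_hat, gamma_hat
</syntaxhighlight>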
 
=== Functional single and multiple index models ===
A functional multiple index model is given by
<math display="block">Y = g\left(\int_{\mathcal{T}} X^c(t) \beta_1(t)\,dt, \ldots, \int_{\mathcal{T}} X^c(t) \beta_p(t)\,dt \right) + \varepsilon.</math>
Taking <math>p=1</math> yields a functional single index model. However, for <math>p>1</math>, this model is problematic due to the [[curse of dimensionality]]. With <math>p>1</math> and relatively small sample sizes, the estimator given by this model often has large variance.<ref name="chen:11">{{Cite journal |last=Chen |first=Dong |last2=Hall |first2=Peter |last3=Müller |first3=Hans-Georg |date= |title=Single and multiple index functional regression models with nonparametric link |url=https://projecteuclid.org/journals/annals-of-statistics/volume-39/issue-3/Single-and-multiple-index-functional-regression-models-with-nonparametric-link/10.1214/11-AOS882.full |journal=The Annals of Statistics |volume=39 |issue=3 |pages=1720–1747 |doi=10.1214/11-AOS882 |issn=0090-5364|arxiv=1211.5018 }}</ref> An alternative <math>p</math>-component functional multiple index model can be expressed as
<math display="block">Y = g_1\left(\int_{\mathcal{T}} X^c(t) \beta_1(t)\,dt\right)+ \cdots+ g_p\left(\int_{\mathcal{T}} X^c(t) \beta_p(t)\,dt \right) + \varepsilon.</math>
Estimation methods for functional single and multiple index models are available.<ref name=chen:11/><ref>{{Cite journal |last=Jiang |first=Ci-Ren |last2=Wang |first2=Jane-Ling |date= |title=Functional single index models for longitudinal data |url=https://projecteuclid.org/journals/annals-of-statistics/volume-39/issue-1/Functional-single-index-models-for-longitudinal-data/10.1214/10-AOS845.full |journal=The Annals of Statistics |volume=39 |issue=1 |pages=362–388 |doi=10.1214/10-AOS845 |issn=0090-5364|arxiv=1103.1726 }}</ref>
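A rough profile-type sketch for the functional single index model (<math>p=1</math>): the coefficient function is parametrized by basis scores, the link <math>g</math> is estimated by a kernel smoother for each candidate direction, and the direction is chosen to minimize the residual sum of squares. This is an illustrative simplification, not the estimators of the cited references:
<syntaxhighlight lang="python">
import numpy as np
from scipy.optimize import minimize

# Assumed setup: scores is an n x K matrix of basis scores of the centered covariate, so that
# the index  int X^c(t) beta_1(t) dt  equals  scores @ b  for basis coefficients b, and Y is
# the scalar response; the Gaussian-kernel bandwidth h is an assumed tuning parameter.
def fit_single_index(Y, scores, h=0.5):
    """Rough profile least squares: smooth the link for each direction b, minimize the RSS."""
    def smooth(index):
        w = np.exp(-0.5 * ((index[:, None] - index[None, :]) / h) ** 2)   # kernel weights
        return (w @ Y) / w.sum(axis=1)                                     # fitted link values
    def rss(b):
        b = b / np.linalg.norm(b)                  # identifiability: the direction has unit norm
        return np.sum((Y - smooth(scores @ b)) ** 2)
    res = minimize(rss, np.ones(scores.shape[1]), method="Nelder-Mead")
    b_hat = res.x / np.linalg.norm(res.x)          # basis coefficients of beta_1
    return b_hat
</syntaxhighlight>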
 
=== Functional additive models (FAMs) ===
Given an expansion of a functional covariate <math>X</math> with ___domain <math>\mathcal{T}</math> in an orthonormal basis <math>\{\phi_k\}_{k=1}^\infty</math>: <math>X(t) = \sum_{k=1}^\infty x_k \phi_k(t)</math>, a functional linear model with scalar responses shown in model ({{EquationNote|2}}) can be written as
<math display="block">\mathbb{E}(Y|X)=\mathbb{E}(Y) + \sum_{k=1}^\infty \beta_k x_k.</math>
One form of FAMs is obtained by replacing the linear function of <math>x_k</math>, i.e., <math>\beta_k x_k</math>, by a general smooth function <math>f_k</math>,
<math display="block">\mathbb{E}(Y|X)=\mathbb{E}(Y) + \sum_{k=1}^\infty f_k(x_k),</math>
where <math>f_k</math> satisfies <math>\mathbb{E}(f_k(x_k))=0</math> for <math>k\in\mathbb{N}</math>.<ref name=wang:16/><ref>{{Cite journal |last=Müller |first=Hans-Georg |last2=Yao |first2=Fang |date=2008-12-01 |title=Functional Additive Models |url=https://www.tandfonline.com/doi/abs/10.1198/016214508000000751 |journal=Journal of the American Statistical Association |doi=10.1198/016214508000000751 |issn=0162-1459|url-access=subscription }}</ref> Another form of FAMs consists of a sequence of time-additive models:
<math display="block">\mathbb{E}(Y|X(t_1),\ldots,X(t_p))=\sum_{j=1}^p f_j(X(t_j)),</math>
where <math>\{t_1,\ldots,t_p\}</math> is a dense grid on <math>\mathcal{T}</math> with increasing size <math>p\in\mathbb{N}</math>, and <math>f_j(x) = g(t_j,x)</math> with <math>g</math> a smooth function, for <math>j=1,\ldots,p</math>.<ref name=wang:16/><ref>{{Cite journal |last=Fan |first=Yingying |last2=James |first2=Gareth M. |last3=Radchenko |first3=Peter |date= |title=Functional additive regression |url=https://projecteuclid.org/journals/annals-of-statistics/volume-43/issue-5/Functional-additive-regression/10.1214/15-AOS1346.full |journal=The Annals of Statistics |volume=43 |issue=5 |pages=2296–2325 |doi=10.1214/15-AOS1346 |issn=0090-5364|arxiv=1510.04064 }}</ref>
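A sketch of fitting the first form of FAMs: each component function <math>f_k</math> is estimated by smoothing the centered responses against the corresponding basis score (reasonable when the scores are approximately uncorrelated, as for functional principal component scores; otherwise backfitting over components is used instead). The smoother, bandwidth, and inputs below are assumptions for illustration:
<syntaxhighlight lang="python">
import numpy as np

# Assumed setup: scores is an n x K matrix of basis (e.g., functional principal component)
# scores of the centered covariate and Y is the scalar response; each f_k is estimated by a
# Nadaraya-Watson smoother of the centered responses against the k-th score.
def fit_fam(Y, scores, grid_size=50, h=0.5):
    """Componentwise kernel smoothing for the first form of FAMs."""
    Yc = Y - Y.mean()
    components = []
    for k in range(scores.shape[1]):
        xk = scores[:, k]
        grid = np.linspace(xk.min(), xk.max(), grid_size)
        w = np.exp(-0.5 * ((grid[:, None] - xk[None, :]) / h) ** 2)   # kernel weights
        fk = (w @ Yc) / w.sum(axis=1)                                  # f_k evaluated on the grid
        fk -= fk.mean()                                                # roughly centers f_k
        components.append((grid, fk))
    return Y.mean(), components                                        # E(Y) and the f_k estimates
</syntaxhighlight>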
 
== Extensions ==
A direct extension of FLMs with scalar responses shown in model ({{EquationNote|2}}) is to add a link function to create a [[generalized functional linear model]] (GFLM) by analogy to extending [[linear regression]] to [[Generalized linear model|generalized linear regression]] (GLM), of which the three components are:
# Linear predictor <math>\eta = \beta_0 + \int_{\mathcal{T}} X^c(t)\beta(t)\,dt</math>;
# [[Variance function]] <math>\text{Var}(Y|X) = V(\mu)</math>, where <math>\mu = \mathbb{E}(Y|X)</math> is the [[Conditional expectation|conditional mean]];
# Link function <math>g</math> connecting the conditional mean and the linear predictor through <math>\mu=g(\eta)</math>.
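For example, with a binary response, taking <math>g(\eta)=1/(1+e^{-\eta})</math> and the Bernoulli variance function <math>V(\mu)=\mu(1-\mu)</math> gives a functional logistic regression; after truncating the covariate to <math>K</math> basis scores, the GFLM reduces to an ordinary generalized linear model in the scores, which can be fitted by iteratively reweighted least squares. A minimal sketch under these assumptions:
<syntaxhighlight lang="python">
import numpy as np

# Assumed setup: X holds n covariate curves on grid t_grid, phi is a K x m orthonormal basis,
# and y is a binary (0/1) response vector; names and the truncation level are illustrative.
def fit_logistic_gflm(y, X, phi, t_grid, n_iter=25):
    """GFLM with logistic g fitted by IRLS on the truncated basis scores."""
    Xc = X - X.mean(axis=0)
    scores = np.trapz(Xc[:, None, :] * phi[None, :, :], t_grid, axis=2)   # n x K
    D = np.column_stack([np.ones(len(y)), scores])                        # design with intercept
    coef = np.zeros(D.shape[1])
    for _ in range(n_iter):                                               # IRLS iterations
        eta = D @ coef                                                    # linear predictor
        mu = 1.0 / (1.0 + np.exp(-eta))                                   # conditional mean g(eta)
        w = np.clip(mu * (1.0 - mu), 1e-8, None)                          # Bernoulli variance V(mu)
        z = eta + (y - mu) / w                                            # working response
        coef = np.linalg.solve(D.T @ (D * w[:, None]), D.T @ (w * z))
    return coef[0], coef[1:] @ phi                                        # beta_0 and beta(t) on the grid
</syntaxhighlight>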
 
== See also ==
* [[Functional data analysis]]
* [[Functional principal component analysis]]
* [[Generalized functional linear model]]
* [[Generalized linear model]]
* [[Karhunen–Loève theorem]]
* [[Lp space]]
* [[Stochastic processes]]
 
== References ==
<references/>
 
[[Category:Regression analysis]]