General linear model

== Comparison to generalized linear model ==
The general linear model and the [[generalized linear model]] (GLM)<ref name=":0">{{Cite book |last1=McCullagh |first1=P. |author1-link=Peter McCullagh |last2=Nelder |first2=J. A. |author2-link=John Nelder |date=January 1, 1983 |chapter=An outline of generalized linear models |title=Generalized Linear Models |pages=21–47 |publisher=Springer US |isbn=9780412317606 |doi=10.1007/978-1-4899-3242-6_2 |doi-broken-date=12 July 2025}}</ref><ref>Fox, J. (2015). ''Applied regression analysis and generalized linear models''. Sage Publications.</ref> are two commonly used families of [[Statistics|statistical methods]] that relate one or more continuous and/or categorical [[Dependent and independent variables|predictors]] to a single [[Dependent and independent variables|outcome variable]].
 
The main difference between the two approaches is that the general linear model strictly assumes that the [[Errors and residuals|residuals]] follow a [[Conditional probability distribution|conditionally]] [[normal distribution]],<ref name=":1">{{cite report |last1=Cohen |first1=J. |last2=Cohen |first2=P. |last3=West |first3=S. G. |last4=Aiken |first4=L. S. |author4-link=Leona S. Aiken |date=2003 |title=Applied multiple regression/correlation analysis for the behavioral sciences}}</ref> while the GLM loosens this assumption and allows a variety of other [[Distribution (mathematics)|distributions]] from the [[exponential family]] for the residuals.<ref name=":0"/> The general linear model is thus the special case of the GLM in which the distribution of the residuals is conditionally normal.
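As a minimal sketch of this special-case relationship (assuming the Python ''statsmodels'' library, which is not mentioned in the article), fitting the same data by ordinary least squares (a general linear model) and by a GLM with a Gaussian family and its default identity link should produce the same coefficient estimates.

<syntaxhighlight lang="python">
# Sketch only: compares an OLS fit with a Gaussian-family GLM fit on simulated data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
X = sm.add_constant(rng.normal(size=(100, 2)))              # design matrix with intercept column
y = X @ np.array([1.0, 2.0, -0.5]) + rng.normal(size=100)   # conditionally normal outcome

ols_fit = sm.OLS(y, X).fit()                                 # general linear model (OLS)
glm_fit = sm.GLM(y, X, family=sm.families.Gaussian()).fit()  # GLM with Gaussian family, identity link

print(ols_fit.params)  # coefficient estimates from OLS ...
print(glm_fit.params)  # ... match those from the Gaussian-family GLM
</syntaxhighlight>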