Comparison of Gaussian process software

{{short description|Comparison of statistical analysis software that allows doing inference with Gaussian processes}}
This is a comparison of statistical analysis software that allows doing inference with [[Gaussian process]]es, often using [[Gaussian process approximations|approximations]].
 
This article is written from the point of view of [[Bayesian statistics]], whose terminology may differ from the one commonly used in [[kriging]]. The next section clarifies the mathematical/computational meaning of the information provided in the table, independently of contextual terminology.
* '''Exact''': whether ''generic'' exact algorithms are implemented. These algorithms are usually appropriate only up to a few thousand datapoints (see the sketch after this list).
* '''Specialized''': whether specialized ''exact'' algorithms for specific classes of problems are implemented. Supported specialized algorithms may be indicated as:
** ''Kronecker'': algorithms for separable kernels on grid data.<ref name="gilboa2015" />
** ''Toeplitz'': algorithms for stationary kernels on uniformly spaced data.<ref name="zhang2005" />
** ''Semisep.'': algorithms for semiseparable covariance matrices.<ref name="foreman2017" />
** ''Sparse'': algorithms optimized for [[Sparse matrix|sparse]] covariance matrices.
** ''Block'': algorithms optimized for [[Block matrix#Block diagonal matrices|block diagonal]] covariance matrices.
** ''Markov'': algorithms for kernels which represent (or can be formulated as) a Markov process.<ref name="sarkka2013" />
* '''Approximate''': whether ''generic or specialized'' approximate algorithms are implemented. Supported approximate algorithms may be indicated as:
** ''Sparse'': algorithms based on choosing a set of "inducing points" in input space,<ref name="candela2005" /> or, more generally, imposing a sparse structure on the inverse of the covariance matrix.
** ''Hierarchical'': algorithms which approximate the covariance matrix with a [[hierarchical matrix]].<ref name="ambikasaran2016" />
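
The sketch below illustrates, with plain NumPy, what the ''generic exact'' algorithm referred to above computes: the posterior mean and variance follow from a [[Cholesky decomposition|Cholesky factorization]] of the <math>n \times n</math> covariance matrix, which is the <math>O(n^3)</math> step that limits exact solvers to a few thousand datapoints. The kernel, data, and hyperparameter values are purely illustrative and do not correspond to any particular package's API.

<syntaxhighlight lang="python">
# Minimal sketch of exact GP regression (illustrative, not any package's API).
import numpy as np

def sq_exp_kernel(x1, x2, scale=1.0, length=1.0):
    """Squared-exponential covariance between two sets of 1D inputs."""
    d = x1[:, None] - x2[None, :]
    return scale**2 * np.exp(-0.5 * (d / length) ** 2)

rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 50)                   # training inputs
y = np.sin(x) + 0.1 * rng.standard_normal(50)    # noisy observations
xs = np.linspace(0.0, 10.0, 200)                 # test inputs

noise_var = 0.1 ** 2
K = sq_exp_kernel(x, x) + noise_var * np.eye(len(x))   # n x n training covariance
Ks = sq_exp_kernel(xs, x)                               # test/train covariance

L = np.linalg.cholesky(K)                # the O(n^3) step of exact inference
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))

post_mean = Ks @ alpha                                              # posterior mean at xs
v = np.linalg.solve(L, Ks.T)
post_var = np.diag(sq_exp_kernel(xs, xs)) - np.sum(v**2, axis=0)    # posterior variance at xs
</syntaxhighlight>

The specialized and approximate solvers in the table replace this dense factorization with structure-exploiting (Kronecker, Toeplitz, semiseparable, sparse, block, Markov) or approximate (inducing-point, hierarchical) linear algebra.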
 
=== Input ===
 
* '''ND''': whether multidimensional input is supported. If it is, multidimensional output is always possible by adding a dimension to the input, even without direct support (see the sketch after this list).
* '''Non-real''': whether arbitrary non-[[Real numbers|real]] input is supported (for example, text or [[complex number]]s).
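
As an illustration of the remark above about obtaining multidimensional output from multidimensional input, the following hedged sketch (names and data are invented for the example) rewrites two outputs observed at the same inputs as a single-output dataset by appending an output-index column to the input:

<syntaxhighlight lang="python">
# Illustrative sketch: encode two outputs as one output plus an index input.
import numpy as np

x = np.linspace(0.0, 1.0, 30)
y1 = np.sin(2 * np.pi * x)          # first output (illustrative)
y2 = np.cos(2 * np.pi * x)          # second output (illustrative)

X_aug = np.column_stack([np.tile(x, 2),                   # original input, stacked twice
                         np.repeat([0.0, 1.0], len(x))])  # output-index column: 0 or 1
y_aug = np.concatenate([y1, y2])

# (X_aug, y_aug) can be passed to any package supporting ND input; how the kernel
# treats the index dimension then determines the correlation between the outputs.
</syntaxhighlight>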
 
=== Output ===
These columns concern finding the values of variables that enter the definition of the specific problem but cannot be inferred from the Gaussian process fit itself, for example parameters in the formula of the kernel.
 
* '''Prior''': whether specifying arbitrary [[hyperprior]]s on the [[Hyperparameter (Bayesian statistics)|hyperparameter]]s is supported.
* '''Posterior''': whether estimating the posterior is supported beyond [[point estimation]], possibly in conjunction with other software (a sketch follows this list).
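
The following sketch, which reuses the exact solver from the earlier example and invents a simple log-normal hyperprior for illustration, shows the distinction between the two columns: combining the log marginal likelihood with a log hyperprior and maximizing it gives a [[Maximum a posteriori estimation|MAP]] point estimate of a kernel hyperparameter, whereas the ''Posterior'' column refers to characterizing the whole posterior over hyperparameters, typically by sampling.

<syntaxhighlight lang="python">
# Illustrative sketch of hyperparameter estimation with a hyperprior
# (a grid search stands in for a real optimizer or sampler).
import numpy as np

def log_marginal_likelihood(x, y, length, noise_var=0.01):
    """Log p(y | x, length) for a squared-exponential kernel plus i.i.d. noise."""
    d = x[:, None] - x[None, :]
    K = np.exp(-0.5 * (d / length) ** 2) + noise_var * np.eye(len(x))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return -0.5 * y @ alpha - np.log(np.diag(L)).sum() - 0.5 * len(x) * np.log(2 * np.pi)

rng = np.random.default_rng(1)
x = np.linspace(0.0, 10.0, 40)
y = np.sin(x) + 0.1 * rng.standard_normal(40)

lengths = np.geomspace(0.05, 5.0, 200)
log_prior = -0.5 * np.log(lengths) ** 2     # log-normal(0, 1) hyperprior, up to a constant
log_post = np.array([log_marginal_likelihood(x, y, l) for l in lengths]) + log_prior

map_length = lengths[np.argmax(log_post)]   # "Prior" + point estimation: a MAP estimate
# Estimating the full posterior over the length scale ("Posterior" column) would instead
# weight predictions by exp(log_post), e.g. via MCMC in a probabilistic-programming tool.
</syntaxhighlight>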
 
 
== Comparison table ==
{{sort-under}}
 
{| class="wikitable sortable sort-under" style="font-size: 90%; text-align: center; width: auto;"
|-
! rowspan="2" | Name
! colspan="2" | Input
! colspan="2" | Output
! colspan="2" | [[Hyperparameter (Bayesian statistics)|Hyperparameters]]
! colspan="3" | [[Linear transformations]]
! rowspan="2" | Name
|-
! Exact
! {{verth|Specialized}}
! {{verth|Approxi&shy;mate}}
! ND
! {{verth|Non-real}}
! Likelihood
! Errors
! Prior
! Posterior
! {{verth|Derivative}}
! Finite
! Sum
|-
! [[PyMC]]
| {{free|[[Apache License|Apache]]}}
| [[Python (programming language)|Python]]
| {{yes}}
| {{yes}}
! [[PyMC]]
|-
! [[Stan (software)|Stan]]
! [[scikit-learn]]
|-
! [http://www.cs.toronto.edu/%7Eradford/fbm.software.html fbm]<br/><ref name="vanhatalo2013" />
| {{free}}
| [[C (programming language)|C]]
! [http://www.cs.toronto.edu/%7Eradford/fbm.software.html fbm]
|-
! [http://www.gaussianprocess.org/gpml/code/matlab/doc/index.html GPML]<br/><ref name="rasmussen2010" /><ref name="vanhatalo2013" />
| {{BSD-lic}}
| [[MATLAB]]
! [http://www.gaussianprocess.org/gpml/code/matlab/doc/index.html GPML]
|-
! [https://research.cs.aalto.fi/pml/software/gpstuff/ GPstuff]<br/><ref name="vanhatalo2013" />
| {{GPL-lic}}
| [[MATLAB]], [[R (programming language)|R]]
| {{yes}}
| {{yes|Sparse, Markov}}
| {{yes|Sparse}}
| {{yes|ND}}
! [https://research.cs.aalto.fi/pml/software/gpstuff/ GPstuff]
|-
! [https://sheffieldml.github.io/GPy/ GPy]<br/><ref name="matthews2017" />
| {{BSD-lic}}
| [[Python (programming language)|Python]]
! [https://sheffieldml.github.io/GPy/ GPy]
|-
! [https://www.gpflow.org GPflow]<br/><ref name="matthews2017" />
| {{free|[[Apache License|Apache]]}}
| [[Python (programming language)|Python]]
! [https://www.gpflow.org GPflow]
|-
! [https://gpytorch.ai GPyTorch]<br/><ref name="gardner2018" />
| {{free|[[MIT License|MIT]]}}
| [[Python (programming language)|Python]]
! [https://gpytorch.ai GPyTorch]
|-
! [https://CRAN.R-project.org/package=GPvecchia GPvecchia]<br/><ref name="zilber2021" />
| {{GPL-lic}}
| [[R (programming language)|R]]
| {{yes}}
| {{no}}
| {{yes|Sparse, Hierarch&shy;ical}}
| {{yes|ND}}
| {{no}}
! [https://CRAN.R-project.org/package=GPvecchia GPvecchia]
|-
! [https://github.com/marionmari/pyGPs pyGPs]<br/><ref name="neumann2015" />
| {{BSD-lic}}
| [[Python (programming language)|Python]]
! [https://github.com/marionmari/pyGPs pyGPs]
|-
! [https://CRAN.R-project.org/package=gptk gptk]<br/><ref name="kalaitzis2011" />
| {{BSD-lic}}
| [[R (programming language)|R]]
! [https://CRAN.R-project.org/package=gptk gptk]
|-
! [https://celerite.readthedocs.io/en/stable/ celerite]<br/><ref name="foreman2017" />
| {{free|[[MIT License|MIT]]}}
| [[Python (programming language)|Python]], [[Julia (programming language)|Julia]], [[C++]]
| {{no}}
| {{yes|Semisep.}}{{efn|name=celerite|celerite implements only a specific subalgebra of kernels which can be solved in <math>O(n)</math>.<ref name="foreman2017" />}}
| {{no}}
| {{no|1D}}
! [https://celerite.readthedocs.io/en/stable/ celerite]
|-
! [http://george.readthedocs.io george]<br/><ref name="ambikasaran2016" />
| {{free|[[MIT License|MIT]]}}
| [[Python (programming language)|Python]], [[C++]]
| {{yes}}
| {{no}}
| {{yes|Hierarch&shy;ical}}
| {{yes|ND}}
| {{no}}
! [http://george.readthedocs.io george]
|-
! [https://github.com/google/neural-tangents neural-tangents]<br/><ref name="novak2020" />{{efn|neural-tangents is a specialized package for infinitely wide neural networks.}}
| {{free|[[Apache License|Apache]]}}
| [[Python (programming language)|Python]]
! [https://github.com/google/neural-tangents neural-tangents]
|-
! [https://cran.r-project.org/package=DiceKriging DiceKriging]<br/><ref name="roustant2012" />
| {{GPL-lic}}
| [[R (programming language)|R]]
! [https://cran.r-project.org/package=DiceKriging DiceKriging]
|-
! [https://openturns.github.io/www/ OpenTURNS]<br/><ref name="baudin2015" />
| {{LGPL-lic}}
| [[Python (programming language)|Python]], [[C++]]
! [https://openturns.github.io/www/ OpenTURNS]
|-
! [http://www.uqlab.com/ UQLab]<br/><ref name="marelli2014" />
| {{proprietary}}
| [[MATLAB]]
! [http://www.uqlab.com/ UQLab]
|-
! [https://www.sumo.ilabt.imec.be/home/software/oodace ooDACE] <ref name="couckuyt2014" />
| {{proprietary}}
| [[MATLAB]]
| {{no}}
| {{no}}
! [https://www.sumo.ilabt.imec.be/home/software/oodace ooDACE]
|-
! [http://www.omicron.dk/dace.html DACE]
| {{no}}
| {{no|Gaussian}}
| {{yes}}
| {{yes}}
| {{yes}}
| {{yes}}
| {{no}}
| {{no}}
| {{yes}}
! [https://celerite2.readthedocs.io/en/latest/ celerite2]
|-
! [https://smt.readthedocs.io/en/latest/ SMT]<br/><ref name="saves2024" /><ref name="bouhlel2019" />
| {{free|[[BSD licenses|BSD]]}}
| [[Python (programming language)|Python]]
| {{yes}}
| {{no}}
| {{yes|Sparse, PODI{{efn|name=PODI| PODI (Proper Orthogonal Decomposition + Interpolation) is an approximation for high-dimensional multioutput regressions. The regression function is lower-dimensional than the outcomes, and the subspace is chosen with the PCA of the (outcome, dependent variable) data. Each principal component is modeled with an a priori independent Gaussian process.<ref name="Porrello24" />}}, other}}
| {{yes|ND}}
| {{no}}
| {{no|Gaussian}}
| {{partial|i.i.d.}}
| {{partial|Some}}
| {{partial|Some}}
| {{partial|First}}
| {{no}}
| {{no}}
! [https://smt.readthedocs.io/en/latest/ SMT]
|-
! [https://gpjax.readthedocs.io/en/latest/ GPJax]
! [https://gpjax.readthedocs.io/en/latest/ GPJax]
|-
! [https://github.com/wesselb/stheno Stheno]
| {{free|[[MIT License|MIT]]}}
| [[Python (programming language)|Python]]
| {{partial|Manually}}
| {{partial|Manually}}
| {{partial|Approxi&shy;mate}}
| {{no}}
| {{yes}}
! [https://github.com/wesselb/stheno Stheno]
|-
! [https://docs.rs/egobox-gp/latest/egobox_gp/ Egobox-gp]<br/><ref name="Lafage2022" />
| {{free|[[Apache License|Apache]]}}
| [[Rust (programming language)|Rust]]
| {{yes}}
| {{no}}
| {{yes|Sparse}}
| {{yes|ND}}
| {{no}}
| {{no|Gaussian}}
| {{partial|i.i.d.}}
| {{no}}
| {{partial|MAP}}
| {{partial|First}}
| {{no}}
| {{no}}
! [https://docs.rs/egobox-gp/latest/egobox_gp/ Egobox-gp]
|-
! rowspan="2" | Name
! rowspan="2" | [[Programming language|Language]]
! Exact
! {{verth|Specialized}}
! {{verth|Approxi&shy;mate}}
! ND
! {{verth|Non-real}}
! Likelihood
! Errors
! Prior
! Posterior
! {{verth|Derivative}}
! Finite
! Sum
{{reflist|refs=
 
<ref name="foreman2017">{{cite journal |last1=Foreman-Mackey |first1=Daniel |last2=Angus |first2=Ruth |last3=Agol |first3=Eric |last4=Ambikasaran |first4=Sivaram |s2cid=88521913 |title=Fast and Scalable Gaussian Process Modeling with Applications to Astronomical Time Series |journal=The Astronomical Journal |date=9 November 2017 |volume=154 |issue=6 |page=220 |doi=10.3847/1538-3881/aa9332|arxiv=1703.09710 |bibcode=2017AJ....154..220F |doi-access=free }}</ref>
 
<ref name="gilboa2015">{{cite journal |last1=P. Cunningham |first1=John |last2=Gilboa |first2=Elad |last3=Saatçi |first3=Yunus |s2cid=6878550 |title=Scaling Multidimensional Inference for Structured Gaussian Processes |journal=IEEE Transactions on Pattern Analysis and Machine Intelligence |date=Feb 2015 |volume=37 |issue=2 |pages=424–436 |doi=10.1109/TPAMI.2013.192|pmid=26353252 |arxiv=1209.4120 |bibcode=2015ITPAM..37..424G }}</ref>
 
<ref name="zhang2005">{{cite journalbook |last1=Leith |first1=D. J. |last2=Zhang |first2=Yunong |last3=Leithead |first3=W. E. |s2cidtitle=13627455Proceedings of the 44th IEEE Conference on Decision and Control |titlechapter=Time-series Gaussian Process Regression Based on Toeplitz Computation of O(N2) Operations and O(N)-level Storage |journals2cid=Proceedings of the 44th IEEE Conference on Decision and Control13627455 |date=2005 |pages=3711–3716 |doi=10.1109/CDC.2005.1582739|isbn=0-7803-9567-0 }}</ref>
 
<ref name="candela2005">{{cite journal |last1=Quiñonero-Candela |first1=Joaquin |last2=Rasmussen |first2=Carl Edward |title=A Unifying View of Sparse Approximate Gaussian Process Regression |journal=Journal of Machine Learning Research |date=5 December 2005 |volume=6 |pages=1939–1959 |url=http://www.jmlr.org/papers/v6/quinonero-candela05a.html |accessdate=23 May 2020}}</ref>
<ref name="ambikasaran2016">{{cite journal |last1=Ambikasaran |first1=S. |last2=Foreman-Mackey |first2=D. |last3=Greengard |first3=L. |last4=Hogg |first4=D. W. |last5=O’Neil |first5=M. |s2cid=15206293 |title=Fast Direct Methods for Gaussian Processes |journal=IEEE Transactions on Pattern Analysis and Machine Intelligence |date=1 Feb 2016 |volume=38 |issue=2 |pages=252–265 |doi=10.1109/TPAMI.2015.2448083|pmid=26761732 |arxiv=1403.6015 }}</ref>
 
<ref name="neumann2015">{{cite journal |last1=Neumann |first1=Marion |last2=Huang |first2=Shan |last3=E. Marthaler |first3=Daniel |last4=Kersting |first4=Kristian |title=pyGPs -- A Python Library for Gaussian Process Regression and Classification |journal=Journal of Machine Learning Research |date=2015 |volume=16 |pages=2611–2616 |url=http://jmlr.org/papers/v16/neumann15a.html}}</ref>
 
<ref name="gardner2018">{{cite journal |last1=Gardner |first1=Jacob R |last2=Pleiss |first2=Geoff |last3=Bindel |first3=David |last4=Weinberger |first4=Kilian Q |last5=Wilson |first5=Andrew Gordon |title=GPyTorch: Blackbox Matrix-Matrix Gaussian Process Inference with GPU Acceleration |journal=Advances in Neural Information Processing Systems |date=2018 |volume=31 |pages=7576–7586 |arxiv=1809.11165 |url=http://papers.nips.cc/paper/7985-gpytorch-blackbox-matrix-matrix-gaussian-process-inference-with-gpu-acceleration.pdf |accessdate=23 May 2020}}</ref>
<ref name="vanhatalo2013">{{cite journal |last1=Vanhatalo |first1=Jarno |last2=Riihimäki |first2=Jaakko |last3=Hartikainen |first3=Jouni |last4=Jylänki |first4=Pasi |last5=Tolvanen |first5=Ville |last6=Vehtari |first6=Aki |title=GPstuff: Bayesian Modeling with Gaussian Processes |journal=Journal of Machine Learning Research |date=Apr 2013 |volume=14 |pages=1175−1179 |url=http://jmlr.csail.mit.edu/papers/v14/vanhatalo13a.html |accessdate=23 May 2020}}</ref>
 
<ref name="marelli2014">{{cite journal |last1=Marelli |first1=Stefano |last2=Sudret |first2=Bruno |title=UQLab: a framework for uncertainty quantification in MATLAB |journal=Vulnerability, Uncertainty, and Risk. Quantification, Mitigation, and Management |date=2014 |pages=2554–2563 |doi=10.3929/ethz-a-010238238 |isbn=978-0-7844-1360-9 |url=https://www.research-collection.ethz.ch/bitstream/handle/20.500.11850/379365/eth-14488-01.pdf?sequence=1&isAllowed=y |accessdate=28 May 2020}}</ref>
 
<ref name="matthews2017">{{cite journal |last1=Matthews |first1=Alexander G. de G. |last2=van der Wilk |first2=Mark |last3=Nickson |first3=Tom |last4=Fujii |first4=Keisuke |last5=Boukouvalas |first5=Alexis |last6=León-Villagrá |first6=Pablo |last7=Ghahramani |first7=Zoubin |last8=Hensman |first8=James |title=GPflow: A Gaussian process library using TensorFlow |journal=Journal of Machine Learning Research |date=April 2017 |volume=18 |issue=40 |pages=1–6 |arxiv=1610.08733 |url=http://jmlr.org/papers/v18/16-537.html |accessdate=6 July 2020}}</ref>
<ref name="couckuyt2014">{{cite journal |last1=Couckuyt |first1=Ivo |last2=Dhaene |first2=Tom |last3=Demeester |first3=Piet |title=ooDACE toolbox: a flexible object-oriented Kriging implementation |journal=Journal of Machine Learning Research |date=2014 |volume=15 |pages=3183–3186 |url=http://www.jmlr.org/papers/volume15/couckuyt14a/couckuyt14a.pdf |accessdate=8 July 2020}}</ref>
 
<ref name="zilber2021">{{cite journal |last1=Zilber |first1=Daniel |last2=Katzfuss |first2=Matthias |title=Vecchia–Laplace approximations of generalized Gaussian processes for big non-Gaussian spatial data |journal=Computational Statistics & Data Analysis |date=January 2021 |volume=153 |article-number=107081 |doi=10.1016/j.csda.2020.107081 |arxiv=1906.07828 |s2cid=195068888 |url=https://www.sciencedirect.com/science/article/pii/S0167947320301729 |access-date=1 September 2021 |issn=0167-9473}}</ref>
 
<ref name="kalaitzis2011">{{cite journal |last1=Kalaitzis |first1=Alfredo |last2=Lawrence |first2=Neil D. |title=A Simple Approach to Ranking Differentially Expressed Gene Expression Time Courses through Gaussian Process Regression |journal=BMC Bioinformatics |date=May 20, 2011 |volume=12 |issue=1 |pages=180 |doi=10.1186/1471-2105-12-180 |pmid=21599902 |pmc=3116489 |issn=1471-2105 |doi-access=free }}</ref>
 
<ref name="roustant2012">{{cite journal |last1=Roustant |first1=Olivier |last2=Ginsbourger |first2=David |last3=Deville |first3=Yves |title=DiceKriging, DiceOptim: Two R Packages for the Analysis of Computer Experiments by Kriging-Based Metamodeling and Optimization |journal=Journal of Statistical Software |date=2012 |volume=51 |issue=1 |pages=1–55 |doi=10.18637/jss.v051.i01 |s2cid=60672249 |url=https://www.jstatsoft.org/v51/i01/|doi-access=free }}</ref>
 
<ref name="baudin2015">{{cite journalbook |first1=Michaël |last1=Baudin |first2=Anne |last2=Dutfoy |first3=Bertrand |last3=Iooss |first4=Anne-Laure |last4=Popelin |title=OpenHandbook TURNSof Uncertainty Quantification |chapter=OpenTURNS: An industrialIndustrial softwareSoftware for uncertaintyUncertainty quantificationQuantification in simulationSimulation |date=2015 |pages=1–38 |editor1= Roger Ghanem|editor2= David Higdon|editor3= Houman Owhadi|doi=10.1007/978-3-319-11259-6_64-1 |arxiv=1501.05242|isbn=978-3-319-11259-6 |s2cid=88513894 }}</ref>
 
<ref name="sarkka2013">{{cite journal |last1=Sarkka |first1=Simo |last2=Solin |first2=Arno |last3=Hartikainen |first3=Jouni |title=Spatiotemporal Learning via Infinite-Dimensional Bayesian Filtering and Smoothing: A Look at Gaussian Process Regression Through Kalman Filtering |journal=IEEE Signal Processing Magazine |date=2013 |volume=30 |issue=4 |pages=51–61 |doi=10.1109/MSP.2013.2246292 |urls2cid=https://ieeexplore.ieee.org/abstract/document/65307367485363 |access-date=2 September 2021}}</ref>
 
<ref name="bouhlel2019">{{cite journal |last1=Saves|first1=Paul |last2=Lafage|first2=Rémi |last3=Bartoli |first3=Nathalie |last4=Diouane |first4= Youssef |last5=Bussemaker |first5= Jasper |last6=Lefebvre |first6= Thierry |last7=Hwang |first7= John T. |last8= Morlier |first8= Joseph |last9= Martins |first9= Joaquim R.R.A. |title=SMT 2.0: A Surrogate Modeling Toolbox with a focus on hierarchical and mixed variables Gaussian processes |journal=Advances in Engineering Software |date=2024 |volume=188|issue=1 |pages=103571 |doi=10.1016/j.advengsoft.2023.103571 |url=https://www.sciencedirect.com/science/article/pii/S096599782300162X|arxiv=2305.13998 }}</ref>
 
<ref name="kalaitzis2011saves2024">{{cite journal |last1=Kalaitzis Bouhlel|first1=AlfredoMohamed A. |last2=LawrenceHwang |first2=Neil DJohn T. |titlelast3=ABartoli Simple|first3=Nathalie Approach|last4=Lafage|first4=Rémi to Ranking|last5= DifferentiallyMorlier Expressed|first5= GeneJoseph Expression|last6= TimeMartins Courses|first6= throughJoaquim GaussianR.R.A. Process|title=A RegressionPython surrogate modeling framework with derivatives |journal=BMCAdvances Bioinformaticsin Engineering Software |date=May 20, 2011 2019|volume=12 135|issue=1 |pages=180 102662|doi=10.11861016/1471-2105-12-180j.advengsoft.2019.03.005 |url=https://bmcbioinformaticswww.biomedcentralsciencedirect.com/articlesscience/10.1186article/1471-2105-12-180 |access-date=1 September 2021 |issn=1471-2105pii/S0965997818309360}}</ref>
 
<ref name="roustant2012Porrello24">{{cite journalbook |last1=RoustantPorrello |first1=OlivierChristian |last2=GinsbourgerDubreuil |first2=DavidSylvain |last3=DevilleFarhat |first3=YvesCharbel |title=DiceKriging,AIAA DiceOptim:Aviation TwoForum Rand PackagesAscend for2024 the|chapter=Bayesian AnalysisFramework ofWith Computer Experiments by KrigingProjection-Based MetamodelingModel andOrder OptimizationReduction |journal=Journalfor ofEfficient StatisticalGlobal SoftwareOptimization |date=2012 |volume=51 |issue=12024 |pages=1–554580 |doi=10.186372514/jss6.v051.i012024-4580 |isbn=978-1-62410-716-0 |chapter-url=https://wwwarc.jstatsoftaiaa.org/v51doi/i01abs/10.2514/6.2024-4580}}</ref>
 
<ref name="Lafage2022">{{cite journal |last1=Lafage |first1=Rémi |title=egobox, a Rust toolbox for efficient global optimization |journal=Journal of Open Source Software |date=2022 |volume=7 |issue=78 |pages=4737 |doi=10.21105/joss.04737 |bibcode=2022JOSS....7.4737L |url=https://joss.theoj.org/papers/10.21105/joss.04737.pdf}}</ref>
<ref name="baudin2015">{{cite journal |first1=Michaël |last1=Baudin |first2=Anne |last2=Dutfoy |first3=Bertrand |last3=Iooss |first4=Anne-Laure |last4=Popelin |title=Open TURNS: An industrial software for uncertainty quantification in simulation |date=2015 |arxiv=1501.05242}}</ref>
 
<ref name="sarkka2013">{{cite journal |last1=Sarkka |first1=Simo |last2=Solin |first2=Arno |last3=Hartikainen |first3=Jouni |title=Spatiotemporal Learning via Infinite-Dimensional Bayesian Filtering and Smoothing: A Look at Gaussian Process Regression Through Kalman Filtering |journal=IEEE Signal Processing Magazine |date=2013 |volume=30 |issue=4 |pages=51–61 |doi=10.1109/MSP.2013.2246292 |url=https://ieeexplore.ieee.org/abstract/document/6530736 |access-date=2 September 2021}}</ref>
 
}}