== Sparse Subspace Learning ==
{{Confusing section|date=April 2021}}
The [https://hal.archives-ouvertes.fr/hal-01925360/document Sparse Subspace Learning] (SSL) method leverages hierarchical collocation to approximate the numerical solution of parametric models. Compared with traditional projection-based reduced order modeling, the use of collocation enables a non-intrusive approach based on sparse adaptive sampling of the parametric space. This makes it possible to recover the low-dimensional structure of the parametric solution subspace while also learning the functional dependency on the parameters in explicit form. A sparse low-rank approximate tensor representation of the parametric solution can be built through an incremental strategy that only needs access to the output of a deterministic solver. Non-intrusiveness makes this approach straightforwardly applicable to challenging problems characterized by nonlinearity or non-affine weak forms.<ref>{{Cite journal|last1=Borzacchiello|first1=Domenico|last2=Aguado|first2=José V.|last3=Chinesta|first3=Francisco|date=April 2019|title=Non-intrusive Sparse Subspace Learning for Parametrized Problems|url=http://link.springer.com/10.1007/s11831-017-9241-4|journal=Archives of Computational Methods in Engineering|language=en|volume=26|issue=2|pages=303–326|doi=10.1007/s11831-017-9241-4|s2cid=126121268 |issn=1134-3060|hdl=10985/18435|hdl-access=free}}</ref>
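The non-intrusive, adaptive-sampling idea can be illustrated with a minimal one-parameter sketch (not the authors' implementation): the deterministic solver is treated as a black box, collocation points are added level by level at dyadic midpoints, and a new snapshot is kept only where the hierarchical surplus (solver output minus the current interpolant) exceeds a tolerance. The toy <code>solver</code> function and all names here are hypothetical.

```python
import numpy as np

def solver(mu):
    # Hypothetical black-box deterministic solver: returns the full
    # solution field (here a 50-point 1D field) for parameter value mu.
    x = np.linspace(0.0, 1.0, 50)
    return np.sin(np.pi * x) * np.exp(-mu * x)

def ssl_1d(solver, tol=1e-3, max_level=10):
    """Adaptive hierarchical collocation sketch on the parameter
    interval [0, 1]. The solver is sampled sparsely: a snapshot is
    stored only where the hierarchical surplus is still large."""
    nodes = [0.0, 1.0]                       # level-0 collocation points
    snapshots = [solver(m) for m in nodes]

    def interpolate(mu):
        # Piecewise-linear interpolant in the parameter, built from
        # the snapshots collected so far.
        order = np.argsort(nodes)
        mus = np.array(nodes)[order]
        snaps = np.array(snapshots)[order]
        i = np.clip(np.searchsorted(mus, mu), 1, len(mus) - 1)
        w = (mu - mus[i - 1]) / (mus[i] - mus[i - 1])
        return (1.0 - w) * snaps[i - 1] + w * snaps[i]

    for level in range(1, max_level + 1):
        # Candidate points of the next hierarchical level (dyadic midpoints).
        candidates = [(2 * k + 1) / 2**level for k in range(2**(level - 1))]
        refined = False
        for mu in candidates:
            snap = solver(mu)                        # one solver call
            surplus = snap - interpolate(mu)         # hierarchical surplus
            if np.max(np.abs(surplus)) > tol:
                nodes.append(mu)
                snapshots.append(snap)
                refined = True
        if not refined:   # stopping criterion: all surpluses below tol
            break
    return np.array(nodes), np.array(snapshots)
```

Because only solver outputs are used, the same loop applies unchanged to nonlinear or non-affine problems; a low-rank (e.g. SVD-based) compression of the snapshot matrix would then yield the reduced subspace.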
== References ==