== Benchmarking ==
In 2022 the results of a large benchmarking competition known as [[SRBench]] were announced at the [[Genetic and Evolutionary Computation Conference|GECCO conference]] in Boston, MA. The competition pitted nine leading symbolic regression algorithms against each other on a large set of data problems and evaluation criteria.<ref name="srbench2022" />
=== Synthetic Track ===
=== End-user software ===
* [[QLattice]] is a quantum-inspired simulation and machine learning technology that searches a space of potential mathematical models for one that fits the problem.<ref>{{Cite web|url=https://docs.abzu.ai|title=Feyn is a Python module for running the QLattice|date=June 22, 2022}}</ref><ref name="srfeyn" />
* [[Deep Symbolic Optimization]] is a deep learning framework for symbolic optimization tasks.<ref>{{Cite web|url=https://github.com/brendenpetersen/deep-symbolic-optimization|title=Deep symbolic optimization|date=June 22, 2022}}</ref>
* [[dCGP]], differentiable Cartesian Genetic Programming in Python (free, open source)<ref>{{Cite web|url=https://darioizzo.github.io/dcgp/|title=Differentiable Cartesian Genetic Programming, v1.6 Documentation|date=June 10, 2022}}</ref><ref>{{Cite journal|title=Differentiable genetic programming|first1=Dario|last1=Izzo|first2=Francesco|last2=Biscani|first3=Alessio|last3=Mereta|journal=Proceedings of the European Conference on Genetic Programming|year=2016 |arxiv=1611.04766 }}</ref>
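The packages above differ in method, but the core idea they share, searching a space of candidate expressions for one that fits the data, can be sketched in a few lines of Python. The toy version below uses pure random search over small expression trees (real systems use genetic programming or gradient-based optimization); every name in it is illustrative and belongs to none of the listed libraries.

```python
import random

# Operator table: name -> (arity, function). Illustrative, minimal set.
OPS = {
    "add": (2, lambda a, b: a + b),
    "sub": (2, lambda a, b: a - b),
    "mul": (2, lambda a, b: a * b),
}
TERMINALS = ["x", 1.0, 2.0]  # the variable and a few constants

def random_expr(depth=2):
    """Build a random expression tree of limited depth."""
    if depth == 0 or random.random() < 0.3:
        return random.choice(TERMINALS)
    op = random.choice(list(OPS))
    arity = OPS[op][0]
    return (op, [random_expr(depth - 1) for _ in range(arity)])

def evaluate(expr, x):
    """Recursively evaluate an expression tree at point x."""
    if expr == "x":
        return x
    if isinstance(expr, float):
        return expr
    op, args = expr
    return OPS[op][1](*(evaluate(a, x) for a in args))

def mse(expr, data):
    """Mean squared error of the expression against (x, y) samples."""
    return sum((evaluate(expr, x) - y) ** 2 for x, y in data) / len(data)

# Target law to rediscover: y = x^2 + 1, sampled on a few points.
data = [(x, x * x + 1.0) for x in range(-3, 4)]

random.seed(0)
best = min((random_expr(3) for _ in range(5000)), key=lambda e: mse(e, data))
print(best, mse(best, data))
```

Random search scales poorly with expression size, which is why the systems listed above replace it with evolutionary operators (crossover, mutation) or make the search differentiable, but the fitness-driven search over symbolic expressions is the same in spirit.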
== References ==
{{reflist|refs=
<ref name="bayesian">{{cite arXiv
| title = Bayesian Symbolic Regression
| author1 = Ying Jin
| class = stat.ME
| eprint = 1910.08892
}}</ref><ref name="schmidt2009distilling">{{cite journal
| title = Distilling free-form natural laws from experimental data
| author1 = Michael Schmidt
| citeseerx = 10.1.1.308.2245
| s2cid = 7366016
}}</ref><ref name="aifeynman">{{cite journal
| title = AI Feynman: A physics-inspired method for symbolic regression
| author1 = Silviu-Marian Udrescu
| pmc = 7159912
| bibcode = 2020SciA....6.2631U
}}</ref><ref name="ufo">{{cite journal
| title = Universal Functions Originator
| author1 = Ali R. Al-Roomi
| doi = 10.1016/j.asoc.2020.106417
| s2cid = 219743405
}}</ref><ref name="vladislavleva2009">{{cite journal
| title = Order of nonlinearity as a complexity measure for models generated by symbolic regression via pareto genetic programming
| author1 = Ekaterina J. Vladislavleva
| doi=10.1109/tevc.2008.926486
| s2cid = 12072764
}}</ref><ref name="irwin2021">{{cite journal
| title = Exact Learning
| author = Benedict W. J. Irwin
| doi = 10.21203/rs.3.rs-149856/v1
| s2cid = 234014141
}}</ref><ref name="srbench2022">{{cite web
|title = SRBench Competition 2022: Interpretable Symbolic Regression for Data Science
|author1 = Michael Kommenda
|author5 = Marco Virgolin
|url = https://cavalab.org/srbench/competition-2022/
}}</ref><ref name="srfeyn">{{cite arXiv
| author1 = Kevin René Broløs
| author2 = Meera Vieira Machado
| class = cs.LG
| eprint = 2104.05417
}}</ref>
}}
== Further reading ==