Multi-objective optimization: Difference between revisions

 
=== Smooth Chebyshev (Tchebycheff) scalarization ===
The '''smooth Chebyshev scalarization''' (also called smooth Tchebycheff scalarization, STCH)<ref name="Lin2024">Lin, X.; Zhang, X.; Yang, Z.; Liu, F.; Wang, Z.; Zhang, Q. (2024). "Smooth Tchebycheff Scalarization for Multi-Objective Optimization". ''arXiv preprint'' [[arXiv:2402.19078]].</ref> replaces the non-differentiable max operator of the classical Chebyshev scalarization with a smooth logarithmic soft-max (log-sum-exp), making standard gradient-based optimization applicable. Unlike linear scalarization, Chebyshev-type scalarization can reach Pareto-optimal solutions on both convex and concave parts of the Pareto front by varying the weight vector, and the smooth variant retains this property while providing well-defined gradients.
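As an illustration of the smoothing idea, the following sketch replaces the max in the weighted Chebyshev value <math>\max_i w_i (f_i(x) - z_i^*)</math> with a log-sum-exp term scaled by a smoothing parameter <math>\mu</math>; as <math>\mu \to 0</math> the smooth value approaches the classical one. The function name, signature, and default <code>mu</code> are illustrative, not taken from the cited paper.

```python
import numpy as np

def smooth_chebyshev(f_vals, weights, ideal, mu=0.1):
    """Smooth Tchebycheff scalarization via log-sum-exp.

    Replaces max_i w_i * (f_i - z_i) with
        mu * log( sum_i exp( w_i * (f_i - z_i) / mu ) ),
    which upper-bounds the max and converges to it as mu -> 0,
    while remaining differentiable everywhere.
    """
    terms = np.asarray(weights) * (np.asarray(f_vals) - np.asarray(ideal))
    # Shift by the largest term for numerical stability
    # (the standard log-sum-exp trick).
    m = terms.max()
    return m + mu * np.log(np.exp((terms - m) / mu).sum())
```

Because the smooth value is an upper bound on the classical Chebyshev value and the gap shrinks with <code>mu</code>, a small <code>mu</code> trades a slightly biased objective for usable gradients.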
 
;Definition