Sharpness aware minimization
'''Sharpness Aware Minimization''' ('''SAM''') is an [[optimization algorithm]] used in [[machine learning]] that aims to improve model [[generalization (machine learning)|generalization]]. The method seeks to find model parameters that are located in regions of the loss landscape with uniformly low loss values, rather than parameters that only achieve a minimal loss value at a single point. This approach is described as finding "flat" minima instead of "sharp" ones. The rationale is that models trained this way are less sensitive to variations between training and test [[data set|data]], which can lead to better performance on unseen data.<ref name="Foret2021">{{cite conference |last1=Foret |first1=Pierre |last2=Kleiner |first2=Ariel |last3=Mobahi |first3=Hossein |last4=Neyshabur |first4=Behnam |title=Sharpness-Aware Minimization for Efficiently Improving Generalization |book-title=International Conference on Learning Representations (ICLR) 2021 |year=2021 |arxiv=2010.01412 |url=https://openreview.net/forum?id=6Tm1m_rRrwY}}</ref>
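The two-step update described by Foret et al. can be sketched as follows. This is a minimal illustrative NumPy sketch, not an implementation from any library: the function name `sam_step` and the toy quadratic loss are invented for this example, and the hyperparameter values (`lr`, `rho`) are placeholders.

```python
import numpy as np

def sam_step(w, grad_fn, lr=0.1, rho=0.05):
    """One illustrative SAM update (after Foret et al., 2021):
    first perturb the weights toward higher loss within a ball of
    radius rho, then apply the gradient taken at that perturbed
    point to the original weights."""
    g = grad_fn(w)
    # Ascent step: approximate the worst-case nearby point w + eps,
    # where eps points along the gradient with norm rho.
    eps = rho * g / (np.linalg.norm(g) + 1e-12)
    # Descent step: gradient evaluated at the perturbed point,
    # applied to the unperturbed weights.
    g_sharp = grad_fn(w + eps)
    return w - lr * g_sharp

# Toy example: loss L(w) = 0.5 * ||w||^2, whose gradient is w.
w = np.array([1.0, -2.0])
for _ in range(50):
    w = sam_step(w, lambda v: v)
print(np.linalg.norm(w))  # the weight norm shrinks toward zero
```

In practice the two gradient evaluations per step roughly double the cost of training relative to plain gradient descent, which is the trade-off the original paper discusses.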