Neural architecture search
== Evolution ==
Several groups employed [[evolutionary algorithm]]s for NAS.<ref>{{cite arXiv|last1=Real|first1=Esteban|last2=Moore|first2=Sherry|last3=Selle|first3=Andrew|last4=Saxena|first4=Saurabh|last5=Suematsu|first5=Yutaka Leon|last6=Tan|first6=Jie|last7=Le|first7=Quoc|last8=Kurakin|first8=Alex|date=2017-03-03|title=Large-Scale Evolution of Image Classifiers|eprint=1703.01041|class=cs.NE}}</ref><ref name="Real 2018">{{cite arXiv|last1=Real|first1=Esteban|last2=Aggarwal|first2=Alok|last3=Huang|first3=Yanping|last4=Le|first4=Quoc V.|date=2018-02-05|title=Regularized Evolution for Image Classifier Architecture Search|eprint=1802.01548|class=cs.NE}}</ref><ref>Stanley, Kenneth; Miikkulainen, Risto, "[http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.28.5457&rep=rep1&type=pdf Evolving Neural Networks through Augmenting Topologies]", in: Evolutionary Computation, 2002</ref> An evolutionary algorithm for NAS generally performs the following procedure.<ref name="liu2021survey">{{cite arXiv|last1=Liu|first1=Yuqiao|last2=Sun|first2=Yanan|last3=Xue|first3=Bing|last4=Zhang|first4=Mengjie|last5=Yen|first5=Gary G|last6=Tan|first6=Kay Chen|date=2020-08-25|title=A Survey on Evolutionary Neural Architecture Search|eprint=2008.10937|class=cs.NE}}</ref> First, a pool of different candidate architectures is initialised along with their validation scores (fitness). At each step, the architectures in the candidate pool are mutated (e.g., replacing a 5×5 convolution with a 3×3 convolution). Next, the new architectures are trained from scratch for a few epochs and their validation scores are obtained. The lowest-scoring architectures in the candidate pool are then replaced by the better new architectures. By repeating this procedure many times, the candidate pool is refined over time.
Mutations in the context of evolving ANNs are operations such as adding a layer, removing a layer or changing the type of a layer (e.g., from convolution to pooling). On [[CIFAR-10]], evolution and RL performed comparably, while both outperformed [[random search]].<ref name="Real 2018" />
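The loop above can be sketched in a few lines of Python. This is a minimal illustration only: the search space, the mutation operator, and in particular the `fitness` function (a cheap stand-in for actually training a network and measuring validation accuracy) are simplifying assumptions, not any published system.

```python
import random

random.seed(0)  # for reproducibility of this sketch

# Toy search space: an architecture is a fixed-depth list of layer choices.
OPS = ["conv3x3", "conv5x5", "maxpool", "identity"]

def random_architecture(depth=4):
    return [random.choice(OPS) for _ in range(depth)]

def fitness(arch):
    # Stand-in for "train briefly and measure validation accuracy":
    # here we simply reward 3x3 convolutions so the loop has a signal.
    return sum(op == "conv3x3" for op in arch) / len(arch)

def mutate(arch):
    # Change the type of one layer, e.g. a 5x5 convolution into a 3x3.
    child = list(arch)
    child[random.randrange(len(child))] = random.choice(OPS)
    return child

def evolve(pool_size=10, steps=100):
    # Initialise a pool of candidates together with their fitness scores.
    pool = [(a, fitness(a))
            for a in (random_architecture() for _ in range(pool_size))]
    for _ in range(steps):
        # Pick a parent by tournament selection and mutate it.
        parent, _ = max(random.sample(pool, 3), key=lambda p: p[1])
        child = mutate(parent)
        pool.append((child, fitness(child)))
        # Replace the lowest-scoring architecture in the pool.
        pool.remove(min(pool, key=lambda p: p[1]))
    return max(pool, key=lambda p: p[1])

best, score = evolve()
```

In a real system the `fitness` call dominates the cost, since each child must be trained for a few epochs before its validation score is known.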
 
== Bayesian Optimization ==
Bayesian optimization, which has proven to be an efficient method for hyperparameter optimization, can also be applied to NAS. In this context, the objective function maps an architecture to its validation error after training. At each iteration, BO fits a surrogate model of this objective function based on the previously evaluated architectures and their validation errors. The next architecture to evaluate is then chosen by maximizing an acquisition function, such as expected improvement, which balances exploration and exploitation. For NAS, both acquisition function maximization and objective function evaluation are often computationally expensive, which makes the application of BO challenging in this context. Recently, BANANAS<ref>{{Cite journal|last=White|first=Colin|last2=Neiswanger|first2=Willie|last3=Savani|first3=Yash|date=2020-11-02|title=BANANAS: Bayesian Optimization with Neural Architectures for Neural Architecture Search|url=http://arxiv.org/abs/1910.11858|journal=arXiv:1910.11858 [cs, stat]}}</ref> has achieved promising results in this direction by introducing a high-performing instantiation of BO coupled with a neural predictor.
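The BO loop described above can be sketched as follows. Everything here is an illustrative assumption rather than any published method: `validation_error` stands in for training a network, and the similarity-weighted surrogate is a toy replacement for a Gaussian process or neural predictor; only the expected-improvement formula is standard.

```python
import math
import random

random.seed(1)  # for reproducibility of this sketch

OPS = ["conv3x3", "conv5x5", "maxpool"]

def random_arch(depth=4):
    return tuple(random.choice(OPS) for _ in range(depth))

def validation_error(arch):
    # Stand-in for "train the architecture and measure validation error".
    return sum(op != "conv3x3" for op in arch) / len(arch) + random.gauss(0, 0.01)

def surrogate(arch, observations):
    # Toy surrogate: predict the error of `arch` as a similarity-weighted
    # average of observed errors; use the weighted spread as uncertainty.
    weights, errors = [], []
    for a, err in observations:
        sim = sum(x == y for x, y in zip(arch, a)) / len(arch)
        weights.append(sim + 1e-6)
        errors.append(err)
    total = sum(weights)
    mu = sum(w * e for w, e in zip(weights, errors)) / total
    var = sum(w * (e - mu) ** 2 for w, e in zip(weights, errors)) / total
    return mu, math.sqrt(var) + 1e-6

def expected_improvement(mu, sigma, best_err):
    # EI for minimisation: trades off exploitation (low predicted error)
    # against exploration (high predicted uncertainty).
    z = (best_err - mu) / sigma
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    pdf = math.exp(-z * z / 2.0) / math.sqrt(2.0 * math.pi)
    return (best_err - mu) * cdf + sigma * pdf

def bo_nas(iterations=20, candidates_per_step=50):
    # Seed the surrogate with a few randomly chosen, evaluated architectures.
    observations = [(a, validation_error(a)) for a in (random_arch() for _ in range(5))]
    for _ in range(iterations):
        best_err = min(err for _, err in observations)
        # Maximise the acquisition function over a random candidate set.
        cands = [random_arch() for _ in range(candidates_per_step)]
        nxt = max(cands, key=lambda a: expected_improvement(*surrogate(a, observations), best_err))
        observations.append((nxt, validation_error(nxt)))
    return min(observations, key=lambda o: o[1])

best_arch, best_err = bo_nas()
```

Note that maximizing the acquisition function here is done by brute force over a random candidate set; over a real, combinatorially large architecture space, this step is itself expensive, which is one of the challenges mentioned above.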
 
==Hill-climbing==