Meta-learning (computer science): Difference between revisions

 
====LSTM Meta-Learner====
The LSTM-based meta-learner learns the exact [[optimization algorithm]] used to train another learner [[neural network]] [[classification rule|classifier]] in the few-shot regime. The parametrization allows it to learn appropriate parameter updates specifically for the scenario where a set number of updates will be made, while also learning a general initialization of the learner (classifier) network that allows for quick [[convergence]] of training.<ref name="paper8">Sachin Ravi and Hugo Larochelle (2017). "Optimization as a model for few-shot learning". ICLR 2017. [https://openreview.net/pdf?id=rJY0-Kcll] Retrieved 3 November 2019.</ref>
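The core idea can be sketched as follows. In Ravi and Larochelle's formulation, the LSTM cell state plays the role of the learner's parameters, and the LSTM update generalizes gradient descent: with forget gate fixed at 1 and input gate fixed at a constant learning rate, the update reduces to ordinary SGD. The gate parameterization, feature choices, and weights below are illustrative placeholders, not the parametrization used in the paper:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def meta_lstm_update(theta, grad, loss, W_f, W_i):
    """One meta-learner step on a scalar parameter theta.

    The cell-state update c_t = f_t * c_{t-1} + i_t * (-grad) generalizes
    gradient descent: f_t = 1 and i_t = alpha recover plain SGD.
    W_f and W_i are meta-parameters that would be trained across tasks;
    here they are random placeholders.
    """
    features = np.array([grad, loss, theta])
    f = sigmoid(W_f @ features)  # forget gate: how much of theta to keep
    i = sigmoid(W_i @ features)  # input gate: an adaptive learning rate
    return f * theta - i * grad

# Toy learner objective: minimize (theta - 3)^2 over a fixed, small number
# of updates, mirroring the few-shot "set number of updates" regime.
rng = np.random.default_rng(0)
W_f, W_i = rng.normal(size=3), rng.normal(size=3)
theta = 0.0
for _ in range(5):
    grad = 2.0 * (theta - 3.0)
    loss = (theta - 3.0) ** 2
    theta = meta_lstm_update(theta, grad, loss, W_f, W_i)
```

In the actual method, `W_f` and `W_i` are themselves optimized by backpropagating the learner's final few-shot loss through the sequence of updates, so the meta-learner simultaneously learns an update rule and a good initialization.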
 
====Temporal Discreteness====