===Adam<ref name="Adam2014">{{cite arXiv |first1=Diederik |last1=Kingma |first2=Jimmy |last2=Ba |eprint=1412.6980 |title=Adam: A Method for Stochastic Optimization |year=2014 |class=cs.LG }}</ref> (short for Adaptive Moment Estimation) algorithm===
* First, we present the pseudocode for the Adam algorithm as follows:
This function implements the Adam algorithm for minimizing the target function <math>\mathcal{G}(\theta)</math>.
 '''Function:''' ADAM(<math>\alpha</math>, <math>\beta_1</math>, <math>\beta_2</math>, <math>\epsilon</math>, <math>\mathcal{G}(\theta)</math>, <math>\theta_0</math>) '''is'''
     <math>m_0 := 0</math> ''// Initialize the first moment vector''
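Only the opening lines of the pseudocode appear here; the full update rule from the Adam paper can be sketched in Python with NumPy as follows. The function name and the default hyperparameters (<math>\alpha</math>, <math>\beta_1</math>, <math>\beta_2</math>, <math>\epsilon</math>) are illustrative, using the values recommended by Kingma and Ba:

```python
import numpy as np

def adam(grad, theta0, alpha=0.001, beta1=0.9, beta2=0.999, eps=1e-8, steps=1000):
    """Minimize a function whose gradient is `grad`, starting from theta0."""
    theta = np.asarray(theta0, dtype=float)
    m = np.zeros_like(theta)  # first moment vector
    v = np.zeros_like(theta)  # second moment vector
    for t in range(1, steps + 1):
        g = grad(theta)
        m = beta1 * m + (1 - beta1) * g        # biased first moment estimate
        v = beta2 * v + (1 - beta2) * g ** 2   # biased second raw moment estimate
        m_hat = m / (1 - beta1 ** t)           # bias-corrected first moment
        v_hat = v / (1 - beta2 ** t)           # bias-corrected second moment
        theta = theta - alpha * m_hat / (np.sqrt(v_hat) + eps)
    return theta
```

For example, minimizing the quadratic <math>\mathcal{G}(\theta)=\theta^2</math> (gradient <math>2\theta</math>) drives <math>\theta</math> toward zero.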
===Backpropagation algorithm<ref name="DLhistory">{{cite arXiv |eprint=2212.11279 |class=cs.NE |first=Juergen |last=Schmidhuber |author-link=Juergen Schmidhuber |title=Annotated History of Modern AI and Deep Learning |date=2022}}</ref> for multilayer feedforward neural networks===
* With the Adam algorithm described above, we now present the pseudocode corresponding to a multilayer feedforward neural network:
This function implements the backpropagation algorithm for training a multi-layer feedforward neural network.
 '''Function:''' BackPropagation(''set'' <math>D=\left\{(\mathbf{x}_k,\mathbf{y}_k)\right\}_{k=1}^{m}</math>) '''is'''
     ''// Step 1: Random initialization''
     ''// Step 2: Optimization loop''
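The two steps listed above (random initialization, then an optimization loop) can be sketched for a one-hidden-layer sigmoid network as follows. This is a minimal illustration with plain gradient descent on a squared-error loss; the network size, learning rate, and loss are assumptions for the sketch, not the article's exact specification:

```python
import numpy as np

def train_backprop(X, Y, hidden=4, lr=0.5, epochs=2000, seed=0):
    """Train a one-hidden-layer sigmoid network on (X, Y) with backpropagation."""
    rng = np.random.default_rng(seed)
    n_in, n_out = X.shape[1], Y.shape[1]
    # Step 1: random initialization of weights and biases
    W1 = rng.normal(0.0, 0.5, (n_in, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0.0, 0.5, (hidden, n_out)); b2 = np.zeros(n_out)
    sig = lambda z: 1.0 / (1.0 + np.exp(-z))
    # Step 2: optimization loop
    for _ in range(epochs):
        H = sig(X @ W1 + b1)              # forward pass, hidden layer
        O = sig(H @ W2 + b2)              # forward pass, output layer
        dO = (O - Y) * O * (1 - O)        # output delta (squared-error loss)
        dH = (dO @ W2.T) * H * (1 - H)    # hidden delta, propagated backward
        W2 -= lr * H.T @ dO; b2 -= lr * dO.sum(axis=0)
        W1 -= lr * X.T @ dH; b1 -= lr * dH.sum(axis=0)
    return lambda Xq: sig(sig(Xq @ W1 + b1) @ W2 + b2)  # trained predictor
```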
===Numerical solution for optimal investment portfolio<ref name="Han2018" />===
 '''function''' OptimalInvestment(<math>W_{t_{i+1}} - W_{t_i}</math>, <math>x</math>, <math>\theta=(X_{0}, H_{0}, \theta_{1}, \theta_{2}, \dots, \theta_{N-1})</math>) '''is'''
     ''// This function calculates the optimal investment portfolio using the deep BSDE method''
     ''// Step 1: Initialization''
     '''for''' <math>k := 0</math> '''to''' maxstep '''do'''
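The structure of this training loop — simulate Brownian increments <math>W_{t_{i+1}}-W_{t_i}</math>, run the controlled process forward, and take a gradient step on <math>\theta=(X_0, \dots)</math> against the terminal condition — can be illustrated with a toy version in Python. As a simplifying assumption, the per-step subnetworks <math>\theta_k</math> are replaced by scalar controls <math>Z_k</math> so the gradients can be written analytically; this is a sketch of the loop's shape, not the article's full algorithm:

```python
import numpy as np

def deep_bsde_toy(g, T=1.0, N=5, batch=256, lr=0.1, steps=500, seed=0):
    """Toy deep-BSDE-style loop: fit X_0 and per-step controls Z_k by
    stochastic gradient descent so that X_T = X_0 + sum_k Z_k dW_k
    matches the terminal payoff g(W_T)."""
    rng = np.random.default_rng(seed)
    dt = T / N
    X0, Z = 0.5, np.zeros(N)                 # initial guesses for parameters
    for _ in range(steps):                   # optimization loop over k
        dW = rng.normal(0.0, np.sqrt(dt), (batch, N))  # Brownian increments
        XT = X0 + dW @ Z                     # forward simulation of X
        r = XT - g(dW.sum(axis=1))           # mismatch with terminal condition
        X0 -= lr * 2 * r.mean()                          # gradient step on X_0
        Z -= lr * 2 * (r[:, None] * dW).mean(axis=0)     # gradient step on Z_k
    return X0, Z
```

With the linear payoff <math>g(w)=w</math>, the exact solution is <math>X_0=0</math> and <math>Z_k=1</math> for every step, which the loop recovers.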