Pruning (artificial neural network)

#Remove the least important neuron.
#Check a termination condition (to be determined by the user) to see whether to continue pruning.
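The steps above can be sketched in code. The following is a minimal NumPy illustration of one-neuron-at-a-time magnitude pruning for a single hidden layer; the importance measure (sum of absolute outgoing weights) and the termination condition (a minimum hidden-layer size) are illustrative choices, since the article leaves both to the user.

```python
import numpy as np

def neuron_importance(W_out):
    # One common heuristic: a hidden neuron's importance is the
    # sum of the absolute values of its outgoing weights.
    return np.abs(W_out).sum(axis=1)

def prune_step(W_in, W_out):
    # Remove the least important hidden neuron by deleting its
    # incoming column (in W_in) and outgoing row (in W_out).
    idx = np.argmin(neuron_importance(W_out))
    return np.delete(W_in, idx, axis=1), np.delete(W_out, idx, axis=0)

def prune(W_in, W_out, min_neurons=4):
    # Repeat until a user-chosen termination condition holds
    # (here: stop once the hidden layer reaches min_neurons).
    while W_out.shape[0] > min_neurons:
        W_in, W_out = prune_step(W_in, W_out)
    return W_in, W_out
```

Here `W_in` has shape `(inputs, hidden)` and `W_out` has shape `(hidden, outputs)`, so each pruning step shrinks the hidden dimension of both matrices by one.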
 
Recently, a highly pruned three-layer tree architecture achieved a success rate similar to that of LeNet-5 on the CIFAR-10 dataset with lower computational complexity.<ref>{{Cite journal |last1=Meir |first1=Yuval |last2=Ben-Noam |first2=Itamar |last3=Tzach |first3=Yarden |last4=Hodassman |first4=Shiri |last5=Kanter |first5=Ido |date=2023-01-30 |title=Learning on tree architectures outperforms a convolutional feedforward network |journal=Scientific Reports |language=en |volume=13 |issue=1 |pages=962 |doi=10.1038/s41598-023-27986-6 |issn=2045-2322 |pmc=9886946 |pmid=36717568}}</ref>
 
== Edge (weight) pruning ==