{{Orphan|date=June 2020}}
In the context of [[artificial neural network]]s, '''pruning''' is the practice of removing [[parameter]]s from an existing network, either individually or in groups such as by [[artificial neurons|neurons]].<ref>{{cite arXiv|last1=Blalock|first1=Davis|last2=Ortiz|first2=Jose Javier Gonzalez|last3=Frankle|first3=Jonathan|last4=Guttag|first4=John|date=2020-03-06|title=What is the State of Neural Network Pruning?|class=cs.LG|eprint=2003.03033}}</ref> The goal of this process is to maintain the accuracy of the network while increasing its [[efficiency]], reducing the [[computational resource]]s required to run it. An analogous biological process, [[synaptic pruning]], takes place in the brain of mammals during development.
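Unstructured (weight-level) pruning is commonly done by zeroing the weights of smallest magnitude. The following is a minimal sketch of that idea using NumPy; the function name and the use of a fixed threshold are illustrative, and practical frameworks instead apply pruning masks during or after training.

```python
import numpy as np

def prune_by_magnitude(weights, fraction):
    """Zero out the given fraction of weights with the smallest absolute value.

    A minimal sketch of unstructured magnitude pruning; ties at the
    threshold may prune slightly more than the requested fraction.
    """
    flat = np.abs(weights).ravel()
    k = int(len(flat) * fraction)
    if k == 0:
        return weights.copy()
    threshold = np.partition(flat, k - 1)[k - 1]  # k-th smallest magnitude
    mask = np.abs(weights) > threshold            # keep only larger weights
    return weights * mask

# Example: prune half of the entries of a small weight matrix
w = np.array([[0.1, -0.8], [0.05, 1.2]])
pruned = prune_by_magnitude(w, 0.5)
```

Node (neuron) pruning, by contrast, removes whole rows or columns of such a matrix at once, shrinking the network's actual dimensions rather than merely sparsifying it.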
== Node (neuron) pruning ==