{{Short description|Trimming artificial neural networks to reduce computational overhead}}
{{other uses|Pruning (disambiguation)}}
In deep learning, '''pruning''' is the practice of removing parameters from an existing [[artificial neural network]], with the goal of reducing the network's computational overhead while maintaining its accuracy.
== Node (neuron) pruning ==
A basic algorithm for pruning is as follows:<ref>Molchanov, P., Tyree, S., Karras, T., Aila, T., & Kautz, J. (2016). ''Pruning convolutional neural networks for resource efficient inference''. arXiv preprint arXiv:1611.06440.</ref>
#Evaluate the importance of each neuron.
#Rank the neurons according to their importance (assuming there is a clearly defined measure for "importance").
#Remove the least important neuron.
#Check a termination condition (to be determined by the user) to see whether to continue pruning.
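The four steps above can be sketched in a few lines. In this illustrative example a hidden layer is represented by its incoming and outgoing weight matrices, and a neuron's "importance" is taken to be the L2 norm of its outgoing weights; this measure, like the termination condition, is an assumption chosen for the sketch, not part of the algorithm's definition.

```python
import numpy as np

def prune_neurons(w_in, w_out, keep):
    """Iteratively remove hidden neurons until only `keep` remain.

    w_in:  (hidden, inputs)  -- weights into each hidden neuron
    w_out: (outputs, hidden) -- weights out of each hidden neuron
    """
    w_in, w_out = w_in.copy(), w_out.copy()
    while w_in.shape[0] > keep:                  # 4. termination condition
        scores = np.linalg.norm(w_out, axis=0)   # 1. evaluate importance
        victim = int(np.argmin(scores))          # 2. rank; pick least important
        w_in = np.delete(w_in, victim, axis=0)   # 3. remove that neuron's rows
        w_out = np.delete(w_out, victim, axis=1)
    return w_in, w_out
```

In practice the importance measure often also uses activation statistics or the effect of removal on the loss, and the network is usually fine-tuned between pruning steps.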
== Edge (weight) pruning ==
Most work on neural network pruning focuses on removing weights, namely, setting their values to zero. Early work suggested to also change the values of non-pruned weights.<ref>{{Cite journal |last=Chechik |first=Gal |last2=Meilijson |first2=Isaac |last3=Ruppin |first3=Eytan |date=April 2001 |title=Effective Neuronal Learning with Ineffective Hebbian Learning Rules |url=https://ieeexplore.ieee.org/abstract/document/6789989 |journal=Neural Computation |volume=13 |issue=4 |pages=817–840 |doi=10.1162/089976601300014367 |issn=0899-7667}}</ref>
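A common concrete instance of weight pruning is magnitude pruning: the fraction of weights with the smallest absolute values is set to zero. The sketch below assumes this magnitude criterion; other criteria (e.g. gradient- or loss-based scores) are used in the literature as well.

```python
import numpy as np

def prune_weights(w, sparsity):
    """Zero out the smallest-magnitude fraction `sparsity` of weights in `w`."""
    k = int(sparsity * w.size)          # number of weights to prune
    if k == 0:
        return w.copy()
    # k-th smallest absolute value becomes the pruning threshold
    threshold = np.partition(np.abs(w).ravel(), k - 1)[k - 1]
    return np.where(np.abs(w) <= threshold, 0.0, w)
```

The result keeps the matrix shape unchanged, so downstream layers need no modification; the gain comes from sparse storage and sparse arithmetic, and pruned networks are typically fine-tuned afterwards to recover accuracy.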
== See also ==
* [[Knowledge distillation]]
* [[Neural Darwinism]]
== References ==
{{Reflist}}