TY - GEN
T1 - A Neural Network pruning approach based on Compressive Sampling
AU - Yang, Jie
AU - Bouzerdoum, Abdesselam
AU - Phung, Son Lam
PY - 2009
Y1 - 2009
N2 - The balance between computational complexity and architecture is a bottleneck in the development of Neural Networks (NNs). An architecture that is too large or too small strongly influences performance in terms of generalization and computational cost. In the past, saliency analysis has been employed to determine the most suitable structure; however, it is time-consuming and its performance is not robust. In this paper, a family of new algorithms for pruning elements (weights and hidden neurons) in Neural Networks is presented based on Compressive Sampling (CS) theory. The proposed framework makes it possible to locate the significant elements, and hence find a sparse structure, without computing their saliency. Experimental results are presented which demonstrate the effectiveness of the proposed approach.
AB - The balance between computational complexity and architecture is a bottleneck in the development of Neural Networks (NNs). An architecture that is too large or too small strongly influences performance in terms of generalization and computational cost. In the past, saliency analysis has been employed to determine the most suitable structure; however, it is time-consuming and its performance is not robust. In this paper, a family of new algorithms for pruning elements (weights and hidden neurons) in Neural Networks is presented based on Compressive Sampling (CS) theory. The proposed framework makes it possible to locate the significant elements, and hence find a sparse structure, without computing their saliency. Experimental results are presented which demonstrate the effectiveness of the proposed approach.
UR - https://www.scopus.com/pages/publications/70449448556
U2 - 10.1109/IJCNN.2009.5179045
DO - 10.1109/IJCNN.2009.5179045
M3 - Conference contribution
AN - SCOPUS:70449448556
SN - 9781424435531
T3 - Proceedings of the International Joint Conference on Neural Networks
SP - 3428
EP - 3435
BT - 2009 International Joint Conference on Neural Networks, IJCNN 2009
T2 - 2009 International Joint Conference on Neural Networks, IJCNN 2009
Y2 - 14 June 2009 through 19 June 2009
ER -