Results 251 to 260 of about 79,498 (272)
Some of the following articles may not be open access.
Pruned and Structurally Sparse Neural Networks
2018 IEEE MIT Undergraduate Research Technology Conference (URTC), 2018
Advances in designing and training deep neural networks have led to the principle that the larger and deeper a network is, the better it can perform. As a result, computational resources have become a key limiting factor in achieving better performance.
Simon Alford +3 more
A new method to prune the neural network
Proceedings of the IEEE-INNS-ENNS International Joint Conference on Neural Networks. IJCNN 2000. Neural Computing: New Challenges and Perspectives for the New Millennium, 2000
Using the backpropagation algorithm (BP) to train neural networks is a widely adopted practice in both theory and practical applications. However, BP produces a distributed weight representation: the weight matrix of the final trained network is usually not sparse, which prohibits its use in discovering the network's inherent functional ...
Weishui Wan +3 more
Pruning versus clipping in neural networks
Physical Review A, 1989
The number of interconnections in a neural network is reduced by eliminating the "weakest" bonds. The performance is then improved by reapplying the learning algorithm.
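The prune-then-retrain scheme described in the entry above can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's method: the function name, the pruning fraction, and the use of a fixed binary mask are all assumptions made for the example.

```python
import numpy as np

def prune_weakest(W, fraction=0.25):
    """Zero out the smallest-magnitude fraction of weights (the "weakest" bonds).

    Returns the pruned weight matrix and a boolean mask of surviving
    connections; retraining would update only the masked-in weights.
    """
    k = int(W.size * fraction)
    if k == 0:
        return W.copy(), np.ones_like(W, dtype=bool)
    # k-th smallest absolute weight serves as the pruning threshold
    thresh = np.partition(np.abs(W).ravel(), k - 1)[k - 1]
    mask = np.abs(W) > thresh
    return W * mask, mask

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 4))
W_pruned, mask = prune_weakest(W, fraction=0.25)
# The learning algorithm would then be reapplied with the mask held
# fixed, so eliminated connections stay at zero during retraining.
```

The mask is the key design point: simply zeroing weights is not enough, since gradient updates would revive them; the retraining step must respect the eliminated connections.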
A LOCAL TRAINING AND PRUNING APPROACH FOR NEURAL NETWORKS
International Journal of Neural Systems, 2000
The training of neural networks using the extended Kalman filter (EKF) algorithm is plagued by the drawback of high computational complexity and storage requirement that may become prohibitive even for networks of moderate size. In this paper, we present a local EKF training and pruning approach that can solve this problem.
Sheng-Jiang Chang +3 more
Statistical method of pruning neural networks
IJCNN'99. International Joint Conference on Neural Networks. Proceedings (Cat. No.99CH36339), 2003
A statistical method of pruning multilayer perceptrons is proposed which is expected to involve less computation than does "optimal brain surgeon". The statistical method iteratively performs the steps of estimating the error covariance of the weights, evaluating the z-statistics for the weights, pruning the weights selected by hypothesis testing, and ...
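The hypothesis-testing step in the entry above can be sketched as follows. This is an illustrative assumption-laden sketch, not the paper's algorithm: the weight variances are taken as given (the paper estimates the error covariance iteratively during training), and the function name and significance level are invented for the example.

```python
import numpy as np

def z_statistic_prune(weights, weight_var, z_crit=1.959964):
    """Prune weights whose z-statistic fails the two-sided test H0: w = 0.

    weight_var: estimated variance of each weight, i.e. the diagonal of
    the weights' error covariance (assumed given here). z_crit defaults
    to the two-sided normal critical value for alpha = 0.05.
    """
    z = np.abs(weights) / np.sqrt(weight_var)
    keep = z > z_crit  # rejecting H0 means the weight is significantly nonzero
    return weights * keep, keep

w = np.array([0.8, 0.05, -1.2, 0.02])
var = np.full(4, 0.1)  # illustrative per-weight error variances
pruned, keep = z_statistic_prune(w, var)
# Only the two weights whose magnitude is large relative to its
# estimated standard error survive the test.
```

In the iterative scheme the abstract describes, these steps repeat: re-estimate the covariance after each pruning pass, recompute the z-statistics, and stop when no further weights fail the test.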
Pruning and quantization for deep neural network acceleration: A survey
Neurocomputing, 2021
John Glossner, Xiaotong Zhang
GenExp: Multi-objective pruning for deep neural network based on genetic algorithm
Neurocomputing, 2021
Dong Wang
Intermittent-Aware Neural Network Pruning
2023 60th ACM/IEEE Design Automation Conference (DAC), 2023
Chih-Chia Lin +4 more
DMPP: Differentiable multi-pruner and predictor for neural network pruning
Neural Networks, 2022
Derong Liu, Bo Zhao

