Pruning by explaining: A novel criterion for deep neural network pruning
The success of convolutional neural networks (CNNs) in various applications is accompanied by a significant increase in computation and parameter storage costs. Recent efforts to reduce these overheads involve pruning and compressing the weights of various layers while aiming not to sacrifice performance.
Seul-Ki Yeom +2 more
exaly +5 more sources
Some of the following articles may not be open access.
Optimal pruning in neural networks
Physical Review E, 2000
We study pruning strategies in simple perceptrons subjected to supervised learning. Our analytical results, obtained through the statistical mechanics approach to learning theory, are independent of the learning algorithm used in the training process.
D. M. Barbato, O. Kinouchi
openaire +2 more sources
Growing and pruning neural tree networks
IEEE Transactions on Computers, 1993
A pattern classification method called neural tree networks (NTNs) is presented. The NTN consists of neural networks connected in a tree architecture. The neural networks are used to recursively partition the feature space into subregions. Each terminal subregion is assigned a class label which depends on the training data routed to it by the neural ...
Ananth Sankar, Richard J. Mammone
openaire +1 more source
Neural network pruning and hardware acceleration
2020 IEEE/ACM 13th International Conference on Utility and Cloud Computing (UCC), 2020
Neural network pruning is a critical technique to efficiently deploy neural network models on edge devices with limited computing resources. Although many neural network pruning methods have been published, it is difficult to implement such algorithms due to their inherent complexity.
Taehee Jeong +4 more
openaire +1 more source
Pruning in Recurrent Neural Networks
1994
Recurrent neural networks are attracting considerable interest within the neural network domain especially because of their potential in such problems as pattern completion and temporal sequence processing (Almeida, 1987; Hertz et al., 1991). As for feed-forward networks, in virtually all problems of interest the proper number of hidden units is not ...
G. Castellano +2 more
openaire +2 more sources
Pruned Neural Networks for Regression
2000
Neural networks have been widely used as a tool for regression. They are capable of approximating any function and they do not require any assumption about the distribution of the data. The most commonly used architectures for regression are the feedforward neural networks with one or more hidden layers.
Rudy Setiono, Wee Kheng Leow
openaire +1 more source
Variational Convolutional Neural Network Pruning
2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2019
We propose a variational Bayesian scheme for pruning convolutional neural networks at the channel level. This idea is motivated by the fact that deterministic value-based pruning methods are inherently improper and unstable. In a nutshell, a variational technique is introduced to estimate the distribution of a newly proposed parameter, called channel saliency ...
Chenglong Zhao +5 more
openaire +1 more source
Neural network pruning for function approximation
Proceedings of the IEEE-INNS-ENNS International Joint Conference on Neural Networks. IJCNN 2000. Neural Computing: New Challenges and Perspectives for the New Millennium, 2000
A simple algorithm for pruning feedforward neural networks with a single hidden layer trained for function approximation is presented. The algorithm assumes that the networks have been trained with more than the necessary number of hidden units and it consists of two stages. In the first stage redundant hidden units are removed, and in the second stage
Rudy Setiono, Adam E. Gaweda
openaire +1 more source

