
Network Collaborative Pruning Method for Hyperspectral Image Classification Based on Evolutionary Multi-Task Optimization

open access: yesRemote Sensing, 2023
Neural network models for hyperspectral image classification are complex and therefore difficult to deploy directly onto mobile platforms. Neural network model compression methods can effectively optimize the storage space and inference time of the ...
Yu Lei   +5 more
doaj   +1 more source

Unsupervised Adaptive Weight Pruning for Energy-Efficient Neuromorphic Systems

open access: yesFrontiers in Neuroscience, 2020
To tackle real-world challenges, deep and complex neural networks are generally used with a massive number of parameters, which require large memory size, extensive computational operations, and high energy consumption in neuromorphic hardware systems ...
Wenzhe Guo   +7 more
doaj   +1 more source

Neuroinspired unsupervised learning and pruning with subquantum CBRAM arrays. [PDF]

open access: yes, 2018
Resistive RAM crossbar arrays offer an attractive solution to minimize off-chip data transfer and parallelize on-chip computations for neural networks.
Jameson, John R   +6 more
core   +3 more sources

Automatic Pruning for Quantized Neural Networks

open access: yesCoRR, 2020
Neural network quantization and pruning are two techniques commonly used to reduce the computational complexity and memory footprint of neural network models for deployment. However, most existing pruning strategies operate on full-precision models and cannot be directly applied to the discrete parameter distributions that result from quantization.
Luis Guerra   +3 more
openaire   +2 more sources
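The abstract's point about discrete parameter distributions can be illustrated with a small sketch: after uniform quantization, thousands of distinct weight magnitudes collapse onto a handful of levels, so a magnitude threshold can no longer rank weights within a level. The `quantize_uniform` helper below is a hypothetical illustration, not the paper's quantizer.

```python
import numpy as np

def quantize_uniform(w, n_levels=4):
    # Symmetric uniform quantizer: snap each weight to one of a few
    # evenly spaced levels (illustrative helper, not the paper's scheme).
    scale = np.abs(w).max() / (n_levels // 2)
    return np.round(w / scale) * scale

rng = np.random.default_rng(1)
w = rng.normal(size=1000)            # 1000 distinct full-precision magnitudes
wq = quantize_uniform(w, n_levels=4)
distinct = len(np.unique(np.abs(wq)))  # collapses to at most 3 magnitudes
```

With only a few distinct magnitudes left, any magnitude-based pruning rule degenerates into arbitrary tie-breaking inside each level, which is the gap the paper targets.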

Neural Network-Based Fixed-Complexity Precoder Selection for Multiple Antenna Systems

open access: yesIEEE Access, 2022
In this paper, we propose a neural network-based precoder selection method for multiple antenna systems that are equipped with maximum likelihood detectors.
Jaekwon Kim, Hyo-Sang Lim
doaj   +1 more source

Importance Estimation for Neural Network Pruning [PDF]

open access: yes2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2019
Structural pruning of neural network parameters reduces computation, energy, and memory transfer costs during inference. We propose a novel method that estimates the contribution of a neuron (filter) to the final loss and iteratively removes those with smaller scores.
Pavlo Molchanov   +4 more
openaire   +2 more sources
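The scoring-and-removal loop described in the abstract can be sketched with a first-order Taylor criterion, a common approximation where a filter's contribution to the loss is estimated as the accumulated |weight × gradient| over its parameters; the exact criterion and function names below are assumptions, not the paper's implementation.

```python
import numpy as np

def taylor_importance(weights, grads):
    # First-order Taylor score per filter (filters along axis 0):
    # accumulate |w * dL/dw| over each filter's parameters.
    return np.abs(weights * grads).reshape(weights.shape[0], -1).sum(axis=1)

def prune_filters(weights, grads, frac=0.25):
    # Zero out the lowest-scoring fraction of filters.
    scores = taylor_importance(weights, grads)
    k = int(len(scores) * frac)
    pruned = weights.copy()
    if k > 0:
        pruned[np.argsort(scores)[:k]] = 0.0
    return pruned
```

In the iterative setting the abstract describes, a small group of low-scoring filters would be removed after each scoring pass, with fine-tuning between passes.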

Neural Network Pruning by Gradient Descent

open access: yesCoRR, 2023
21 pages, 5 ...
Zhang Zhang, Ruyi Tao, Jiang Zhang
openaire   +2 more sources

Dissecting the Biological Motherboard (Systems Biology and Beyond) [PDF]

open access: yes, 2008
Genome-scale molecular networks, including gene pathways, gene regulatory networks and protein interactions, are central to the investigation of the nascent disciplines of systems biology and bio-complexity.
Abhay Krishna, Ajit Narayanan
core   +2 more sources

Renormalized Sparse Neural Network Pruning

open access: yesCoRR, 2022
Large neural networks are heavily over-parameterized because over-parameterization improves training toward optimality. Once the network is trained, however, many parameters can be zeroed, or pruned, leaving an equivalent sparse neural network. We propose renormalizing sparse neural networks in order to improve accuracy.
openaire   +2 more sources
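A minimal sketch of the prune-then-renormalize idea: magnitude-prune a weight matrix, then rescale the surviving weights so the layer's total L1 mass matches the dense layer's. The L1-preserving rescale is one plausible renormalization chosen for illustration; the paper's exact scheme may differ.

```python
import numpy as np

def prune_and_renormalize(w, sparsity=0.5):
    # Magnitude pruning: zero the smallest-|w| fraction of entries.
    flat = np.abs(w).ravel()
    k = int(flat.size * sparsity)
    if k == 0:
        return w.copy()
    thresh = np.partition(flat, k - 1)[k - 1]
    pruned = np.where(np.abs(w) > thresh, w, 0.0)
    # Renormalize: rescale survivors so the sparse layer keeps the
    # dense layer's total L1 norm (illustrative choice of norm).
    dense_norm = np.abs(w).sum()
    sparse_norm = np.abs(pruned).sum()
    return pruned * (dense_norm / sparse_norm) if sparse_norm > 0 else pruned
```

Rescaling compensates for the activation magnitude lost when small weights are zeroed, which is the kind of post-pruning correction the entry's title refers to.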

Neural Networks at a Fraction with Pruned Quaternions

open access: yesProceedings of the 6th Joint International Conference on Data Science & Management of Data (10th ACM IKDD CODS and 28th COMAD), 2023
Contemporary state-of-the-art neural networks have increasingly large numbers of parameters, which prevents their deployment on devices with limited computational power. Pruning is one technique to remove unnecessary weights and reduce resource requirements for training and inference. In addition, for ML tasks where the input data is multi-dimensional,
Sahel Mohammad Iqbal, Subhankar Mishra
openaire   +2 more sources
