Results 51 to 60 of about 6,914,944 (265)
Efficient Digital Predistortion Using Sparse Neural Network
This paper proposes an efficient neural-network-based digital predistortion (DPD), named the envelope time-delay neural network (ETDNN) DPD. The method complies with the physical characteristics of radio-frequency (RF) power amplifiers (PAs) and uses a ...
Masaaki Tanio +2 more
doaj +1 more source
Neural network models for hyperspectral images classification are complex and therefore difficult to deploy directly onto mobile platforms. Neural network model compression methods can effectively optimize the storage space and inference time of the ...
Yu Lei +5 more
doaj +1 more source
When approaching a novel visual recognition problem in a specialized image domain, a common strategy is to start with a pre-trained deep neural network and fine-tune it to the specialized domain.
Mori, Greg +2 more
core +1 more source
The enormous inference cost of deep neural networks can be mitigated by network compression. Pruning connections is one of the predominant approaches used for network compression.
Sai Aparna Aketi +3 more
doaj +1 more source
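The entry above concerns pruning connections for network compression. As general context (this is an illustrative sketch of the common magnitude-based baseline, not necessarily the pruning criterion used in that paper), unstructured connection pruning can be expressed as zeroing the smallest-magnitude fraction of a weight matrix:

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude fraction of weights (unstructured pruning).

    Returns the pruned weights and the boolean keep-mask.
    """
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)          # number of connections to remove
    if k == 0:
        return weights.copy(), np.ones_like(weights, dtype=bool)
    threshold = np.partition(flat, k - 1)[k - 1]   # k-th smallest magnitude
    mask = np.abs(weights) > threshold             # keep strictly larger weights
    return weights * mask, mask

rng = np.random.default_rng(0)
w = rng.normal(size=(8, 8))
pruned, mask = magnitude_prune(w, 0.5)   # remove half of the 64 connections
```

In practice the mask is applied during or after training and the surviving weights are usually fine-tuned to recover accuracy.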
Activation-Based Pruning of Neural Networks
We present a novel technique for pruning called activation-based pruning to effectively prune fully connected feedforward neural networks for multi-object classification. Our technique is based on the number of times each neuron is activated during model ...
Tushar Ganguli, Edwin K. P. Chong
doaj +1 more source
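The snippet above describes ranking neurons by how often they are activated. A minimal sketch of that idea (assumed details: ReLU activations recorded over a batch, with the least-frequently firing neurons dropped; the paper's actual procedure may differ) looks like:

```python
import numpy as np

def prune_by_activation(acts, frac):
    """acts: (n_samples, n_neurons) post-ReLU activations.

    Count how often each neuron fires (activation > 0) across the samples
    and return the sorted indices of neurons to keep, dropping the
    least-activated `frac` of them.
    """
    counts = (acts > 0).sum(axis=0)        # activation count per neuron
    n_drop = int(frac * acts.shape[1])
    order = np.argsort(counts)             # least-activated neurons first
    return np.sort(order[n_drop:])         # indices of surviving neurons

# Toy batch: neuron 0 never fires, neuron 3 fires on every sample.
acts = np.array([[0.0, 1.2, 0.0, 3.1],
                 [0.0, 0.5, 2.0, 0.9],
                 [0.0, 0.0, 0.0, 1.1]])
keep = prune_by_activation(acts, 0.25)     # drop 1 of 4 neurons
```

Removing whole neurons in this way is structured pruning: the corresponding rows/columns of adjacent weight matrices can be deleted outright, shrinking the model rather than merely sparsifying it.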
Providing clear pruning threshold: A novel CNN pruning method via L0 regularisation
Network pruning is a significant way to improve the practicability of convolution neural networks (CNNs) by removing the redundant structure of the network model. However, in most of the existing network pruning methods L1 or L2 regularisation is applied ...
Guo Li, Gang Xu
doaj +1 more source
Cluster-Based Structural Redundancy Identification for Neural Network Compression
The increasingly large structure of neural networks makes it difficult to deploy on edge devices with limited computing resources. Network pruning has become one of the most successful model compression methods in recent years.
Tingting Wu +3 more
doaj +1 more source
Adaptive Neural Network Structure Optimization Algorithm Based on Dynamic Nodes
Large-scale artificial neural networks contain many redundant structures, which can trap the network in local optima and extend training time.
Miao Wang +7 more
doaj +1 more source
Automated Pruning for Deep Neural Network Compression
In this work we present a method to improve the pruning step of the current state-of-the-art methodology to compress neural networks. The novelty of the proposed pruning technique is in its differentiability, which allows pruning to be performed during ...
Bianco, Simone +4 more
core +1 more source
Neural network pruning offers great prospects for facilitating the deployment of deep neural networks on devices with limited computational resources.
Hanjing Cheng +5 more
doaj +1 more source