
Efficient Digital Predistortion Using Sparse Neural Network

open access: yes, IEEE Access, 2020
This paper proposes an efficient neural-network-based digital predistortion (DPD) method, named the envelope time-delay neural network (ETDNN) DPD. The method complies with the physical characteristics of radio-frequency (RF) power amplifiers (PAs) and uses a ...
Masaaki Tanio   +2 more
doaj   +1 more source

Network Collaborative Pruning Method for Hyperspectral Image Classification Based on Evolutionary Multi-Task Optimization

open access: yes, Remote Sensing, 2023
Neural network models for hyperspectral image classification are complex and therefore difficult to deploy directly onto mobile platforms. Neural network model compression methods can effectively optimize the storage space and inference time of the ...
Yu Lei   +5 more
doaj   +1 more source

Fine-Pruning: Joint Fine-Tuning and Compression of a Convolutional Network with Bayesian Optimization

open access: yes, 2017
When approaching a novel visual recognition problem in a specialized image domain, a common strategy is to start with a pre-trained deep neural network and fine-tune it to the specialized domain.
Mori, Greg   +2 more
core   +1 more source

Gradual Channel Pruning While Training Using Feature Relevance Scores for Convolutional Neural Networks

open access: yes, IEEE Access, 2020
The enormous inference cost of deep neural networks can be mitigated by network compression. Pruning connections is one of the predominant approaches used for network compression.
Sai Aparna Aketi   +3 more
doaj   +1 more source
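The snippet above describes channel pruning guided by feature relevance scores. A minimal sketch of the general idea, using mean absolute activation as a simple stand-in score (the paper's actual relevance measure and hyperparameters are not given here):

```python
import numpy as np

def prune_channels(activations, weights, prune_frac=0.25):
    """Zero out the conv channels with the lowest mean-activation score.

    activations: (batch, channels, h, w) feature maps from one layer
    weights:     (channels, ...) the layer's per-channel filters
    """
    # Score each channel by its mean absolute activation (a simple
    # stand-in for the paper's feature relevance scores).
    scores = np.abs(activations).mean(axis=(0, 2, 3))
    n_prune = int(len(scores) * prune_frac)
    pruned = np.argsort(scores)[:n_prune]  # lowest-scoring channels
    weights = weights.copy()
    weights[pruned] = 0.0                  # zeroed filters = pruned channels
    return weights, pruned
```

Zeroing whole filters (structured pruning) is what lets later inference actually skip those channels, unlike unstructured connection pruning.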

Activation-Based Pruning of Neural Networks

open access: yes, Algorithms
We present a novel technique for pruning called activation-based pruning to effectively prune fully connected feedforward neural networks for multi-object classification. Our technique is based on the number of times each neuron is activated during model ...
Tushar Ganguli, Edwin K. P. Chong
doaj   +1 more source
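Activation-based pruning, as summarized above, ranks neurons by how often they fire during training. A minimal sketch under the assumption of a ReLU dense layer whose weight matrix stores neurons as columns (the function names and the 50% keep ratio are illustrative, not from the paper):

```python
import numpy as np

def activation_counts(layer_outputs):
    """Count how often each neuron fired (ReLU output > 0) over a dataset.

    layer_outputs: (num_samples, num_neurons) post-activation values.
    """
    return (layer_outputs > 0).sum(axis=0)

def prune_least_activated(W, b, counts, keep=0.5):
    """Keep only the most frequently activated neurons of a dense layer."""
    n_keep = max(1, int(len(counts) * keep))
    keep_idx = np.argsort(counts)[::-1][:n_keep]  # most-activated first
    return W[:, keep_idx], b[keep_idx], keep_idx
```

The same counts can be accumulated batch by batch during training, so the ranking costs little beyond the forward passes already being run.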

Providing clear pruning threshold: A novel CNN pruning method via L0 regularisation

open access: yes, IET Image Processing, 2021
Network pruning is a significant way to improve the practicability of convolutional neural networks (CNNs) by removing the redundant structure of the network model. However, in most of the existing network pruning methods l1 or l2 regularisation is applied ...
Guo Li, Gang Xu
doaj   +1 more source
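The ambiguity this paper targets can be seen in the conventional l1/l2 approach: regularisation shrinks scale factors toward zero but rarely to exactly zero, so a manual cutoff must still be chosen. A minimal sketch of that conventional thresholding step (not the paper's L0 method; the cutoff `tau` is the hand-tuned quantity in question):

```python
import numpy as np

def prune_by_threshold(gammas, tau):
    """Prune channels whose scale factor falls below a manual cutoff tau.

    With l1/l2 regularisation the factors shrink but rarely reach exact
    zero, so tau must be chosen by hand - the ambiguity the paper targets.
    Returns a boolean keep-mask over channels.
    """
    return np.abs(gammas) >= tau
```

An L0-style penalty instead pushes factors to exactly zero, making the kept/pruned split unambiguous without tuning `tau`.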

Cluster-Based Structural Redundancy Identification for Neural Network Compression

open access: yes, Entropy, 2022
The increasingly large structure of neural networks makes it difficult to deploy on edge devices with limited computing resources. Network pruning has become one of the most successful model compression methods in recent years.
Tingting Wu   +3 more
doaj   +1 more source

Adaptive Neural Network Structure Optimization Algorithm Based on Dynamic Nodes

open access: yes, Current Issues in Molecular Biology, 2022
Large-scale artificial neural networks contain many redundant structures, which can trap the network in local optima and extend training time.
Miao Wang   +7 more
doaj   +1 more source

Automated Pruning for Deep Neural Network Compression

open access: yes, 2019
In this work we present a method to improve the pruning step of the current state-of-the-art methodology to compress neural networks. The novelty of the proposed pruning technique is in its differentiability, which allows pruning to be performed during ...
Bianco, Simone   +4 more
core   +1 more source
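The key property claimed above is differentiability, which lets pruning run inside gradient training. One common way to achieve this (a hedged sketch, not necessarily the authors' formulation) is to gate channels with a steep sigmoid of learnable scores, so the mask is nearly binary but gradients still flow:

```python
import numpy as np

def soft_mask(scores, temperature=0.1):
    """Differentiable gate: a steep sigmoid of learnable scores
    approximates a 0/1 pruning mask while keeping gradients non-zero."""
    return 1.0 / (1.0 + np.exp(-scores / temperature))

def masked_forward(x, W, scores):
    """Apply the soft mask to a dense layer's output channels."""
    return (x @ W) * soft_mask(scores)
```

After training, channels whose gate saturates near zero are removed outright; lowering `temperature` over training sharpens the mask toward a hard 0/1 decision.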

Differentiable channel pruning guided via attention mechanism: a novel neural network pruning approach

open access: yes, Complex &amp; Intelligent Systems, 2023
Neural network pruning offers great prospects for facilitating the deployment of deep neural networks on computational resource limited devices.
Hanjing Cheng   +5 more
doaj   +1 more source
