Results 21 to 30 of about 6,914,944

Pruning Method for Convolutional Neural Network Models Based on Sparse Regularization [PDF]

open access: yes. Jisuanji Gongcheng (Computer Engineering), 2021
Existing pruning algorithms for Convolutional Neural Network (CNN) models evaluate parameter importance from the parameters' own information alone, which yields low accuracy, easily leads to mispruning, and degrades the performance…
WEI Yue, CHEN Shichao, ZHU Fenghua, XIONG Gang
doaj   +1 more source
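The entry above invokes sparse regularization as a basis for pruning. As a minimal, generic sketch (not the authors' algorithm), an L1 penalty can be applied through its proximal soft-thresholding step, after which near-zero weights are pruned by magnitude:

```python
def l1_soft_threshold(weights, lam):
    # Proximal step for an L1 penalty: shrinks each weight toward zero
    # and sets any weight with |w| <= lam exactly to zero.
    out = []
    for w in weights:
        mag = abs(w) - lam
        out.append(0.0 if mag <= 0 else (mag if w > 0 else -mag))
    return out

def prune_by_magnitude(weights, threshold):
    # Zero out weights whose magnitude falls below the threshold;
    # the mask records which connections survive.
    mask = [abs(w) >= threshold for w in weights]
    return [w if keep else 0.0 for w, keep in zip(weights, mask)], mask

weights = [1.5, -0.2, 0.05, -2.0, 0.4]
sparse = l1_soft_threshold(weights, lam=0.5)   # -> [1.0, 0.0, 0.0, -1.5, 0.0]
pruned, mask = prune_by_magnitude(sparse, threshold=1e-8)
```

Repeating the soft-threshold step during training drives many weights exactly to zero, so the final magnitude pruning removes connections the regularizer already deemed unimportant.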

Evolutionary Multi-Objective One-Shot Filter Pruning for Designing Lightweight Convolutional Neural Network

open access: yes. Sensors, 2021
Deep neural networks have seen significant development and wide application owing to their impressive performance. However, their complex structure and high computation and storage demands limit their use on mobile or embedded devices such as sensor…
Tao Wu   +4 more
doaj   +1 more source

Sparse Double Descent: Where Network Pruning Aggravates Overfitting [PDF]

open access: yes. International Conference on Machine Learning, 2022
People usually believe that network pruning not only reduces the computational cost of deep networks, but also prevents overfitting by decreasing model capacity.
Zhengqi He   +3 more
semanticscholar   +1 more source

Progressive multi-level distillation learning for pruning network

open access: yes. Complex & Intelligent Systems, 2023
Although classification methods based on deep neural networks achieve excellent results, they are difficult to apply in real-time scenarios because of their high memory footprints and prohibitive inference times.
Ruiqing Wang   +9 more
doaj   +1 more source

Rethinking Network Pruning – under the Pre-train and Fine-tune Paradigm [PDF]

open access: yes. North American Chapter of the Association for Computational Linguistics, 2021
Transformer-based pre-trained language models have significantly improved the performance of various natural language processing (NLP) tasks in recent years.
Dongkuan Xu   +3 more
semanticscholar   +1 more source

Network Pruning Using Adaptive Exemplar Filters [PDF]

open access: yes. IEEE Transactions on Neural Networks and Learning Systems, 2021
Popular network pruning algorithms reduce redundant information by optimizing hand-crafted models, which can cause suboptimal performance and long filter-selection times.
Mingbao Lin   +6 more
semanticscholar   +1 more source

Optimization Based Layer-Wise Pruning Threshold Method for Accelerating Convolutional Neural Networks

open access: yes. Mathematics, 2023
Among the various network compression methods, network pruning has developed rapidly due to its superior compression performance. However, a trivially chosen pruning threshold limits the compression that pruning can achieve.
Yunlong Ding, Di-Rong Chen
doaj   +1 more source
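The entry above contrasts per-layer thresholds with a single global cutoff. A minimal sketch of layer-wise magnitude pruning follows (the ratios are illustrative placeholders, not the paper's optimized thresholds):

```python
def layer_threshold(weights, prune_ratio):
    # Pick a per-layer magnitude cutoff: the smallest surviving
    # magnitude once `prune_ratio` of this layer's weights are dropped.
    ranked = sorted(abs(w) for w in weights)
    cut = int(prune_ratio * len(ranked))
    return ranked[cut] if cut < len(ranked) else float("inf")

def prune_layerwise(layers, ratios):
    # Apply a different threshold to each layer instead of one
    # global cutoff shared by the whole network.
    pruned = []
    for weights, ratio in zip(layers, ratios):
        thr = layer_threshold(weights, ratio)
        pruned.append([w if abs(w) >= thr else 0.0 for w in weights])
    return pruned

layers = [[0.9, -0.1, 0.4, -0.7], [0.05, 0.6, -0.3, 0.2]]
pruned = prune_layerwise(layers, ratios=[0.5, 0.25])
# layer 1 loses its two smallest weights, layer 2 only its smallest
```

Because sensible sparsity varies by layer (early conv layers often tolerate less pruning than late fully connected ones), per-layer thresholds typically compress further than one global cutoff at the same accuracy.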

Cloud–Edge Collaborative Inference with Network Pruning

open access: yes. Electronics, 2023
With growing parameter counts, deep neural networks (DNNs) have achieved remarkable performance in computer vision, but larger models are a bottleneck for deployment on resource-constrained edge devices.
Mingran Li   +3 more
semanticscholar   +1 more source

Network Pruning Spaces [PDF]

open access: yes. arXiv.org, 2023
Network pruning techniques, including weight pruning and filter pruning, reveal that most state-of-the-art neural networks can be accelerated without a significant performance drop.
Xuanyu He   +7 more
semanticscholar   +1 more source
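To contrast the two families named in the entry above, here is a minimal sketch of filter pruning, where whole filters are ranked by L1 norm and only the strongest are kept (the flat-list filter layout is a simplification for illustration, not a framework API):

```python
def filter_l1_norms(conv_weight):
    # conv_weight: list of filters, each given as a flat list of
    # that filter's weights (simplified layout for illustration).
    return [sum(abs(w) for w in f) for f in conv_weight]

def keep_top_filters(conv_weight, keep_ratio):
    # Filter pruning: rank whole filters by L1 norm and keep the
    # strongest fraction, shrinking the layer's output channels.
    norms = filter_l1_norms(conv_weight)
    n_keep = max(1, round(keep_ratio * len(norms)))
    order = sorted(range(len(norms)), key=lambda i: norms[i], reverse=True)
    return sorted(order[:n_keep])

filters = [[0.1, -0.1], [0.9, 0.8], [0.4, -0.5], [0.05, 0.0]]
kept = keep_top_filters(filters, keep_ratio=0.5)   # indices of surviving filters
```

Unlike weight pruning, which leaves sparse tensors that need special kernels to accelerate, dropping whole filters yields a smaller dense network that runs faster on standard hardware.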

Heuristic Method for Minimizing Model Size of CNN by Combining Multiple Pruning Techniques

open access: yes. Sensors, 2022
Network pruning techniques have been widely used for compressing computation- and memory-intensive deep learning models by removing redundant components of the model.
Danhe Tian   +2 more
doaj   +1 more source
