
Joint Token Pruning and Squeezing Towards More Aggressive Compression of Vision Transformers [PDF]

open access: yes | Computer Vision and Pattern Recognition, 2023
Although vision transformers (ViTs) have shown promising results in various computer vision tasks recently, their high computational cost limits their practical applications.
Siyuan Wei   +4 more
semanticscholar   +1 more source

Channel Pruning for Accelerating Very Deep Neural Networks [PDF]

open access: yes | IEEE International Conference on Computer Vision, 2017
In this paper, we introduce a new channel pruning method to accelerate very deep convolutional neural networks. Given a trained CNN model, we propose an iterative two-step algorithm to effectively prune each layer, by a LASSO regression based channel ...
Yihui He, Xiangyu Zhang, Jian Sun
semanticscholar   +1 more source
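The two-step procedure this abstract outlines (channel selection via LASSO, then weight reconstruction) can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the function name, array shapes, and `alpha` value are assumptions, and the layer is flattened to a per-channel linear model for clarity.

```python
# Rough sketch of LASSO-based channel selection followed by a
# least-squares refit, in the spirit of the two-step algorithm
# described above. Shapes and names are illustrative assumptions.
import numpy as np
from sklearn.linear_model import Lasso

def select_channels(X, W, Y, alpha=0.01):
    """Pick which input channels of a layer to keep.

    X: (n_samples, c_in) sampled per-channel inputs to the layer
    W: (c_in,) per-channel weights (flattened for illustration)
    Y: (n_samples,) original layer outputs to reconstruct
    """
    contrib = X * W  # each column: one channel's contribution to Y
    # Step 1: LASSO finds a sparse coefficient vector; channels whose
    # coefficient shrinks to ~0 are pruned.
    lasso = Lasso(alpha=alpha, fit_intercept=False, positive=True)
    lasso.fit(contrib, Y)
    keep = np.flatnonzero(np.abs(lasso.coef_) > 1e-6)
    # Step 2: least-squares refit of the surviving channels' weights
    # to minimize the output reconstruction error.
    W_new, *_ = np.linalg.lstsq(contrib[:, keep], Y, rcond=None)
    return keep, W_new
```

In the actual method this alternates per layer over feature maps of a trained CNN; the sketch only shows why LASSO (sparsity) handles selection while ordinary least squares handles reconstruction.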

Structured Pruning Learns Compact and Accurate Models [PDF]

open access: yes | Annual Meeting of the Association for Computational Linguistics, 2022
The growing size of neural language models has led to increased attention in model compression. The two predominant approaches are pruning, which gradually removes weights from a pre-trained model, and distillation, which trains a smaller compact model ...
Mengzhou Xia, Zexuan Zhong, Danqi Chen
semanticscholar   +1 more source

Pruning vs Quantization: Which is Better? [PDF]

open access: yes | Neural Information Processing Systems, 2023
Neural network pruning and quantization techniques are almost as old as neural networks themselves. However, to date only ad-hoc comparisons between the two have been published.
Andrey Kuzmin   +4 more
semanticscholar   +1 more source

HRank: Filter Pruning Using High-Rank Feature Map [PDF]

open access: yes | Computer Vision and Pattern Recognition, 2020
Neural network pruning offers a promising prospect to facilitate deploying deep neural networks on resource-limited devices. However, existing methods are still challenged by the training inefficiency and labor cost in pruning designs, due to missing ...
Mingbao Lin   +6 more
semanticscholar   +1 more source

ThiNet: A Filter Level Pruning Method for Deep Neural Network Compression [PDF]

open access: yes | IEEE International Conference on Computer Vision, 2017
We propose an efficient and unified framework, namely ThiNet, to simultaneously accelerate and compress CNN models in both training and inference stages.
Jian-Hao Luo, Jianxin Wu, Weiyao Lin
semanticscholar   +1 more source

A Fast Post-Training Pruning Framework for Transformers [PDF]

open access: yes | Neural Information Processing Systems, 2022
Pruning is an effective way to reduce the huge inference cost of Transformer models. However, prior work on pruning Transformers requires retraining the models.
Woosuk Kwon   +5 more
semanticscholar   +1 more source

Pruning Meets Low-Rank Parameter-Efficient Fine-Tuning

open access: yes | arXiv.org, 2023
Large pre-trained models (LPMs), such as LLaMA and ViT-G, have shown exceptional performance across various tasks. Although parameter-efficient fine-tuning (PEFT) has emerged to cheaply fine-tune these large models on downstream tasks, their deployment ...
Mingyang Zhang   +6 more
semanticscholar   +1 more source

The role of pruning in the intensification of plum production

open access: yes | International Journal of Horticultural Science, 2006
In an orchard planted in the spring of 1997, four kinds of spacing have been applied (4.0 m x 1.5 m, 4.0 m x 2.0 m, 5.0 m x 2.5 m and 6.0 m x 3.0 m). Four cultivars (‘Cacanska lepotica', ‘Stanley', ‘Bluefre' and ‘President') grafted on Myrobalan rootstock
I. Gonda
doaj   +1 more source

Statistical Pruning for Near Maximum Likelihood Detection of MIMO Systems [PDF]

open access: yes | 2007
We show a statistical pruning approach for maximum likelihood (ML) detection of multiple-input multiple-output (MIMO) systems. We present a general pruning strategy for sphere decoder (SD), which can also be applied to any tree search algorithms. Our
Tao Cui   +2 more
core   +1 more source
