In this paper, we consider convolutional neural networks operating on sparse inputs with an application to depth upsampling from sparse laser scan data. (A minimal sketch of a mask-normalized sparse convolution follows below.)
Thomas Brox et al.
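The core operation behind convolutions on sparse inputs can be illustrated with a mask-normalized convolution: sum only over observed pixels and renormalize by the local density of observations, propagating a validity mask alongside the data. The sketch below is a generic, simplified illustration under that assumption (a dense array with zeros at missing pixels plus a binary mask), not the paper's exact layer; scipy is an assumed dependency.

    # Hedged sketch: mask-normalized ("sparsity-aware") 2D convolution for sparse depth maps.
    # `depth` holds zeros at unobserved pixels, `mask` marks observed pixels with 1.
    import numpy as np
    from scipy.signal import convolve2d

    def sparse_conv2d(depth, mask, kernel, eps=1e-8):
        num = convolve2d(depth * mask, kernel, mode="same")          # sum over observed pixels only
        den = convolve2d(mask.astype(float), kernel, mode="same")    # local weight of observations
        out = num / (den + eps)                                      # renormalize by observation density
        new_mask = (den > 0).astype(float)                           # valid if any input in the window was observed
        return out, new_mask

    # Example: 3x3 averaging kernel on a 50%-observed depth map.
    rng = np.random.default_rng(0)
    depth = rng.uniform(1.0, 10.0, size=(64, 64))
    mask = (rng.random((64, 64)) < 0.5).astype(float)
    dense, valid = sparse_conv2d(depth * mask, mask, np.ones((3, 3)))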
Outlier Weighed Layerwise Sparsity (OWL): A Missing Secret Sauce for Pruning LLMs to High Sparsity
Large Language Models (LLMs), renowned for their remarkable performance across diverse domains, present a challenge for practical deployment due to their colossal model size. (A simplified sketch of outlier-aware layerwise pruning follows below.)
Lu Yin et al.
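The general idea (prune layers non-uniformly, keeping more weights in layers that contain more outliers) can be sketched as below. The outlier score and the allocation rule here are simplified stand-ins chosen for illustration, not the paper's exact OWL formulation; `spread` bounds how far any layer may deviate from the uniform sparsity target.

    # Illustrative sketch only: non-uniform layerwise magnitude pruning in which layers
    # with more weight outliers are pruned less, while the mean sparsity stays on target.
    import numpy as np

    def outlier_ratio(w, m=5.0):
        # Fraction of weights whose magnitude exceeds m times the layer's mean magnitude.
        a = np.abs(w)
        return float(np.mean(a > m * a.mean()))

    def prune_layerwise(layers, target_sparsity=0.7, spread=0.1):
        scores = np.array([outlier_ratio(w) for w in layers])
        # Rank layers by outlier score; outlier-heavy layers receive a lower sparsity.
        ranks = scores.argsort().argsort() / max(len(layers) - 1, 1)
        sparsities = target_sparsity + 2.0 * spread * (0.5 - ranks)
        pruned = []
        for w, s in zip(layers, sparsities):
            k = int(round(s * w.size))                       # number of weights to drop
            if k <= 0:
                pruned.append(w.copy())
                continue
            thresh = np.sort(np.abs(w), axis=None)[k - 1]    # k-th smallest magnitude
            pruned.append(np.where(np.abs(w) > thresh, w, 0.0))
        return pruned, sparsities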
More ConvNets in the 2020s: Scaling up Kernels Beyond 51x51 using Sparsity
Transformers have quickly shined in the computer vision world since the emergence of Vision Transformers (ViTs). The dominant role of convolutional neural networks (CNNs) seems to be challenged by increasingly effective transformer-based models.
Shiwei Liu et al.
Channel Estimation for RIS Assisted Wireless Communications—Part II: An Improved Solution Based on Double-Structured Sparsity
Reconfigurable intelligent surface (RIS) can manipulate the wireless communication environment by controlling the coefficients of RIS elements. However, because the large number of passive RIS elements have no signal processing capability, channel estimation becomes challenging and incurs substantial pilot overhead. (A generic compressive-sensing recovery sketch follows below.)
Xiuhong Wei, Decai Shen, Linglong Dai
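As a rough illustration of how sparsity is exploited in compressive channel estimation in general (this is not the paper's double-structured algorithm), the sketch below recovers a sparse angular-domain channel vector from a small number of pilot measurements y = Phi h using orthogonal matching pursuit; the dimensions and sparsity level are assumptions for the example.

    import numpy as np

    def omp(Phi, y, k):
        # Orthogonal matching pursuit: greedily pick k dictionary columns, re-fit by least squares.
        residual, support = y.astype(complex), []
        coef = np.zeros(0, dtype=complex)
        for _ in range(k):
            corr = np.abs(Phi.conj().T @ residual)
            corr[support] = 0.0                          # do not reselect chosen atoms
            support.append(int(np.argmax(corr)))
            sub = Phi[:, support]
            coef, *_ = np.linalg.lstsq(sub, y, rcond=None)
            residual = y - sub @ coef
        h_hat = np.zeros(Phi.shape[1], dtype=complex)
        h_hat[support] = coef
        return h_hat

    # Toy example: 64-dimensional angular-domain channel with 4 active paths, 24 pilot measurements.
    rng = np.random.default_rng(1)
    Phi = (rng.standard_normal((24, 64)) + 1j * rng.standard_normal((24, 64))) / np.sqrt(48)
    h = np.zeros(64, dtype=complex)
    h[rng.choice(64, 4, replace=False)] = rng.standard_normal(4) + 1j * rng.standard_normal(4)
    h_hat = omp(Phi, Phi @ h, k=4)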
Learning Best Combination for Efficient N:M Sparsity
By forcing at most N out of every M consecutive weights to be non-zero, the recent N:M network sparsity pattern has received increasing attention for its two attractive advantages: 1) promising performance at high sparsity, and 2) practical acceleration on hardware with native N:M support, such as GPUs with sparse tensor cores. (A minimal magnitude-based N:M masking sketch follows below.)
Yu-xin Zhang et al.
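A minimal version of N:M sparsity, the greedy magnitude baseline rather than the learned combination search the paper proposes, keeps the N largest-magnitude weights in every group of M consecutive weights:

    import numpy as np

    def nm_prune(w, n=2, m=4):
        # Keep the n largest-magnitude weights in each group of m consecutive weights.
        # Assumes w.size is divisible by m (pad in practice if it is not).
        groups = w.reshape(-1, m)
        order = np.argsort(np.abs(groups), axis=1)           # ascending magnitude per group
        mask = np.zeros_like(groups)
        np.put_along_axis(mask, order[:, -n:], 1.0, axis=1)  # mark the top-n positions
        return (groups * mask).reshape(w.shape)

    # Example: a 2:4-sparse version of a random weight matrix.
    w = np.random.default_rng(2).standard_normal((8, 16))
    w_24 = nm_prune(w)    # every run of 4 consecutive weights now has at most 2 non-zeros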
S2TA: Exploiting Structured Sparsity for Energy-Efficient Mobile CNN Acceleration
Exploiting sparsity is a key technique in accelerating quantized convolutional neural network (CNN) inference on mobile devices. Prior sparse CNN accelerators largely exploit unstructured sparsity and achieve significant speedups.
Zhi-gang Liu et al.
Exploring Sparsity in Image Super-Resolution for Efficient Inference
Current CNN-based super-resolution (SR) methods process all locations equally, with computational resources assigned uniformly in space. However, since the missing details in low-resolution (LR) images mainly exist in regions of edges and textures, less computation is needed in the remaining flat regions. (An illustrative spatial-masking sketch follows below.)
Longguang Wang et al.
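Spatially adaptive computation of this kind can be illustrated with a hand-crafted mask (the paper learns its masks end-to-end): run an expensive branch only where a cheap edge/texture detector fires, and a cheap branch elsewhere. `cheap_fn`, `expensive_fn`, and the threshold below are placeholders for this sketch.

    import numpy as np

    def spatially_sparse_apply(img, cheap_fn, expensive_fn, thresh=0.1):
        # Detail mask from gradient magnitude: True near edges/textures, False in flat regions.
        gy, gx = np.gradient(img.astype(float))
        mask = np.hypot(gx, gy) > thresh
        out = cheap_fn(img)
        out[mask] = expensive_fn(img)[mask]       # dense computation only at detail locations
        return out, float(mask.mean())            # second value = fraction of pixels given dense work

    # Example with placeholder branches: identity as the cheap path, a 3x3 box blur as the "expensive" one.
    def box_blur(x):
        p = np.pad(x, 1, mode="edge")
        return sum(p[i:i + x.shape[0], j:j + x.shape[1]] for i in range(3) for j in range(3)) / 9.0

    img = np.random.default_rng(3).random((32, 32))
    out, density = spatially_sparse_apply(img, lambda x: x.copy(), box_blur)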
Enhancing Sparsity by Reweighted ℓ1 Minimization
It is now well understood that (1) it is possible to reconstruct sparse signals exactly from what appear to be highly incomplete sets of linear measurements and (2) this can be done by constrained ℓ1 minimization. (A short sketch of the reweighting iteration follows below.)
Emmanuel J. Candès, Michael B. Wakin, Stephen P. Boyd
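The reweighting scheme itself is short: solve a weighted ℓ1 problem, then set each weight to roughly the inverse of the recovered coefficient's magnitude so that small coefficients are penalized more in the next round. The sketch below assumes cvxpy is available for the per-iteration subproblem; the iteration count and epsilon are illustrative.

    import numpy as np
    import cvxpy as cp

    def reweighted_l1(A, y, iters=5, eps=1e-3):
        n = A.shape[1]
        w = np.ones(n)
        x_val = np.zeros(n)
        for _ in range(iters):
            x = cp.Variable(n)
            # Weighted l1 minimization subject to the measurement constraint A x = y.
            prob = cp.Problem(cp.Minimize(cp.norm1(cp.multiply(w, x))), [A @ x == y])
            prob.solve()
            x_val = x.value
            w = 1.0 / (np.abs(x_val) + eps)    # small coefficients get large weights next round
        return x_val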
An iterative thresholding algorithm for linear inverse problems with a sparsity constraint
We consider linear inverse problems where the solution is assumed to have a sparse expansion on an arbitrary preassigned orthonormal basis. We prove that replacing the usual quadratic regularizing penalties by weighted ℓp penalties on the coefficients of such expansions, with 1 ≤ p ≤ 2, still regularizes the problem. (A sketch of the corresponding soft-thresholding iteration follows below.)
Ingrid Daubechies, Michel Defrise, Christine De Mol
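For the ℓ1 case (p = 1) the resulting algorithm is iterative soft thresholding: a gradient step on the quadratic data term followed by the soft-threshold map. A minimal sketch, with step size and regularization weight chosen for illustration:

    import numpy as np

    def soft_threshold(z, t):
        return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

    def ista(A, y, lam=0.1, iters=200):
        # Iterative soft thresholding for  min_x  0.5 * ||A x - y||^2 + lam * ||x||_1 .
        tau = 1.0 / np.linalg.norm(A, 2) ** 2    # step size below 1 / ||A||^2 for convergence
        x = np.zeros(A.shape[1])
        for _ in range(iters):
            x = soft_threshold(x + tau * A.T @ (y - A @ x), tau * lam)
        return x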
Multilayer Sparsity-Based Tensor Decomposition for Low-Rank Tensor Completion
Existing methods for tensor completion (TC) have limited ability to characterize low-rank (LR) structures. To capture the complex hierarchical knowledge with implicit sparsity attributes hidden in a tensor, we propose a new multilayer sparsity-based tensor decomposition for low-rank tensor completion. (A generic low-rank completion sketch follows below.)
Jize Xue et al.
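The basic building block that many tensor-completion methods apply to each unfolding can be sketched as singular-value soft thresholding on a matrix with missing entries. This is only a generic low-rank completion sketch, considerably simpler than the paper's multilayer sparsity model; `tau` and the iteration count are illustrative.

    import numpy as np

    def svt_complete(X_obs, mask, tau=1.0, iters=100):
        # Alternate between shrinking singular values (low-rank prior) and
        # re-imposing the observed entries selected by `mask`.
        X = np.where(mask, X_obs, 0.0)
        for _ in range(iters):
            U, s, Vt = np.linalg.svd(X, full_matrices=False)
            X_low = (U * np.maximum(s - tau, 0.0)) @ Vt      # soft-threshold singular values
            X = np.where(mask, X_obs, X_low)                 # keep observed entries fixed
        return X

    # Example: complete a rank-2 matrix from 40% observed entries.
    rng = np.random.default_rng(4)
    M = rng.standard_normal((40, 2)) @ rng.standard_normal((2, 30))
    mask = rng.random(M.shape) < 0.4
    M_hat = svt_complete(M, mask)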

