Results 231 to 240 of about 223,248
Some of the following articles may not be open access.

Laplacian Sparse Coding, Hypergraph Laplacian Sparse Coding, and Applications

IEEE Transactions on Pattern Analysis and Machine Intelligence, 2013
Sparse coding exhibits good performance in many computer vision applications. However, due to the overcomplete codebook and the independent coding process, the locality and the similarity among the instances to be encoded are lost. To preserve such locality and similarity information, we propose a Laplacian sparse coding (LSc) framework.
Shenghua Gao   +2 more
openaire   +2 more sources
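For reference, the Laplacian sparse coding objective is commonly written in the following generic form (symbols here are the standard ones, not copied from the paper: X is the data matrix, B the codebook, S the code matrix with columns s_i, and L the graph Laplacian built from instance similarities):

    min_{B,S}  ||X - BS||_F^2  +  lambda * sum_i ||s_i||_1  +  beta * tr(S L S^T)

The trace term penalizes codes that differ between similar instances, which is how the locality and similarity information lost by plain sparse coding gets preserved.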

Order Preserving Sparse Coding

IEEE Transactions on Pattern Analysis and Machine Intelligence, 2015
In this paper, we investigate order-preserving sparse coding for classifying structured data whose atomic features possess ordering relationships. Examples include time sequences where individual frame-wise features are temporally ordered, as well as still images (landscape, street view, etc.) where different regions of the image are spatially ordered.
Bingbing Ni   +2 more
openaire   +2 more sources

Hessian sparse coding

Neurocomputing, 2014
Sparse coding has received an increasing amount of interest in recent years. It finds a basis set that captures high-level semantics in the data and learns sparse coordinates in terms of the basis set. However, most of the existing approaches fail to consider the geometrical structure of the data space.
Miao Zheng, Jiajun Bu, Chun Chen
openaire   +1 more source

Sparse code motion

Proceedings of the 27th ACM SIGPLAN-SIGACT symposium on Principles of programming languages, 2000
In this article, we add a third dimension to partial redundancy elimination by considering code size as a further optimization goal in addition to the more classical consideration of computation costs and register pressure. This results in a family of sparse code motion algorithms coming as modular extensions of the algorithms for busy and lazy code ...
Oliver Rüthing   +2 more
openaire   +1 more source

Auditory Sparse Coding

2011
Chapter contents excerpt: 1.1 Summary; 1.2 Introduction; 1.2.1 The stabilized auditory image ...
Steven Ness   +2 more
openaire   +2 more sources

Sparse Autoencoder for Sparse Code Multiple Access

2021 International Conference on Artificial Intelligence in Information and Communication (ICAIIC), 2021
In the forthcoming 5G technology, Sparse Code Multiple Access (SCMA) is one of the most promising schemes for further improving spectral efficiency and providing massive connectivity. The challenge in implementing SCMA lies in constructing optimized codebooks that achieve minimum BER while keeping receiver complexity low.
Medini Singh, Deepak Mishra, M. Vanidevi
openaire   +1 more source

Product Sparse Coding

2014 IEEE Conference on Computer Vision and Pattern Recognition, 2014
Sparse coding is a widely used technique in computer vision. However, its expensive computational cost can hamper applications, typically when the codebook must be kept small out of running-time concerns. In this paper, we study a special case of sparse coding in which the codebook is a Cartesian product of two subcodebooks.
Tiezheng Ge, Kaiming He, Jian Sun
openaire   +1 more source
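One way to picture the product structure (a hedged sketch in the spirit of product quantization, not the paper's actual solver; all names and sizes below are illustrative): split the vector into two halves and encode each against its own subcodebook, so the implied joint codebook is the Cartesian product of the two. For clarity the sketch uses 1-sparse encoding per half:

```python
import numpy as np

def one_sparse_code(B, x):
    """1-sparse coding: pick the single atom (unit-norm column) most correlated with x."""
    scores = B.T @ x
    j = int(np.argmax(np.abs(scores)))
    return j, float(scores[j])

def product_code(B1, B2, x):
    """Encode the halves of x independently; the effective codebook is the
    Cartesian product of B1's and B2's atoms (|B1| * |B2| combinations)."""
    d1 = B1.shape[0]
    return one_sparse_code(B1, x[:d1]), one_sparse_code(B2, x[d1:])

# Two subcodebooks of 10 unit-norm atoms each stand in for a 100-atom product codebook.
rng = np.random.default_rng(1)
B1 = rng.standard_normal((8, 10)); B1 /= np.linalg.norm(B1, axis=0)
B2 = rng.standard_normal((8, 10)); B2 /= np.linalg.norm(B2, axis=0)
x = np.concatenate([2.0 * B1[:, 2], -1.5 * B2[:, 5]])
(j1, a1), (j2, a2) = product_code(B1, B2, x)
```

Encoding costs two searches over 10 atoms instead of one over 100, which is the kind of saving the product decomposition is after.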

Differential Sparse Coding

2008
Prior work has shown that features which appear to be biologically plausible as well as empirically useful can be found by sparse coding with a prior such as a laplacian (L1 ) that promotes sparsity. We show how smoother priors can preserve the benefits of these sparse priors while adding stability to the Maximum A-Posteriori (MAP) estimate that makes ...
David M. Bradley, J. Andrew Bagnell
openaire   +1 more source
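To illustrate the stability point (a hedged toy sketch, not the paper's construction; the surrogate prior, step size, and scalar setting are all assumptions): replacing the nondifferentiable Laplacian |s| prior with a smooth surrogate such as sqrt(s^2 + eps) yields a MAP estimate that varies continuously with the input, shown here for a single scalar coefficient.

```python
import numpy as np

def map_scalar(x, penalty_grad, lam=1.0, lr=0.1, n_iter=500):
    """Gradient descent on the scalar MAP objective 0.5*(x - s)^2 + lam * penalty(s)."""
    s = 0.0
    for _ in range(n_iter):
        s -= lr * ((s - x) + lam * penalty_grad(s))
    return s

eps = 1e-3
# Derivative of the smooth L1 surrogate sqrt(s^2 + eps).
smooth_l1_grad = lambda s: s / np.sqrt(s * s + eps)

# Nearby inputs give nearby MAP estimates under the smooth prior.
a = map_scalar(1.00, smooth_l1_grad)
b = map_scalar(1.01, smooth_l1_grad)
```

Under a hard L1 prior the MAP estimate can jump to exactly zero as the input crosses the threshold; the smooth surrogate trades a little sparsity for that continuity.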

Ternary Sparse Coding

2012
We study a novel sparse coding model with discrete and symmetric prior distribution. Instead of using continuous latent variables distributed according to heavy tail distributions, the latent variables of our approach are discrete. In contrast to approaches using binary latents, we use latents with three states (-1, 0, and 1) following a symmetric and ...
Georgios Exarchakis   +3 more
openaire   +1 more source
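For small numbers of latents the discrete code space can even be searched exhaustively; the sketch below (illustrative only, not the paper's training procedure, and W, x are made-up) finds the ternary code that best reconstructs an observation under a fixed dictionary:

```python
import numpy as np
from itertools import product

def best_ternary_code(W, x):
    """Exhaustive search over ternary latent vectors s in {-1, 0, 1}^K.
    Feasible only for small K (3**K candidates), but it shows the discrete,
    symmetric latent space the model assumes."""
    K = W.shape[1]
    best_s, best_err = None, np.inf
    for cand in product((-1.0, 0.0, 1.0), repeat=K):
        s = np.array(cand)
        err = float(np.sum((x - W @ s) ** 2))
        if err < best_err:
            best_s, best_err = s, err
    return best_s, best_err

# A noiseless observation generated from a known ternary code is recovered exactly.
rng = np.random.default_rng(2)
W = rng.standard_normal((6, 4))
s_true = np.array([1.0, 0.0, -1.0, 0.0])
x = W @ s_true
s_hat, err = best_ternary_code(W, x)
```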

SC2Net: Sparse LSTMs for Sparse Coding

Proceedings of the AAAI Conference on Artificial Intelligence, 2018
The iterative shrinkage-thresholding algorithm (ISTA) is one of the most popular optimization solvers for computing sparse codes. However, ISTA suffers from the following problems: 1) ISTA employs a non-adaptive updating strategy, learning the parameters on each dimension with a fixed learning rate. Such a strategy may lead to inferior performance due ...
Joey Tianyi Zhou   +9 more
openaire   +1 more source
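For context, a minimal NumPy sketch of the ISTA iteration the abstract refers to (the dictionary D, signal x, and hyperparameters are illustrative, and this is not the SC2Net model itself): each step combines a gradient step on the reconstruction error with elementwise soft-thresholding at a fixed rate, which is exactly the non-adaptive behavior the paper criticizes.

```python
import numpy as np

def soft_threshold(v, t):
    """Elementwise soft-thresholding: the proximal operator of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(D, x, lam=0.1, n_iter=200):
    """Minimize 0.5*||x - D s||^2 + lam*||s||_1 by iterative shrinkage-thresholding."""
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the gradient
    s = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ s - x)           # gradient of the smooth (reconstruction) part
        s = soft_threshold(s - grad / L, lam / L)  # fixed step 1/L on every dimension
    return s

# Recover a sparse code from a random overcomplete dictionary.
rng = np.random.default_rng(0)
D = rng.standard_normal((20, 50))
s_true = np.zeros(50); s_true[[3, 17]] = [1.5, -2.0]
x = D @ s_true
s_hat = ista(D, x, lam=0.05)
```

The fixed step 1/L applied uniformly across dimensions is the "non-adaptive updating strategy" the paper proposes to replace with learned, LSTM-style updates.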
