Results 21 to 30 of about 256,417
Spectral Theory of Sparse Non-Hermitian Random Matrices [PDF]
Sparse non-Hermitian random matrices arise in the study of disordered physical systems with asymmetric local interactions, and have applications ranging from neural networks to ecosystem dynamics.
Abou-Chacra R +38 more
core +3 more sources
Some results on sparse matrices [PDF]
A comparison in the context of sparse matrices is made between the Product Form of the Inverse (PFI, a form of Gauss-Jordan elimination) and the Elimination Form of the Inverse (EFI, a form of Gaussian elimination). The precise relation of the elements of these two forms of the inverse is given in terms of the nontrivial elements of the three matrices ...
Brayton, Robert K. +2 more
openaire +2 more sources
New Orthogonal Transforms for Signal and Image Processing
In the paper, orthogonal transforms based on proposed symmetric, orthogonal matrices are created. These transforms can be considered as generalized Walsh–Hadamard Transforms.
Andrzej Dziech
doaj +1 more source
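The Walsh–Hadamard family the snippet above generalizes can be built recursively via the Sylvester construction, H_{2n} = [[H, H], [H, -H]]. A minimal pure-Python sketch (illustrative only; the paper's proposed symmetric orthogonal matrices are not reproduced here):

```python
# Sylvester construction of the n x n Walsh-Hadamard matrix (n a power of two).
def hadamard(n):
    if n == 1:
        return [[1]]
    h = hadamard(n // 2)
    top = [row + row for row in h]                    # [H  H]
    bottom = [row + [-x for x in row] for row in h]   # [H -H]
    return top + bottom

H4 = hadamard(4)
# Distinct rows are mutually orthogonal, so (1/sqrt(n)) * H is an
# orthogonal matrix and defines an orthogonal transform.
```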
Sparse matrices in frame theory [PDF]
Krahmer, Felix +2 more
openaire +4 more sources
This paper considers several algorithms for parallelizing the procedure of forward and back substitution for high-order symmetric sparse matrices on multi-core computers with shared memory.
Sergiy Fialko
doaj +1 more source
Sparse Matrix Based Low-Complexity, Recursive, and Radix-2 Algorithms for Discrete Sine Transforms
This paper presents factorizations of each discrete sine transform (DST) matrix of types I, II, III, and IV into a product of sparse, diagonal, bidiagonal, and scaled orthogonal matrices.
Sirani M. Perera, Levi E. Lingsch
doaj +1 more source
Given a polynomial $p(z)$, a companion matrix can be thought of as a simple template for placing the coefficients of $p(z)$ in a matrix such that the characteristic polynomial is $p(z)$. The Frobenius companion and the more recently-discovered Fiedler companion matrices are examples.
Deaett, Louis +3 more
openaire +2 more sources
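As a concrete instance of the template described above, the Frobenius companion matrix of a monic cubic places the coefficients in the last column:

```latex
p(z) = z^3 + c_2 z^2 + c_1 z + c_0, \qquad
C = \begin{pmatrix}
0 & 0 & -c_0 \\
1 & 0 & -c_1 \\
0 & 1 & -c_2
\end{pmatrix}, \qquad
\det(zI - C) = p(z).
```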
Insights from classifying visual concepts with multiple kernel learning. [PDF]
Combining information from various image features has become a standard technique in concept recognition tasks. However, the optimal way of fusing the resulting kernel functions is usually unknown in practical applications. Multiple kernel learning (MKL) ...

Alexander Binder +7 more
doaj +1 more source
Lower bounds for sparse matrix vector multiplication on hypercubic networks [PDF]
In this paper we consider the problem of computing on a local-memory machine the product y = Ax, where A is a random n×n sparse matrix with Θ(n) nonzero elements.
Giovanni Manzini
doaj +2 more sources
Enabling Massive Deep Neural Networks with the GraphBLAS
Deep Neural Networks (DNNs) have emerged as a core tool for machine learning. The computations performed during DNN training and inference are dominated by operations on the weight matrices describing the DNN.
Kepner, Jeremy +5 more
core +1 more source
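The "operations on weight matrices" dominating DNN inference can be viewed, as in the GraphBLAS setting, as sparse matrix algebra. A hypothetical single-layer sketch with a COO-format weight matrix (names and shapes are illustrative, not from the paper):

```python
# One sparse DNN layer: y = ReLU(W x + b), with W as a COO list of
# (row, col, value) triples -- the sparse-matrix view of inference.
def sparse_layer(coo, b, x):
    y = list(b)
    for i, j, w in coo:        # accumulate W @ x into y
        y[i] += w * x[j]
    return [max(0.0, v) for v in y]  # ReLU activation

W = [(0, 0, 1.0), (0, 2, -2.0), (1, 1, 0.5)]
out = sparse_layer(W, [0.0, 0.0], [1.0, 2.0, 1.0])  # -> [0.0, 1.0]
```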