Results 11 to 20 of about 1,983
Super-resolution of positive spikes by Toeplitz low-rank approximation [PDF]
Publication in the conference proceedings of EUSIPCO, Nice, France ...
Condat, Laurent, Hirabayashi, Akira
openaire +1 more source
Sublinear Time Low-Rank Approximation of Positive Semidefinite Matrices [PDF]
We show how to compute a relative-error low-rank approximation to any positive semidefinite (PSD) matrix in sublinear time, i.e., for any $n \times n$ PSD matrix $A$, in $\tilde O(n \cdot \mathrm{poly}(k/\epsilon))$ time we output a rank-$k$ matrix $B$, in factored form, for which $\|A-B\|_F^2 \leq (1+\epsilon)\|A-A_k\|_F^2$, where $A_k$ is the best rank-$k$ approximation ...
Musco, Cameron, Woodruff, David P.
openaire +2 more sources
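For context, the benchmark $A_k$ in the bound above is the best rank-$k$ approximation of $A$, which for a PSD matrix comes from its top-$k$ eigenpairs. Below is a minimal numpy sketch of that (cubic-time) benchmark on a synthetic matrix, useful for checking the $(1+\epsilon)$ guarantee on small instances; the data and function name are illustrative assumptions, not material from the paper.

```python
import numpy as np

def best_rank_k_psd(A, k):
    """Best rank-k approximation of a symmetric PSD matrix A in Frobenius norm,
    via its top-k eigenpairs. This is the O(n^3) benchmark A_k, not the paper's
    sublinear-time algorithm."""
    w, V = np.linalg.eigh(A)                 # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:k]            # indices of the k largest eigenvalues
    return (V[:, idx] * w[idx]) @ V[:, idx].T

rng = np.random.default_rng(0)
G = rng.standard_normal((200, 20))
A = G @ G.T                                  # synthetic PSD matrix of rank <= 20
Ak = best_rank_k_psd(A, 5)
print(np.linalg.norm(A - Ak, "fro") ** 2)    # the benchmark ||A - A_k||_F^2
```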
Positive semi-definite matrices commonly occur as normal matrices of least squares problems in statistics or as kernel matrices in machine learning and approximation theory. They are typically large and dense. Thus algorithms to solve systems with such a matrix can be very costly.
Markus Hegland, Frank De Hoog
openaire +2 more sources
A Class of Weighted Low Rank Approximation of the Positive Semidefinite Hankel Matrix [PDF]
We consider the weighted low rank approximation of the positive semidefinite Hankel matrix problem arising in signal processing. Using the Vandermonde representation, we first transform the problem into an unconstrained optimization problem and then use the nonlinear conjugate gradient algorithm with the Armijo line search to solve the equivalent ...
Jianchao Bai +3 more
openaire +4 more sources
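The approach described above parameterizes the PSD Hankel matrix via its Vandermonde representation and optimizes with nonlinear conjugate gradients. As a point of comparison only, here is an unweighted Cadzow-style baseline that alternates rank truncation with projection onto Hankel structure by anti-diagonal averaging; the function names, sizes, and iteration count are illustrative assumptions, not the paper's method.

```python
import numpy as np

def hankel_project(M):
    """Frobenius-nearest Hankel matrix: average each anti-diagonal of M."""
    n = M.shape[0]
    H = np.zeros_like(M)
    for s in range(2 * n - 1):
        cells = [(i, s - i) for i in range(max(0, s - n + 1), min(n, s + 1))]
        avg = np.mean([M[i, j] for i, j in cells])
        for i, j in cells:
            H[i, j] = avg
    return H

def psd_rank_project(M, r):
    """Frobenius-nearest PSD matrix of rank <= r: keep the r largest nonnegative eigenvalues."""
    w, V = np.linalg.eigh((M + M.T) / 2)
    w = np.clip(w, 0.0, None)
    keep = np.argsort(w)[::-1][:r]
    return (V[:, keep] * w[keep]) @ V[:, keep].T

def cadzow_psd_hankel(H, r, iters=50):
    """Alternate the two projections, starting from the given Hankel matrix H."""
    X = H.copy()
    for _ in range(iters):
        X = hankel_project(psd_rank_project(X, r))
    return X
```

Alternating projections of this kind carry no global-optimality guarantee, which is part of what makes reformulations such as the Vandermonde one attractive.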
Efficient Radio Map Construction Based on Low-Rank Approximation for Indoor Positioning [PDF]
Fingerprint-based positioning in a wireless local area network (WLAN) environment has received much attention recently. One key issue for the positioning method is the radio map construction, which generally requires significant effort to collect enough measurements of received signal strength (RSS). Based on the observation that RSSs have high spatial
Yongli Hu +4 more
openaire +1 more source
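The low-rank premise behind radio-map construction is that the locations-by-access-points RSS matrix is highly correlated, so unmeasured entries can be imputed from relatively few measurements. A deliberately naive hard-impute sketch of that idea follows: alternately fill the missing entries and truncate to a small rank. The data layout, target rank, and function name are assumptions for illustration, not the paper's algorithm.

```python
import numpy as np

def lowrank_complete(R, mask, rank=3, iters=100):
    """Naive low-rank completion of an RSS matrix R (locations x access points).
    mask[i, j] is True where an RSS measurement was actually collected."""
    X = np.where(mask, R, np.mean(R[mask]))       # start by mean-filling missing entries
    for _ in range(iters):
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        L = (U[:, :rank] * s[:rank]) @ Vt[:rank]  # truncate to the target rank
        X = np.where(mask, R, L)                  # keep observed entries, impute the rest
    return X
```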
Low-Rank Approximation and Completion of Positive Tensors [PDF]
Unlike the matrix case, computing low-rank approximations of tensors is NP-hard and numerically ill-posed in general. Even the best rank-1 approximation of a tensor is NP-hard. In this paper, we use convex optimization to develop polynomial-time algorithms for low-rank approximation and completion of positive tensors.
openaire +3 more sources
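The abstract contrasts the NP-hard general problem with the paper's convex, polynomial-time approach for positive tensors. For orientation only, below is the generic non-convex workhorse for the rank-1 subproblem it mentions, the higher-order power method for a 3-way tensor; it is not the paper's algorithm and may converge only to a local optimum.

```python
import numpy as np

def best_rank1_hopm(T, iters=200, seed=0):
    """Higher-order power method: fit lambda * outer(u, v, w) to a 3-way tensor T.
    Non-convex, so it illustrates rather than circumvents the hardness noted above."""
    rng = np.random.default_rng(seed)
    u, v, w = (rng.standard_normal(n) for n in T.shape)
    u, v, w = (x / np.linalg.norm(x) for x in (u, v, w))
    for _ in range(iters):
        u = np.einsum("ijk,j,k->i", T, v, w); u /= np.linalg.norm(u)
        v = np.einsum("ijk,i,k->j", T, u, w); v /= np.linalg.norm(v)
        w = np.einsum("ijk,i,j->k", T, u, v); w /= np.linalg.norm(w)
    lam = np.einsum("ijk,i,j,k->", T, u, v, w)
    return lam, u, v, w
```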
Low rank approximation of the symmetric positive semidefinite matrix
Duan, Xuefeng +3 more
openaire +2 more sources
Positive semi-definite matrices commonly occur as normal matrices of least squares problems in statistics or as kernel matrices in machine learning and approximation theory. They are typically large and dense. Thus algorithms to solve systems with such a matrix can be very costly.
Hegland, Markus, deHoog, Frank
openaire +2 more sources
Low-Rank Positive Approximants of Symmetric Matrices
Given a symmetric matrix X, we consider the problem of finding a low-rank positive approximant of X, that is, a symmetric positive semidefinite matrix S whose rank is smaller than a given positive integer and which is nearest to X in a certain matrix norm. The problem is first solved with regard to four common norms: the Frobenius norm, the Schatten p-norm ...
openaire +1 more source
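For the Frobenius norm, the rank-constrained positive approximant can be read off the eigendecomposition of X: clip negative eigenvalues to zero and keep the r largest of what remains. A short numpy sketch of that Frobenius-norm case follows (the other norms treated in the paper need separate arguments); the function name and the defensive symmetrization are additions for illustration.

```python
import numpy as np

def positive_approximant(X, r):
    """Frobenius-nearest symmetric PSD matrix of rank <= r to a symmetric matrix X."""
    w, V = np.linalg.eigh((X + X.T) / 2)   # symmetrize defensively, then eigendecompose
    w = np.clip(w, 0.0, None)              # positive part of the spectrum
    keep = np.argsort(w)[::-1][:r]         # r largest nonnegative eigenvalues
    return (V[:, keep] * w[keep]) @ V[:, keep].T
```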
Generalized low-rank approximation to the symmetric positive semidefinite matrix
In this paper, we investigate the generalized low-rank approximation to the symmetric positive semidefinite matrix in the Frobenius norm: $$\min_{\operatorname{rank}(X)\leq k} \sum^m_{i=1}\left\Vert A_i - B_i X B_i^T \right\Vert^2_F,$$ where $X$ is an unknown symmetric positive semidefinite matrix and $k$ is a positive integer. We first use the property ...
Chang, Haixia +2 more
openaire +2 more sources
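A naive way to attack the displayed problem is projected gradient descent: take a gradient step on the smooth objective, then project onto the PSD matrices of rank at most k. The sketch below does exactly that; the fixed step size, iteration count, and function names are illustrative assumptions, and this is a baseline rather than the method the paper develops.

```python
import numpy as np

def project_psd_rank(X, k):
    """Frobenius projection onto {X symmetric PSD : rank(X) <= k}."""
    w, V = np.linalg.eigh((X + X.T) / 2)
    w = np.clip(w, 0.0, None)
    keep = np.argsort(w)[::-1][:k]
    return (V[:, keep] * w[keep]) @ V[:, keep].T

def generalized_lowrank(As, Bs, k, step=1e-3, iters=500):
    """Minimize sum_i ||A_i - B_i X B_i^T||_F^2 over PSD X with rank(X) <= k
    by projected gradient descent."""
    n = Bs[0].shape[1]
    X = np.zeros((n, n))
    for _ in range(iters):
        grad = sum(-2.0 * B.T @ (A - B @ X @ B.T) @ B for A, B in zip(As, Bs))
        X = project_psd_rank(X - step * grad, k)
    return X
```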

