Results 11 to 20 of about 12,418,444

Low‐rank isomap algorithm

open access: yes · IET Signal Processing, 2022
Isomap is a well‐known nonlinear dimensionality reduction method that suffers from high computational complexity. This complexity mainly arises from two stages: a) embedding a full graph on the data in the ambient space, and b) a complete ...
Eysan Mehrbani, Mohammad Hossein Kahaei
doaj   +3 more sources
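For context, a minimal sketch of the classical Isomap pipeline whose two costly stages the abstract points to (the neighborhood graph with geodesic distances, then a dense eigendecomposition), using scikit-learn's stock implementation rather than the paper's low-rank variant; the dataset and parameters are illustrative.

```python
# Classical Isomap via scikit-learn (not the paper's low-rank algorithm).
# The graph/geodesic stage and the eigendecomposition stage are the two
# complexity bottlenecks the abstract refers to.
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import Isomap

X, _ = make_swiss_roll(n_samples=2000, random_state=0)
# n_neighbors controls the graph stage; n_components the embedding stage.
embedding = Isomap(n_neighbors=10, n_components=2).fit_transform(X)
print(embedding.shape)  # (2000, 2)
```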

Beyond Low Rank + Sparse: Multi-scale Low Rank Matrix Decomposition [PDF]

open access: yes · 2016 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2016
We present a natural generalization of the recent low rank + sparse matrix decomposition and consider the decomposition of matrices into components of multiple scales.
Lustig, Michael, Ong, Frank
core   +3 more sources
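As background for the multi-scale generalization this entry announces, a toy version of the plain low rank + sparse decomposition it extends, via alternating singular-value and entrywise soft thresholding; the thresholds and data below are illustrative, not the paper's multi-scale method.

```python
# Toy low rank + sparse split: alternate the prox of the nuclear norm (SVT)
# and the prox of the l1 norm (soft thresholding) on the residual.
import numpy as np

def svt(M, tau):
    # Singular value thresholding: shrink singular values by tau.
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U * np.maximum(s - tau, 0)) @ Vt

def soft(M, tau):
    # Entrywise soft thresholding for the sparse part.
    return np.sign(M) * np.maximum(np.abs(M) - tau, 0)

rng = np.random.default_rng(0)
Y = rng.standard_normal((50, 5)) @ rng.standard_normal((5, 50))  # low rank
Y[rng.random(Y.shape) < 0.05] += 10.0                            # sparse spikes

L, S = np.zeros_like(Y), np.zeros_like(Y)
for _ in range(100):                 # block coordinate descent
    L = svt(Y - S, tau=1.0)
    S = soft(Y - L, tau=1.0)
print(np.linalg.matrix_rank(L, tol=1e-6))
```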

Multi-resolution Low-rank Tensor Formats [PDF]

open access: yes · SIAM Journal on Matrix Analysis and Applications, 2020
We describe a simple, black-box compression format for tensors with a multiscale structure. By representing the tensor as a sum of compressed tensors defined on increasingly coarse grids, we capture low-rank structures on each grid-scale, and we show how ...
Karaman, Sertac, Mickelin, Oscar
core   +2 more sources
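A hedged sketch of the sum-over-scales idea the abstract describes, in two dimensions: take truncated SVDs of the residual on increasingly coarse grids, map each term back to the fine grid, and accumulate. The factor-of-2 coarsening and the fixed rank are illustrative choices, not the paper's format.

```python
# Sketch: approximate a fine-grid array by a sum of low-rank terms computed
# on the residual at increasingly coarse grids, then mapped back and added.
import numpy as np

def coarsen(A):
    # Average 2x2 blocks (assumes even dimensions).
    m, n = A.shape
    return A.reshape(m // 2, 2, n // 2, 2).mean(axis=(1, 3))

def refine(A):
    # Nearest-neighbor upsampling back toward the fine grid.
    return np.repeat(np.repeat(A, 2, axis=0), 2, axis=1)

def lowrank(A, r):
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r]

rng = np.random.default_rng(0)
A = rng.standard_normal((64, 64))
approx, residual = np.zeros_like(A), A.copy()
for level in range(3):
    B = residual
    for _ in range(level):           # move to the grid of this scale
        B = coarsen(B)
    term = lowrank(B, r=2)
    for _ in range(level):           # map the compressed term back
        term = refine(term)
    approx += term
    residual = A - approx
print(np.linalg.norm(residual) / np.linalg.norm(A))
```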

Low Rank Regularization: A review [PDF]

open access: yes · Neural Networks, 2021
Low rank regularization, in essence, involves introducing a low rank or approximately low rank assumption on the matrix we aim to learn, and it has achieved great success in many fields including machine learning, data mining, and computer vision. Over the last decade, much progress has been made in both theory and practical applications.
Zhanxuan Hu   +3 more
openaire   +3 more sources
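A minimal sketch of the workhorse instance of the assumption this review covers: nuclear-norm-regularized matrix completion solved by proximal gradient, where the prox step is singular value thresholding. The rank, sampling rate, and threshold are illustrative.

```python
# Nuclear-norm-regularized matrix completion by proximal gradient descent;
# the prox of lambda*||X||_* is singular value thresholding (SVT).
import numpy as np

def svt(M, tau):
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U * np.maximum(s - tau, 0)) @ Vt

rng = np.random.default_rng(0)
M = rng.standard_normal((30, 4)) @ rng.standard_normal((4, 30))  # rank 4
mask = rng.random(M.shape) < 0.5         # half the entries observed

X = np.zeros_like(M)
for _ in range(200):
    grad = mask * (X - M)                # gradient of the data-fit term
    X = svt(X - grad, tau=0.1)           # prox step enforcing low rank
print(round(np.linalg.norm(X - M) / np.linalg.norm(M), 3))
```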

Low-rank Parareal: a low-rank parallel-in-time integrator

open access: yes · BIT Numerical Mathematics, 2023
In this work, the Parareal algorithm is applied to evolution problems that admit good low-rank approximations and for which dynamical low-rank approximation (DLRA) can be used as a time stepper. Many discrete integrators for DLRA have recently been proposed, based on splitting the projected vector field or on applying projected Runge–Kutta ...
Carrel, Benjamin   +2 more
openaire   +5 more sources
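For readers new to the base method, a hedged sketch of plain (full-rank, scalar) Parareal, the parallel-in-time iteration the paper combines with DLRA; the propagators and step counts are toy choices, not the paper's setting.

```python
# Plain Parareal for u' = lam*u: a cheap coarse propagator G corrects the
# expensive fine propagator F across time windows; fine solves parallelize.
import numpy as np

lam = -1.0

def euler(u, dt, nsteps):
    for _ in range(nsteps):
        u = u + dt * lam * u
    return u

T, N = 2.0, 10                         # horizon, number of time windows
DT = T / N
G = lambda u: euler(u, DT, 1)          # coarse: one step per window
F = lambda u: euler(u, DT / 100, 100)  # fine: 100 steps per window

U = np.zeros(N + 1)
U[0] = 1.0
for n in range(N):                     # initial coarse sweep
    U[n + 1] = G(U[n])
for k in range(5):                     # Parareal corrections
    Fu = [F(U[n]) for n in range(N)]   # parallel in a real implementation
    V = U.copy()
    for n in range(N):
        V[n + 1] = G(V[n]) + Fu[n] - G(U[n])
    U = V
print(U[-1], np.exp(lam * T))          # converges toward the exact value
```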

Low rank phase retrieval [PDF]

open access: yes · 2017 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2017
To appear in IEEE Trans.
Vaswani, Namrata   +2 more
openaire   +2 more sources

Bayesian low-rank adaptation for large language models [PDF]

open access: yes · International Conference on Learning Representations, 2023
Low-rank adaptation (LoRA) has emerged as a new paradigm for cost-efficient fine-tuning of large language models (LLMs). However, fine-tuned LLMs often become overconfident, especially when fine-tuned on small datasets.
Adam X. Yang   +3 more
semanticscholar   +1 more source
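Since the entry assumes familiarity with LoRA, a minimal sketch of the adapter itself (not the paper's Bayesian treatment): a frozen weight plus a trainable low-rank update scaled by alpha/r, with the up-projection initialized to zero so fine-tuning starts exactly at the pretrained model. Shapes and scaling follow the common LoRA convention.

```python
# Generic LoRA forward pass: frozen W plus trainable low-rank update B @ A,
# scaled by alpha/r. B starts at zero, so training begins at the base model.
import numpy as np

d_in, d_out, r, alpha = 64, 64, 4, 8
rng = np.random.default_rng(0)
W = rng.standard_normal((d_out, d_in))      # frozen pretrained weight
A = 0.01 * rng.standard_normal((r, d_in))   # trainable down-projection
B = np.zeros((d_out, r))                    # trainable up-projection

def lora_forward(x):
    return W @ x + (alpha / r) * (B @ (A @ x))

x = rng.standard_normal(d_in)
print(np.allclose(lora_forward(x), W @ x))  # True at initialization
```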

Low‐rank magnetic resonance fingerprinting [PDF]

open access: yes · Medical Physics, 2016
Purpose: Magnetic resonance fingerprinting (MRF) is a relatively new approach that provides quantitative MRI measures using randomized acquisition. Extraction of physical quantitative tissue parameters is performed offline, without the need for patient presence, based on acquisition with varying parameters and a dictionary generated according to the Bloch ...
Mazor, Gal   +3 more
openaire   +5 more sources
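A hedged sketch of the generic MRF dictionary-matching step the abstract alludes to: each measured signal evolution is matched to the dictionary atom with the largest normalized inner product. A random dictionary stands in for Bloch-simulated atoms here; this is not the paper's low-rank reconstruction.

```python
# Generic MRF matching: pick the dictionary atom with the largest normalized
# inner product against the measured signal evolution.
import numpy as np

rng = np.random.default_rng(0)
n_t, n_atoms = 200, 500
D = rng.standard_normal((n_atoms, n_t))          # stand-in for Bloch sims
D /= np.linalg.norm(D, axis=1, keepdims=True)    # unit-norm atoms

true_idx = 123
signal = D[true_idx] + 0.05 * rng.standard_normal(n_t)  # noisy measurement
match = int(np.argmax(np.abs(D @ signal)))
print(match == true_idx)  # True
```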

From low-rank retractions to dynamical low-rank approximation and back. [PDF]

open access: yes · BIT Numer Math
In algorithms for solving optimization problems constrained to a smooth manifold, retractions are a well-established tool to ensure that the iterates stay on the manifold. More recently, it has been demonstrated that retractions are a useful concept for other computational tasks on manifolds as well, including interpolation tasks.
Séguin A, Ceruti G, Kressner D.
europepmc   +5 more sources
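A minimal sketch of the connection the title points to: the simplest retraction-based integrator for DLRA takes an explicit Euler step and retracts back to the rank-r manifold by truncated SVD. The toy dynamics and step size are illustrative assumptions.

```python
# Truncated-SVD retraction as a DLRA time stepper: explicit Euler step on
# toy linear dynamics Y' = A @ Y, then retract back to the rank-r manifold.
import numpy as np

def truncate(Y, r):
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r]

rng = np.random.default_rng(0)
r, h = 5, 0.01
A = -np.eye(40) + 0.01 * rng.standard_normal((40, 40))
Y = truncate(rng.standard_normal((40, 40)), r)

for _ in range(100):
    Y = truncate(Y + h * (A @ Y), r)       # step, then retract
print(np.linalg.matrix_rank(Y, tol=1e-8))  # stays 5
```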

Delta-LoRA: Fine-Tuning High-Rank Parameters with the Delta of Low-Rank Matrices [PDF]

open access: yes · arXiv.org, 2023
In this paper, we present Delta-LoRA, a novel parameter-efficient approach to fine-tuning large language models (LLMs). In contrast to LoRA and other low-rank adaptation methods such as AdaLoRA, Delta-LoRA not only updates the low-rank matrices ...
Bojia Zi   +5 more
semanticscholar   +1 more source
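The snippet cuts off, so going only by the title: a hedged sketch of what "fine-tuning high-rank parameters with the delta of low-rank matrices" plausibly means, where the full weight is additionally updated by the step-to-step change of the low-rank product. Every name, shape, scale, and the random stand-in update below is illustrative, not the paper's algorithm.

```python
# Sketch of the "delta" idea: besides updating the low-rank factors A and B,
# nudge the full weight W by the change in their product between steps.
# The random factor updates stand in for real gradient steps.
import numpy as np

rng = np.random.default_rng(0)
d, r, scale = 32, 4, 2.0
W = rng.standard_normal((d, d))                  # full (high-rank) weight
A = 0.01 * rng.standard_normal((d, r))
B = np.zeros((r, d))

for step in range(10):
    AB_old = A @ B
    A = A - 0.01 * rng.standard_normal(A.shape)  # stand-in optimizer step
    B = B - 0.01 * rng.standard_normal(B.shape)
    W = W + scale * (A @ B - AB_old)             # propagate the delta into W
print(W.shape)
```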
