Results 281 to 290 of about 13,997,719 (325)
Some of the following articles may not be open access.
Spatial-Spectral Structured Sparse Low-Rank Representation for Hyperspectral Image Super-Resolution
IEEE Transactions on Image Processing, 2021
Hyperspectral image super-resolution by fusing high-resolution multispectral image (HR-MSI) and low-resolution hyperspectral image (LR-HSI) aims at reconstructing high resolution spatial-spectral information of the scene. Existing methods mostly based on ...
Jize Xue +5 more
semanticscholar +1 more source
Robust Low-Rank Tensor Recovery with Rectification and Alignment
IEEE Transactions on Pattern Analysis and Machine Intelligence, 2021
Low-rank tensor recovery in the presence of sparse but arbitrary errors is an important problem with many practical applications. In this work, we propose a general framework that recovers low-rank tensors, in which the data can be deformed by some ...
Xiaoqin Zhang +3 more
semanticscholar +1 more source
LoRA-GA: Low-Rank Adaptation with Gradient Approximation
Neural Information Processing Systems
Fine-tuning large-scale pretrained models is prohibitively expensive in terms of computational and memory costs. LoRA, as one of the most popular Parameter-Efficient Fine-Tuning (PEFT) methods, offers a cost-effective alternative by fine-tuning an ...
Shaowen Wang, Linxi Yu, Jian Li
semanticscholar +1 more source
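Several results in this listing concern LoRA-style low-rank adaptation. As a generic sketch of the shared idea (illustrative code, not taken from any of the listed papers), LoRA freezes a pretrained weight W and trains only a low-rank correction, so the effective weight becomes W + (alpha/r) * B @ A:

```python
import numpy as np

rng = np.random.default_rng(0)
d_out, d_in, r, alpha = 64, 32, 4, 8  # illustrative sizes; r << min(d_out, d_in)

W = rng.standard_normal((d_out, d_in))     # frozen pretrained weight
A = 0.01 * rng.standard_normal((r, d_in))  # trainable down-projection
B = np.zeros((d_out, r))                   # trainable up-projection, zero-init

def lora_forward(x):
    # Effective weight is W + (alpha / r) * B @ A; only A and B are trained.
    return x @ (W + (alpha / r) * (B @ A)).T

x = rng.standard_normal((1, d_in))
# With B initialized to zero the adapter is a no-op, so the adapted model
# reproduces the frozen model exactly at the start of fine-tuning.
print(np.allclose(lora_forward(x), x @ W.T))  # True
```

Only r * (d_in + d_out) = 384 adapter parameters are trained here, versus d_out * d_in = 2048 for the full matrix, which is the memory saving the LoRA-family entries above and below exploit.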
Flora: Low-Rank Adapters Are Secretly Gradient Compressors
International Conference on Machine Learning
Despite large neural networks demonstrating remarkable abilities to complete different tasks, they require excessive memory usage to store the optimization states for training.
Yongchang Hao, Yanshuai Cao, Lili Mou
semanticscholar +1 more source
Asymmetry in Low-Rank Adapters of Foundation Models
International Conference on Machine Learning
Parameter-efficient fine-tuning optimizes large, pre-trained foundation models by updating a subset of parameters; in this class, Low-Rank Adaptation (LoRA) is particularly effective.
Jiacheng Zhu +8 more
semanticscholar +1 more source
Low Rank Approximation: Algorithms, Implementation, Applications
2012
Matrix low-rank approximation is intimately related to data modelling, a problem that arises frequently in many different fields. Low Rank Approximation: Algorithms, Implementation, Applications is a comprehensive exposition of the theory, algorithms, and applications of structured low-rank approximation.
openaire +2 more sources
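As a small illustration of this book's subject (a generic NumPy sketch, not material from the book itself), the closest rank-k matrix to a given matrix in the Frobenius norm is obtained by truncating its SVD, per the Eckart-Young theorem:

```python
import numpy as np

rng = np.random.default_rng(1)
# A nearly rank-3 matrix: rank-3 signal plus small noise.
M = rng.standard_normal((30, 3)) @ rng.standard_normal((3, 20))
M += 0.01 * rng.standard_normal((30, 20))

def best_rank_k(X, k):
    # Truncated SVD: keep the k largest singular triplets. By the
    # Eckart-Young theorem this is the closest rank-k matrix to X
    # in both the Frobenius and spectral norms.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U[:, :k] * s[:k]) @ Vt[:k]

M3 = best_rank_k(M, 3)
# The rank-3 approximation captures nearly all of M's energy; the
# residual is on the order of the added noise.
rel_err = np.linalg.norm(M - M3) / np.linalg.norm(M)
print(np.linalg.matrix_rank(M3), rel_err < 0.05)
```

Structured low-rank approximation, the book's focus, additionally constrains the approximant (e.g. to be Hankel or Toeplitz), which plain truncated SVD does not handle.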
Structure-Constrained Low-Rank Representation
IEEE Transactions on Neural Networks and Learning Systems, 2014
Benefiting from its effectiveness in subspace segmentation, low-rank representation (LRR) and its variations have many applications in computer vision and pattern recognition, such as motion segmentation, image segmentation, saliency detection, and semisupervised learning.
Kewei Tang +3 more
openaire +2 more sources
Selective Aggregation for Low-Rank Adaptation in Federated Learning
International Conference on Learning Representations
We investigate LoRA in federated learning through the lens of the asymmetry analysis of the learned $A$ and $B$ matrices. In doing so, we uncover that $A$ matrices are responsible for learning general knowledge, while $B$ matrices focus on capturing ...
Pengxin Guo +5 more
semanticscholar +1 more source
LoRA-Pro: Are Low-Rank Adapters Properly Optimized?
International Conference on Learning Representations
Low-rank adaptation, also known as LoRA, has emerged as a prominent method for parameter-efficient fine-tuning of foundation models. Despite its computational efficiency, LoRA still yields inferior performance compared to full fine-tuning. In this paper, ...
Zhengbo Wang, Jian Liang
semanticscholar +1 more source
Low-rank revealing QR factorizations
Numerical Linear Algebra with Applications, 1994
Abstract: Rank revealing factorizations are used extensively in signal processing in connection with, for example, linear prediction and signal subspace algorithms. We present an algorithm for computing rank revealing QR factorizations of low-rank matrices.
Tony F. Chan, Per Christian Hansen
openaire +1 more source
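The idea behind rank-revealing QR can be sketched with a greedy column-pivoted Gram-Schmidt factorization; this is a generic illustration, not the specific algorithm of the paper above. The diagonal of R decays under pivoting, so the numerical rank is the number of diagonal entries above a tolerance:

```python
import numpy as np

def pivoted_qr_rank(A, tol=1e-8):
    # Column-pivoted QR via modified Gram-Schmidt (Businger-Golub style
    # pivoting): at each step, pick the remaining column with the largest
    # residual norm. |R[k, k]| then decays, and the numerical rank is the
    # number of diagonal entries exceeding tol.
    A = np.array(A, dtype=float)
    m, n = A.shape
    rdiag = np.zeros(n)
    for k in range(n):
        p = k + int(np.argmax(np.linalg.norm(A[:, k:], axis=0)))
        A[:, [k, p]] = A[:, [p, k]]  # bring the pivot column forward
        rdiag[k] = np.linalg.norm(A[:, k])
        if rdiag[k] <= tol:
            break  # all remaining columns lie in the span found so far
        q = A[:, k] / rdiag[k]
        A[:, k + 1:] -= np.outer(q, q @ A[:, k + 1:])  # deflate the rest
    return int(np.sum(rdiag > tol))

rng = np.random.default_rng(2)
M = rng.standard_normal((20, 3)) @ rng.standard_normal((3, 15))  # rank 3
print(pivoted_qr_rank(M))  # 3
```

For production use, a Householder-based pivoted QR (e.g. `scipy.linalg.qr` with `pivoting=True`) is numerically more robust than this Gram-Schmidt sketch.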

