Results 281 to 290 of about 12,418,444
Some of the following articles may not be open access.
Assessing the Brittleness of Safety Alignment via Pruning and Low-Rank Modifications
International Conference on Machine Learning. Large language models (LLMs) show inherent brittleness in their safety mechanisms, as evidenced by their susceptibility to jailbreaking and even non-malicious fine-tuning. This study explores this brittleness of safety alignment by leveraging pruning and ...
Boyi Wei +8 more
semanticscholar +1 more source
Robust Low-Rank Tensor Recovery with Rectification and Alignment
IEEE Transactions on Pattern Analysis and Machine Intelligence, 2021. Low-rank tensor recovery in the presence of sparse but arbitrary errors is an important problem with many practical applications. In this work, we propose a general framework that recovers low-rank tensors, in which the data can be deformed by some ...
Xiaoqin Zhang +3 more
semanticscholar +1 more source
Low-Rank Multilinear Filtering
Digital Signal Processing. Linear filtering methods are well-known and have been successfully applied to system identification and equalization problems. However, when high-dimensional systems are modeled, these methods often perform unsatisfactorily due to their slow convergence and to the high number ...
Maryam Dehghan +2 more
openaire +2 more sources
A rank-adaptive robust integrator for dynamical low-rank approximation
BIT Numerical Mathematics, 2021. A rank-adaptive integrator for the dynamical low-rank approximation of matrix and tensor differential equations is presented. The fixed-rank integrator recently proposed by two of the authors is extended to allow for an adaptive choice of the rank, using ...
Gianluca Ceruti, J. Kusch, C. Lubich
semanticscholar +1 more source
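The entry above concerns choosing the approximation rank adaptively rather than fixing it in advance. One common rank-selection rule (a simple sketch, not the paper's integrator, which is more involved) truncates the SVD at the smallest rank whose discarded singular values fall below a tolerance:

```python
import numpy as np

def truncation_rank(s, tol):
    """Smallest rank r such that the discarded singular values
    satisfy ||s[r:]||_2 <= tol (keep at least rank 1)."""
    tail = np.sqrt(np.cumsum(s[::-1] ** 2))[::-1]  # tail[r] = ||s[r:]||_2
    for r, t in enumerate(tail):
        if t <= tol:
            return max(r, 1)
    return len(s)

# Demo: a matrix of exact rank 2 is detected as rank 2.
rng = np.random.default_rng(0)
Y = rng.standard_normal((20, 2)) @ rng.standard_normal((2, 30))
s = np.linalg.svd(Y, compute_uv=False)
print(truncation_rank(s, tol=1e-8))  # -> 2
```

The tolerance trades accuracy for storage: a looser `tol` discards more singular values and yields a smaller rank.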
Flora: Low-Rank Adapters Are Secretly Gradient Compressors
International Conference on Machine Learning. Despite large neural networks demonstrating remarkable abilities to complete different tasks, they require excessive memory usage to store the optimization states for training.
Yongchang Hao, Yanshuai Cao, Lili Mou
semanticscholar +1 more source
SVDQuant: Absorbing Outliers by Low-Rank Components for 4-Bit Diffusion Models
arXiv.org. Diffusion models can effectively generate high-quality images. However, as they scale, rising memory demands and higher latency pose substantial deployment challenges.
Muyang Li +9 more
semanticscholar +1 more source
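The split that the SVDQuant entry alludes to can be sketched in a toy form: keep a low-rank part of the weight in full precision and quantize only the residual, which then has a smaller dynamic range. This is only an illustration of the general idea under simplified assumptions, not the paper's actual method:

```python
import numpy as np

def lowrank_plus_quant(W, r=4, bits=4):
    """Toy sketch: keep a rank-r part of W in full precision and
    quantize the residual to `bits`-bit integers. Shows why a
    low-rank branch shrinks the quantization range; the real
    SVDQuant pipeline is more involved."""
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    L = (U[:, :r] * s[:r]) @ Vt[:r, :]       # low-rank branch, full precision
    R = W - L                                # residual to be quantized
    qmax = 2 ** (bits - 1) - 1
    scale = max(np.abs(R).max() / qmax, 1e-12)
    Rq = np.clip(np.round(R / scale), -qmax - 1, qmax)
    return L, Rq.astype(np.int8), scale

rng = np.random.default_rng(0)
W = rng.standard_normal((64, 64))
L, Rq, scale = lowrank_plus_quant(W)
err = np.abs(W - (L + Rq * scale)).max()
# Rounding error is at most half a quantization step.
assert err <= scale / 2 + 1e-12
```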
LoRA-GA: Low-Rank Adaptation with Gradient Approximation
Neural Information Processing Systems. Fine-tuning large-scale pretrained models is prohibitively expensive in terms of computational and memory costs. LoRA, as one of the most popular Parameter-Efficient Fine-Tuning (PEFT) methods, offers a cost-effective alternative by fine-tuning an ...
Shaowen Wang, Linxi Yu, Jian Li
semanticscholar +1 more source
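Several entries here build on LoRA, which replaces a full weight update with two small trainable factors. A minimal sketch of the idea, with illustrative dimensions that are not taken from any of the papers above:

```python
import numpy as np

# Minimal LoRA-style forward pass: the frozen weight W is augmented
# by a rank-r product B @ A; only A and B would be trained.
d_in, d_out, r, alpha = 64, 32, 4, 8
rng = np.random.default_rng(0)

W = rng.standard_normal((d_out, d_in))     # frozen pretrained weight
A = 0.01 * rng.standard_normal((r, d_in))  # trainable down-projection
B = np.zeros((d_out, r))                   # trainable up-projection, zero init

def lora_forward(x):
    return W @ x + (alpha / r) * (B @ (A @ x))

x = rng.standard_normal(d_in)
# With B initialized to zero, the adapter starts as an exact no-op.
assert np.allclose(lora_forward(x), W @ x)
```

The trainable parameter count is `r * (d_in + d_out)` instead of `d_in * d_out`, which is the source of LoRA's memory savings.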
Asymmetry in Low-Rank Adapters of Foundation Models
International Conference on Machine Learning. Parameter-efficient fine-tuning optimizes large, pre-trained foundation models by updating a subset of parameters; in this class, Low-Rank Adaptation (LoRA) is particularly effective.
Jiacheng Zhu +8 more
semanticscholar +1 more source
Low Rank Approximation: Algorithms, Implementation, Applications
2012. Matrix low-rank approximation is intimately related to data modelling, a problem that arises frequently in many different fields. Low Rank Approximation: Algorithms, Implementation, Applications is a comprehensive exposition of the theory, algorithms, and applications of structured low-rank approximation.
openaire +2 more sources
Structure-Constrained Low-Rank Representation
IEEE Transactions on Neural Networks and Learning Systems, 2014. Benefiting from its effectiveness in subspace segmentation, low-rank representation (LRR) and its variations have many applications in computer vision and pattern recognition, such as motion segmentation, image segmentation, saliency detection, and semisupervised learning.
Kewei Tang +3 more
openaire +2 more sources

