Results 21 to 30 of about 459,246 (134)
A Computationally Efficient Limited Memory CMA-ES for Large Scale Optimization
We propose a computationally efficient limited memory Covariance Matrix Adaptation Evolution Strategy for large scale optimization, which we call the LM-CMA-ES.
Hansen, N. +3 more
core +1 more source
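The key trick in a limited-memory CMA-ES is to never materialize the full n x n covariance matrix. A hedged sketch of that idea (not the authors' exact algorithm), assuming a covariance model built from m stored direction vectors:

```python
import numpy as np

# Illustrative limited-memory covariance model (not the exact LM-CMA-ES
# update): approximate C = I + V^T diag(w) V using m stored direction
# vectors, so memory is O(m*n) instead of O(n^2) for the full matrix.
rng = np.random.default_rng(0)
n, m = 200, 5                        # problem dimension, memory size
V = rng.standard_normal((m, n))      # m stored evolution-path vectors
w = np.full(m, 0.1)                  # per-vector weights (illustrative values)

def cov_mvp(x):
    """Compute C @ x without ever forming the n x n matrix C."""
    return x + V.T @ (w * (V @ x))

# Check against the explicit matrix, feasible at this small n.
C = np.eye(n) + V.T @ np.diag(w) @ V
x = rng.standard_normal(n)
assert np.allclose(cov_mvp(x), C @ x)
```

For n in the tens of thousands the implicit product is the only practical option, which is the point of the limited-memory variant.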
Exploiting persymmetry for low-rank Space Time Adaptive Processing [PDF]
Publication in the conference proceedings of EUSIPCO, Bucharest, Romania ...
Ginolhac, Guillaume +3 more
openaire +3 more sources
Robust language recognition via adaptive language factor extraction [PDF]
This paper presents a technique to adapt an acoustically based language classifier to the background conditions and speaker accents. This adaptation improves language classification on a broad spectrum of TV broadcasts. The core of the system consists of ...
Demuynck, Kris +2 more
core +1 more source
An Evaluation of Deep CNN Baselines for Scene-Independent Person Re-Identification
In recent years, a variety of proposed methods based on deep convolutional neural networks (CNNs) have improved the state of the art for large-scale person re-identification (ReID). While a large number of optimizations and network improvements have been ...
Jamieson, Michael +2 more
core +1 more source
WaRA: Wavelet Low Rank Adaptation
Parameter-efficient fine-tuning (PEFT) has gained widespread adoption across various applications. Among PEFT techniques, Low-Rank Adaptation (LoRA) and its extensions have emerged as particularly effective, allowing efficient model adaptation while significantly reducing computational overhead. However, existing approaches typically rely on global low- ...
Heidari, Moein +4 more
openaire +2 more sources
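For context on the LoRA family these extensions build on, a minimal generic sketch (the wavelet-domain WaRA variant itself is not reproduced here): a frozen weight W is adapted as W + B @ A, with only the two small factors trained.

```python
import numpy as np

# Generic LoRA sketch: freeze W (d x k), train only B (d x r) and A (r x k).
rng = np.random.default_rng(0)
d, k, r = 256, 256, 8
W = rng.standard_normal((d, k))          # frozen pretrained weight
A = rng.standard_normal((r, k)) * 0.01   # small random init
B = np.zeros((d, r))                     # zero init: adaptation starts as a no-op

def forward(x):
    return x @ (W + B @ A).T

x = rng.standard_normal((3, k))
assert np.allclose(forward(x), x @ W.T)  # with B = 0, behavior is unchanged

# Trainable parameters: r*(d+k) = 4096 instead of d*k = 65536.
```

The zero init on B is the standard LoRA choice: the adapted model starts exactly at the pretrained one, and the low-rank update grows from there during fine-tuning.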
A Bayesian Interpretation of Adaptive Low-Rank Adaptation
Motivated by the sensitivity-based importance score of the adaptive low-rank adaptation (AdaLoRA), we utilize more theoretically supported metrics, including the signal-to-noise ratio (SNR), along with the Improved Variational Online Newton (IVON) optimizer, for adaptive parameter budget allocation.
Chen, Haolin, Garner, Philip N.
openaire +2 more sources
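A hedged sketch of what an SNR-based importance score can look like, in the spirit of the abstract (the exact AdaLoRA/IVON formulas may differ): score each low-rank component by mean^2 / variance under an approximate Gaussian posterior and give the parameter budget to the highest-SNR components.

```python
import numpy as np

# Illustrative SNR-based budget allocation (simplified, not the paper's
# exact metric): components whose parameters have high posterior
# mean^2 / variance are kept; low-SNR components are pruned.
rng = np.random.default_rng(0)
mean = rng.standard_normal((4, 16))       # posterior means for 4 components
var = rng.uniform(0.01, 1.0, (4, 16))     # posterior variances (e.g. from a
                                          # variational optimizer like IVON)

snr = (mean**2 / var).mean(axis=1)        # per-component signal-to-noise ratio
keep = np.argsort(snr)[::-1][:2]          # budget goes to the top-2 components
```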
Adaptive Low-Rank Kernel Subspace Clustering
In this paper, we present a kernel subspace clustering method that can handle non-linear models. In contrast to recent kernel subspace clustering methods which use predefined kernels, we propose to learn a low-rank kernel matrix, with which mapped data in feature space are not only low-rank but also self-expressive.
Ji, Pan +4 more
openaire +2 more sources
The Expressive Power of Low-Rank Adaptation
40 pages, 5 ...
Zeng, Yuchen, Lee, Kangwook
openaire +2 more sources
Norm-Bounded Low-Rank Adaptation
In this work, we propose norm-bounded low-rank adaptation (NB-LoRA) for parameter-efficient fine-tuning. NB-LoRA is a novel parameterization of low-rank weight adaptations that admits explicit bounds on each singular value of the adaptation matrix, which can thereby satisfy any prescribed unitarily invariant norm bound, including the Schatten norms (e ...
Wang, Ruigang +2 more
openaire +2 more sources
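To see what a singular-value bound on a low-rank update means in practice, an illustrative post-hoc projection (not the NB-LoRA parameterization itself, which enforces the bound by construction): clip the singular values of B @ A so the spectral norm stays within a prescribed bound.

```python
import numpy as np

# Post-hoc singular-value clipping of a low-rank update B @ A.
rng = np.random.default_rng(0)
B = rng.standard_normal((64, 4))
A = rng.standard_normal((4, 32))
bound = 1.0

U, s, Vt = np.linalg.svd(B @ A, full_matrices=False)
delta = (U * np.minimum(s, bound)) @ Vt   # every singular value now <= bound

assert np.linalg.svd(delta, compute_uv=False).max() <= bound + 1e-8
```

A built-in parameterization, as the abstract describes, avoids this extra SVD per step and keeps the bound satisfied throughout training rather than only after projection.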