Results 1 to 10 of about 12,418,444

Low-Rank Gradient Descent

open access: yesIEEE Open Journal of Control Systems, 2023
Several recent empirical studies demonstrate that important machine learning tasks, such as training deep neural networks, exhibit a low-rank structure, where most of the variation in the loss function occurs in only a few directions of the input space ...
Romain Cosson   +4 more
doaj   +2 more sources
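
The observation that loss variation concentrates in a few directions can be illustrated with a small sketch: project the gradient onto its top-r singular directions before taking a step. This is a generic low-rank-gradient illustration in NumPy, not the algorithm analyzed in the paper; the function name, rank r, and sizes are made up for the example.

    import numpy as np

    def low_rank_step(W, grad, r=4, lr=1e-2):
        # Keep only the r largest singular directions of the gradient matrix.
        U, s, Vt = np.linalg.svd(grad, full_matrices=False)
        grad_lr = (U[:, :r] * s[:r]) @ Vt[:r, :]
        return W - lr * grad_lr

    rng = np.random.default_rng(0)
    W = rng.standard_normal((64, 64))
    # A nearly low-rank gradient, mimicking variation confined to a few directions.
    grad = rng.standard_normal((64, 8)) @ rng.standard_normal((8, 64))
    W = low_rank_step(W, grad)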

Neural oscillation in low-rank SNNs: bridging network dynamics and cognitive function [PDF]

open access: yesFrontiers in Computational Neuroscience
Neural oscillations, particularly gamma oscillations, are fundamental to cognitive processes such as attention, perception, and decision-making. Experimental studies have shown that the phase of gamma oscillation modulates neuronal response selectivity ...
Bin Li   +5 more
doaj   +2 more sources

Mix-of-Show: Decentralized Low-Rank Adaptation for Multi-Concept Customization of Diffusion Models [PDF]

open access: yesNeural Information Processing Systems, 2023
Public large-scale text-to-image diffusion models, such as Stable Diffusion, have gained significant attention from the community. These models can be easily customized for new concepts using low-rank adaptations (LoRAs).
Yuchao Gu   +12 more
semanticscholar   +1 more source
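
For readers unfamiliar with LoRA, here is a minimal sketch of a LoRA-augmented linear layer in PyTorch: the pretrained weight stays frozen and only two small factors A and B are trained. This is the generic LoRA construction, not Mix-of-Show's decentralized variant; the class name, rank, and scaling are illustrative.

    import torch
    import torch.nn as nn

    class LoRALinear(nn.Module):
        def __init__(self, d_in, d_out, r=4, alpha=4.0):
            super().__init__()
            self.base = nn.Linear(d_in, d_out, bias=False)
            self.base.weight.requires_grad_(False)              # frozen pretrained weight
            self.A = nn.Parameter(torch.randn(r, d_in) * 0.01)  # down-projection
            self.B = nn.Parameter(torch.zeros(d_out, r))        # zero init: no change at start
            self.scale = alpha / r

        def forward(self, x):
            return self.base(x) + self.scale * (x @ self.A.T) @ self.B.T

    layer = LoRALinear(768, 768)
    y = layer(torch.randn(2, 768))   # same output shape as the frozen layer alone

Because only A and B carry the new concept, a customization can be stored and shared as a small pair of matrices and merged into the frozen weight afterwards.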

Macerals of lignite and the effect of alkali treatment on the structure and combustion performance of lignite

open access: yesMeitan xuebao, 2023
Suppressing the spontaneous combustion of lignite is of great significance for its safe transportation and efficient utilization. Taking Shengli lignite as the research object, two different macerals, inertinite and huminite, were selected by ...
Yanjun WANG   +7 more
doaj   +1 more source

LoRA-FA: Memory-efficient Low-rank Adaptation for Large Language Models Fine-tuning [PDF]

open access: yesarXiv.org, 2023
The low-rank adaptation (LoRA) method can greatly reduce the number of trainable parameters for fine-tuning large language models (LLMs); however, it still requires expensive activation memory to update the low-rank weights.
Longteng Zhang   +4 more
semanticscholar   +1 more source
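
A hedged sketch of the memory point the abstract raises, under the assumption (suggested by the name LoRA-FA) that the down-projection A is frozen and only B is trained: the backward pass for B then only needs the small r-dimensional activation a = x A^T rather than the full input x.

    import torch

    d, r = 768, 4
    W = torch.randn(d, d)                      # frozen pretrained weight (no grad)
    A = torch.randn(r, d) * 0.01               # frozen down-projection (no grad)
    B = torch.zeros(d, r, requires_grad=True)  # the only trainable tensor

    x = torch.randn(2, d)
    a = x @ A.T               # shape (2, r): the only activation B's gradient needs
    y = x @ W.T + a @ B.T
    y.sum().backward()        # fills B.grad; W and A receive no gradients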

QA-LoRA: Quantization-Aware Low-Rank Adaptation of Large Language Models [PDF]

open access: yesInternational Conference on Learning Representations, 2023
Recent years have witnessed the rapid development of large language models (LLMs). Despite their strong ability in many language-understanding tasks, the heavy computational burden largely restricts the application of LLMs, especially when one needs to ...
Yuhui Xu   +8 more
semanticscholar   +1 more source

DyLoRA: Parameter-Efficient Tuning of Pre-trained Models using Dynamic Search-Free Low-Rank Adaptation [PDF]

open access: yesConference of the European Chapter of the Association for Computational Linguistics, 2022
With the ever-growing size of pretrained models (PMs), fine-tuning them has become more expensive and resource-hungry. As a remedy, low-rank adapters (LoRA) keep the main pretrained weights of the model frozen and just introduce some learnable truncated ...
Mojtaba Valipour   +3 more
semanticscholar   +1 more source
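
One way to read "dynamic, search-free" is that the adapter is trained to remain usable at many ranks at once. The sketch below samples a rank each step and uses only the leading slices of the LoRA factors; this is an illustrative reconstruction of that idea, not the paper's exact training procedure, and the loss and sizes are placeholders.

    import random
    import torch
    import torch.nn as nn

    d, r_max = 512, 8
    W = torch.randn(d, d)                          # frozen pretrained weight
    A = nn.Parameter(torch.randn(r_max, d) * 0.01)
    B = nn.Parameter(torch.zeros(d, r_max))
    opt = torch.optim.SGD([A, B], lr=1e-3)

    x = torch.randn(4, d)
    for step in range(10):
        b = random.randint(1, r_max)               # rank sampled for this step
        y = x @ W.T + (x @ A[:b].T) @ B[:, :b].T   # only the leading b components are active
        loss = y.pow(2).mean()                     # placeholder loss
        opt.zero_grad()
        loss.backward()
        opt.step()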

Robust Recovery of Subspace Structures by Low-Rank Representation [PDF]

open access: yesIEEE Transactions on Pattern Analysis and Machine Intelligence, 2010
In this paper, we address the subspace clustering problem. Given a set of data samples (vectors) approximately drawn from a union of multiple subspaces, our goal is to cluster the samples into their respective subspaces and remove possible outliers as ...
Guangcan Liu   +5 more
semanticscholar   +1 more source
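
In the noise-free special case, the low-rank representation program min ||Z||_* subject to X = XZ has the closed-form solution Z = V V^T, where X = U S V^T is the skinny SVD of the data; |Z| can then serve as an affinity for spectral clustering. A small NumPy sketch of that special case (scikit-learn's SpectralClustering is assumed available; the synthetic data are two independent 2-D subspaces):

    import numpy as np
    from sklearn.cluster import SpectralClustering   # assumption: scikit-learn is installed

    rng = np.random.default_rng(0)
    # Two independent 2-D subspaces in R^20, 30 samples each (columns of X).
    X = np.hstack([rng.standard_normal((20, 2)) @ rng.standard_normal((2, 30))
                   for _ in range(2)])

    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    r = int((s > 1e-8).sum())           # numerical rank
    Z = Vt[:r].T @ Vt[:r]               # shape interaction matrix V V^T
    affinity = np.abs(Z) + np.abs(Z).T
    labels = SpectralClustering(n_clusters=2, affinity="precomputed",
                                random_state=0).fit_predict(affinity)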

LoSparse: Structured Compression of Large Language Models based on Low-Rank and Sparse Approximation [PDF]

open access: yesInternational Conference on Machine Learning, 2023
Transformer models have achieved remarkable results in various natural language tasks, but they are often prohibitively large, requiring massive memories and computational resources. To reduce the size and complexity of these models, we propose LoSparse (
Yixiao Li   +6 more
semanticscholar   +1 more source
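
The title points to approximating each weight matrix as a low-rank part plus a sparse part. A hedged NumPy sketch of that decomposition uses a truncated SVD for the low-rank part and magnitude thresholding of the residual for the sparse part; it illustrates the decomposition only, not LoSparse's training procedure, and the rank and sparsity fraction are arbitrary.

    import numpy as np

    def low_rank_plus_sparse(W, r=8, keep_frac=0.05):
        U, s, Vt = np.linalg.svd(W, full_matrices=False)
        L = (U[:, :r] * s[:r]) @ Vt[:r, :]             # rank-r component
        R = W - L
        thresh = np.quantile(np.abs(R), 1.0 - keep_frac)
        S = np.where(np.abs(R) >= thresh, R, 0.0)      # sparse residual
        return L, S

    rng = np.random.default_rng(0)
    W = rng.standard_normal((256, 256))
    L, S = low_rank_plus_sparse(W)
    err = np.linalg.norm(W - (L + S)) / np.linalg.norm(W)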

Research Progress on Electro-Chemical Oxidation of Low-Rank Coal to Humic Acid

open access: yes南方能源建设, 2022
[Introduction] With the vigorous development of renewable energy, the clean and efficient utilization of low-rank coal not only improves resource utilization and economic value but also has great social significance.
Ruzhan BAI   +6 more
doaj   +1 more source
