Results 71 to 80 of about 227,550 (285)

Activation-Guided Low-Rank Parameter Adaptation for Efficient Model Fine-Tuning

open access: yes (IEEE Access)
Fine-tuning large language models is computationally expensive, and while existing parameter-efficient methods like Low-Rank Adaptation (LoRA) reduce computational costs, they are limited by suboptimal initialization strategies.
Qingchen Wang, Shengyu Shen
doaj   +1 more source
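The snippet above refers to Low-Rank Adaptation (LoRA), which freezes the pretrained weights and trains only a low-rank additive update. A minimal sketch of that idea, assuming a frozen weight matrix `W`, a trainable factorization `B @ A`, and an illustrative scaling factor `alpha` (dimensions and hyperparameters here are assumptions, not taken from the paper):

```python
import numpy as np

d_out, d_in, r = 8, 8, 2          # rank r is much smaller than d_out, d_in
rng = np.random.default_rng(0)

W = rng.standard_normal((d_out, d_in))      # frozen pretrained weights
A = rng.standard_normal((r, d_in)) * 0.01   # trainable down-projection
B = np.zeros((d_out, r))                    # trainable up-projection, zero-initialized
alpha = 1.0                                 # LoRA scaling hyperparameter

def lora_forward(x):
    # Effective weight is W + (alpha / r) * B @ A; only A and B are trained.
    return W @ x + (alpha / r) * (B @ (A @ x))

x = rng.standard_normal(d_in)
# With B zero-initialized, the adapted model starts identical to the base model.
assert np.allclose(lora_forward(x), W @ x)

# The trainable parameter count r*(d_in + d_out) is far below W's d_out*d_in.
print(A.size + B.size, "trainable vs", W.size, "frozen")
```

The zero-initialization of `B` is the standard LoRA choice, and it is exactly the kind of initialization decision that initialization-focused work like the entry above sets out to improve.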

Parameter-Efficient Continual Fine-Tuning: A Survey

open access: yes
The emergence of large pre-trained networks has revolutionized the AI field, unlocking new possibilities and achieving unprecedented performance. However, these models inherit a fundamental limitation from traditional Machine Learning approaches: their strong dependence on the \textit{i.i.d.} assumption hinders their adaptability to dynamic learning ...
Coleman, Eric Nuertey   +6 more
openaire   +2 more sources

Gradient Inversion Attacks on Parameter-Efficient Fine-Tuning

open access: yes (2025 IEEE/CVF Conference on Computer Vision and Pattern Recognition, CVPR 2025)
Sami, Hasin Us   +4 more
openaire   +2 more sources

What Do Large Language Models Know About Materials?

open access: yes (Advanced Engineering Materials, EarlyView)
If large language models (LLMs) are to be used inside the material discovery and engineering process, they must be benchmarked for the accuracy of their intrinsic material knowledge. The current work introduces 1) a reasoning process through the processing–structure–property–performance chain and 2) a tool for benchmarking knowledge of LLMs concerning ...
Adrian Ehrenhofer   +2 more
wiley   +1 more source

Swin-TUNA: A Novel PEFT Approach for Accurate Food Image Segmentation

open access: yes (IEEE Access)
In food image processing, parameter-efficient semantic segmentation is important for high-performance applications under constrained training resources.
Haotian Chen, Zhiyong Xiao
doaj   +1 more source

Quantum-PEFT: Ultra parameter-efficient fine-tuning

open access: yes
ICLR ...
Koike-Akino, Toshiaki   +5 more
openaire   +2 more sources

A Workflow to Accelerate Microstructure‐Sensitive Fatigue Life Predictions

open access: yes (Advanced Engineering Materials, EarlyView)
This study introduces a workflow to accelerate predictions of microstructure‐sensitive fatigue life. Results from frameworks with varying levels of simplification are benchmarked against published reference results. The analysis reveals a trade‐off between accuracy and model complexity, offering researchers a practical guide for selecting the optimal ...
Luca Loiodice   +2 more
wiley   +1 more source

The Status of Neutralino Dark Matter

open access: yes, 2013
The lightest neutralino in supersymmetry is the most studied dark matter candidate. This writeup reviews the status of neutralino dark matter in minimal and nonminimal supersymmetric models in light of recent null results from the XENON100 experiment and ...
Shakya, Bibhushan
core   +1 more source

Elinvar Materials: Recent Progress and Challenges

open access: yes (Advanced Engineering Materials, EarlyView)
Elinvar materials, exhibiting temperature‐invariant elastic modulus, are critical for precision instruments and emerging technologies. This article reviews recent progress in the field, with a focus on the anomalous thermoelastic behavior observed in key material systems.
Wenjie Li, Yang Ren
wiley   +1 more source

Symbiotic Tuning: A Simple Approach for Enhancing Task Performance of Side-Tuning

open access: yes (IEEE Access)
Reducing computational and memory overhead in fine-tuning large language models remains a significant challenge in natural language processing. While parameter-efficient fine-tuning (PEFT) methods, such as LoRA, have gained attention for reducing ...
Zhi-Quan Feng   +3 more
doaj   +1 more source
