Results 61 to 70 of about 226,203
Expectations for Inflationary Observables: Simple or Natural?
We describe the general inflationary dynamics that can arise with a single, canonically coupled field whose inflaton potential is a fourth-order polynomial. This scenario yields a wide range of combinations of the empirical spectral observables, $n_s$, ...
Easther, Richard, Musoke, Nathan
core +1 more source
ABSTRACT Objective Peripheral neuropathies contribute to patient disability but may be diagnosed late or missed altogether due to late referral, limitations of current diagnostic methods, and a lack of specialized testing facilities. To address this clinical gap, we developed NeuropathAI, an interpretable deep learning–based multiclass classification ...
Chaima Ben Rabah +7 more
wiley +1 more source
Parameter-Efficient Fine-Tuning of Large Pretrained Models for Instance Segmentation Tasks
Research and applications in artificial intelligence have recently shifted with the rise of large pretrained models, which deliver state-of-the-art results across numerous tasks.
Nermeen Abou Baker +2 more
doaj +1 more source
Fine-Tune Smarter, Not Harder: Parameter-Efficient Fine-Tuning for Geospatial Foundation Models
Earth observation (EO) is crucial for monitoring environmental changes, responding to disasters, and managing natural resources. In this context, foundation models facilitate remote sensing image analysis to retrieve relevant geoinformation accurately and efficiently.
Marti-Escofet, Francesc +4 more
openaire +2 more sources
ABSTRACT Objective Accurate localization of epileptogenic tubers (ETs) in patients with tuberous sclerosis complex (TSC) is essential but challenging, as these tubers lack distinct pathological or genetic markers to differentiate them from other cortical tubers.
Tinghong Liu +11 more
wiley +1 more source
Efficient Side-Tuning for Remote Sensing: A Low-Memory Fine-Tuning Framework
Fine-tuning pretrained models for remote sensing tasks often demands substantial computational resources. To reduce memory requirements and training costs, this article proposes a low-memory fine-tuning framework, called efficient side-tuning (EST), for ...
Haichen Yu +6 more
doaj +1 more source
A Robust Adaptive One‐Sample‐Ahead Preview Super‐Twisting Sliding Mode Controller
ABSTRACT This article introduces a discrete‐time robust adaptive one‐sample‐ahead preview super‐twisting sliding mode controller. A stability analysis of the controller by Lyapunov criteria is developed to demonstrate its robustness in handling both ...
Guilherme Vieira Hollweg +5 more
wiley +1 more source
Activation-Guided Low-Rank Parameter Adaptation for Efficient Model Fine-Tuning
Fine-tuning large language models is computationally expensive, and while existing parameter-efficient methods like Low-Rank Adaptation (LoRA) reduce computational costs, they are limited by suboptimal initialization strategies.
Qingchen Wang, Shengyu Shen
doaj +1 more source
Parameter-Efficient Fine-Tuning for Foundation Models
25 pages, 6 figures, 7 ...
Zhang, Dan +5 more
openaire +2 more sources
SVFT: Parameter-Efficient Fine-Tuning with Singular Vectors
Popular parameter-efficient fine-tuning (PEFT) methods, such as LoRA and its variants, freeze pre-trained model weights $W$ and inject learnable matrices $\Delta W$. These $\Delta W$ matrices are structured for efficient parameterization, often using techniques like low-rank approximations or scaling vectors.
Lingam, Vijay +9 more
openaire +2 more sources
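The LoRA mechanism described in the abstract above (freeze the pretrained weight $W$, train only a low-rank update $\Delta W$) can be sketched in a few lines. This is a minimal illustrative example using NumPy, not the API of any specific PEFT library; all names and dimensions are assumptions chosen for the sketch.

```python
import numpy as np

# Minimal LoRA-style sketch: the pretrained weight W is frozen;
# only the low-rank factors A and B are trainable.
rng = np.random.default_rng(0)

d_out, d_in, r = 8, 8, 2      # layer dimensions and adapter rank (illustrative)
alpha = 4.0                   # LoRA scaling hyperparameter

W = rng.standard_normal((d_out, d_in))      # frozen pretrained weight
A = rng.standard_normal((r, d_in)) * 0.01   # trainable down-projection
B = np.zeros((d_out, r))                    # trainable up-projection, zero-init

def forward(x):
    # Effective weight is W + (alpha / r) * B @ A. Because B starts at zero,
    # the adapted model is exactly the pretrained model before any training.
    delta_W = (alpha / r) * (B @ A)
    return (W + delta_W) @ x

x = rng.standard_normal(d_in)
assert np.allclose(forward(x), W @ x)  # B = 0, so delta_W contributes nothing yet

# Trainable parameter count: r * (d_in + d_out) instead of d_in * d_out.
trainable, full = A.size + B.size, W.size
```

The parameter saving is the point of the method: at rank `r = 2` on an 8x8 layer the adapter holds 32 trainable values versus 64 in the full weight, and the gap widens quickly at realistic layer sizes.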

