Results 51 to 60 of about 227,550 (285)

Naturalness of Neutralino Dark Matter

open access: yes, 2013
We investigate the level of fine-tuning of neutralino Dark Matter below 200 GeV in the low-energy phenomenological minimal supersymmetric Standard Model taking into account the newest results from XENON100 and the Large Hadron Collider as well as all ...
Grothaus, Philipp   +2 more
core   +1 more source

Parameter-Efficient Fine-Tuning With Adapters

open access: yes
In language model fine-tuning, traditional approaches such as Domain-Adaptive Pretraining (DAPT) and Task-Adaptive Pretraining (TAPT), although effective, are computationally intensive. This research introduces a novel adaptation method that uses the UniPELT framework as a base and adds a PromptTuning layer, which significantly reduces ...
Chen, Keyu, Pang, Yuan, Yang, Zi
openaire   +2 more sources
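The bottleneck-adapter idea underlying frameworks like UniPELT can be sketched in a few lines. The helper names, shapes, and weights below are illustrative assumptions, not taken from the paper:

```python
# Hedged sketch of a bottleneck adapter: a small down-project / nonlinearity /
# up-project module inserted into a frozen backbone, with a residual connection.
# All names and dimensions here are illustrative, not from the paper.

def matvec(M, v):
    """Plain-Python matrix-vector product (rows of M dotted with v)."""
    return [sum(m * x for m, x in zip(row, v)) for row in M]

def adapter(h, W_down, W_up):
    """Down-project h, apply ReLU, up-project, and add the residual."""
    z = [max(0.0, v) for v in matvec(W_down, h)]
    return [x + u for x, u in zip(h, matvec(W_up, z))]

# Toy example: hidden size 4, bottleneck size 2.
h = [1.0, -1.0, 2.0, 0.5]
W_down = [[1.0, 0.0, 0.0, 0.0],
          [0.0, 0.0, 1.0, 0.0]]      # 2 x 4 down-projection
W_up   = [[0.1, 0.0],
          [0.0, 0.1],
          [0.1, 0.0],
          [0.0, 0.1]]                # 4 x 2 up-projection
print(adapter(h, W_down, W_up))
```

Only the small `W_down`/`W_up` matrices would be trained; the backbone producing `h` stays frozen, which is where the parameter efficiency comes from.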

Predicting Epileptogenic Tubers in Patients With Tuberous Sclerosis Complex Using a Fusion Model Integrating Lesion Network Mapping and Machine Learning

open access: yes, Annals of Clinical and Translational Neurology, EarlyView.
ABSTRACT Objective Accurate localization of epileptogenic tubers (ETs) in patients with tuberous sclerosis complex (TSC) is essential but challenging, as these tubers lack distinct pathological or genetic markers to differentiate them from other cortical tubers.
Tinghong Liu   +11 more
wiley   +1 more source

Strong Baselines for Parameter-Efficient Few-Shot Fine-Tuning

open access: yes, Proceedings of the AAAI Conference on Artificial Intelligence
Few-shot classification (FSC) entails learning novel classes given only a few examples per class after a pre-training (or meta-training) phase on a set of base classes. Recent works have shown that simply fine-tuning a pre-trained Vision Transformer (ViT) on new test classes is a strong approach for FSC. Fine-tuning ViTs, however, is expensive in time, ...
Basu, Samyadeep   +3 more
openaire   +2 more sources

A Systematic Comparison of Alpha‐Synuclein Seed Amplification Assays for Increasing Reproducibility

open access: yes, Annals of Clinical and Translational Neurology, EarlyView.
ABSTRACT Seed amplification assays (SAAs) enable ultrasensitive detection of misfolded α‐synuclein across biofluids and tissues. Yet, heterogeneity in protocols limits cross‐study comparability and clinical translation. Here, we review α‐synuclein SAA methods and their performance across various biological matrices.
Manuela Amaral‐do‐Nascimento   +3 more
wiley   +1 more source

PackNet: Adding Multiple Tasks to a Single Network by Iterative Pruning

open access: yes, 2018
This paper presents a method for adding multiple tasks to a single deep neural network while avoiding catastrophic forgetting. Inspired by network pruning techniques, we exploit redundancies in large deep networks to free up parameters that can then be ...
Lazebnik, Svetlana, Mallya, Arun
core   +1 more source
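The iterative-pruning step that PackNet uses to free up parameters can be sketched as follows; the flat weight list, the magnitude criterion, and the pruning fraction are illustrative assumptions:

```python
# Hedged sketch of PackNet-style iterative pruning: after training a task,
# the smallest-magnitude fraction of a layer's weights is freed (masked out)
# so those parameters can be retrained for the next task without disturbing
# the weights kept for earlier tasks.

def prune_mask(weights, frac):
    """Return a 0/1 mask that frees the smallest-magnitude `frac` of weights."""
    k = int(len(weights) * frac)
    order = sorted(range(len(weights)), key=lambda i: abs(weights[i]))
    freed = set(order[:k])                 # indices released for the next task
    return [0 if i in freed else 1 for i in range(len(weights))]

# Toy example: free half of a 4-weight layer.
w = [0.1, -2.0, 0.05, 1.5]
print(prune_mask(w, 0.5))  # → [0, 1, 0, 1]
```

Weights with mask value 1 stay fixed for the current task, which is how the method avoids catastrophic forgetting across tasks.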

A Robust Adaptive One‐Sample‐Ahead Preview Super‐Twisting Sliding Mode Controller

open access: yes, International Journal of Adaptive Control and Signal Processing, EarlyView.
Block Diagram of the Robust Adaptive One‐Sample‐Ahead Preview Super‐Twisting Sliding Mode Controller. ABSTRACT This article introduces a discrete‐time robust adaptive one‐sample‐ahead preview super‐twisting sliding mode controller. A stability analysis of the controller by Lyapunov criteria is developed to demonstrate its robustness in handling both ...
Guilherme Vieira Hollweg   +5 more
wiley   +1 more source

FiLM: Visual Reasoning with a General Conditioning Layer

open access: yes, 2017
We introduce a general-purpose conditioning method for neural networks called FiLM: Feature-wise Linear Modulation. FiLM layers influence neural network computation via a simple, feature-wise affine transformation based on conditioning information.
Courville, Aaron   +4 more
core   +1 more source
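The feature-wise affine transformation at the heart of FiLM is simple enough to sketch directly; the toy shapes below are illustrative, and in the actual method gamma and beta would be produced by a conditioning network:

```python
# Minimal sketch of a FiLM (Feature-wise Linear Modulation) step: each feature
# channel c is scaled by gamma_c and shifted by beta_c.

def film(features, gamma, beta):
    """Apply a feature-wise affine transformation: gamma_c * x_c + beta_c."""
    return [[g * x + b for x, g, b in zip(row, gamma, beta)]
            for row in features]

# Toy example: 2 spatial positions, 3 feature channels.
feats = [[1.0, 2.0, 3.0],
         [4.0, 5.0, 6.0]]
gamma = [2.0, 0.5, 1.0]   # per-channel scale from the conditioning input
beta  = [0.0, 1.0, -1.0]  # per-channel shift from the conditioning input
print(film(feats, gamma, beta))  # → [[2.0, 2.0, 2.0], [8.0, 3.5, 5.0]]
```

Because the same gamma and beta apply across all spatial positions of a channel, the conditioning is cheap while still letting the conditioning input steer the computation.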

Bistable Mechanisms 3D Printing for Mechanically Programmable Vibration Control

open access: yes, Advanced Engineering Materials, EarlyView.
This work introduces a 3D‐printed bistable mechanism integrated into tuned mass dampers (TMDs) for mechanically adaptive passive vibration suppression. Through optimized geometry, the bistable design provides adaptable vibration reduction across a broad range of scenarios, achieving effective vibration mitigation without complex controls or external ...
Ali Zolfagharian   +4 more
wiley   +1 more source

Comparison between parameter-efficient techniques and full fine-tuning: A case study on multilingual news article classification.

open access: yes, PLoS ONE
Adapters and Low-Rank Adaptation (LoRA) are parameter-efficient fine-tuning techniques designed to make the training of language models more efficient. Previous results demonstrated that these methods can even improve performance on some classification ...
Olesya Razuvayevskaya   +7 more
doaj   +1 more source
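The core Low-Rank Adaptation (LoRA) idea mentioned above can be sketched as a frozen weight matrix plus a trainable rank-r product; the shapes, scale factor, and helper names below are illustrative assumptions:

```python
# Hedged sketch of LoRA: a frozen weight matrix W is augmented by a trainable
# low-rank product B @ A, so training touches r * (d_in + d_out) parameters
# instead of d_in * d_out.

def matvec(M, v):
    """Plain-Python matrix-vector product (rows of M dotted with v)."""
    return [sum(m * x for m, x in zip(row, v)) for row in M]

def lora_forward(x, W, A, B, scale=1.0):
    """Compute W @ x plus the low-rank update scale * B @ (A @ x)."""
    base = matvec(W, x)                 # frozen path
    delta = matvec(B, matvec(A, x))     # trainable low-rank path
    return [b + scale * d for b, d in zip(base, delta)]

# Toy example: d_in = d_out = 2, rank r = 1.
W = [[1.0, 0.0], [0.0, 1.0]]   # frozen 2 x 2 weight (identity here)
A = [[1.0, 1.0]]               # 1 x 2 down-projection
B = [[0.5], [0.5]]             # 2 x 1 up-projection
print(lora_forward([1.0, 2.0], W, A, B))  # → [2.5, 3.5]
```

An adapter instead inserts a small trained module after the frozen layer, whereas LoRA reparameterizes the layer's own weight update; both leave the backbone untouched, which is the common thread of the comparison in this paper.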
