Results 91 to 100 of about 226,203
Whisper is a transformer-based multilingual model that has demonstrated state-of-the-art performance in numerous languages. However, efficiency remains a challenge under limited computational resources.
Hadia Arshad +5 more
doaj +1 more source
DeepMarks: A Digital Fingerprinting Framework for Deep Neural Networks [PDF]
This paper proposes DeepMarks, a novel end-to-end framework for systematic fingerprinting in the context of Deep Learning (DL). Remarkable progress has been made in the area of deep learning.
Chen, Huili +2 more
core +1 more source
Parameter-Efficient Fine-Tuning of State Space Models
Accepted at ICML 2025.
Galim, Kevin +4 more
openaire +2 more sources
In this study, the interplay of dipolar dynamics and ionic charge transport in MOF compounds is investigated. Synthesizing the novel structure CFA‐25 with integrated freely rotating dipolar groups, local and macroscopic effects, including interactions with Cs cations are explored.
Ralph Freund +6 more
wiley +1 more source
Parameter efficient vs full fine-tuning for building children’s myopia prediction models
Background and objective: The prevalence of myopia is increasing globally, with projections suggesting that by 2050, half of the population could be affected and 10% may experience high myopia.
Elena Ros-Sánchez +6 more
doaj +1 more source
MambaPEFT: Exploring Parameter-Efficient Fine-Tuning for Mamba
An ecosystem of Transformer-based models has been established by building large models with extensive data. Parameter-efficient fine-tuning (PEFT) is a crucial technology for deploying these models to downstream tasks with minimal cost while achieving effective performance. Recently, Mamba, a State Space Model (SSM)-based model, has attracted attention ...
Yoshimura, Masakazu +2 more
openaire +2 more sources
This work discusses the use of blended channel materials in OECTs. It explores how mixing glycolated and alkoxylated polymers in various ratios offers a simpler and more efficient route to tuning OECT properties. The performance of the polymer blends is compared to the corresponding copolymers, demonstrating similar OECT characteristics, swelling ...
Lize Bynens +14 more
wiley +1 more source
Fine-tuning protein language models boosts predictions across diverse tasks
Prediction methods inputting embeddings from protein language models have reached or even surpassed state-of-the-art performance on many protein prediction tasks.
Robert Schmirler +2 more
doaj +1 more source
Analysis of Whisper Automatic Speech Recognition Performance on Low Resource Language
Implementing Automatic Speech Recognition technology in daily life could offer convenience to its users. However, the speech that current ASR models can recognize accurately is in languages considered high-resource, like English.
Riefkyanov Surya Adia Pratama +1 more
doaj +1 more source
This work explores Li‐substituted P2 layered oxides for Na‐ion batteries by crystallographic and electrochemical studies. The effect of lithium on superstructure orderings, on phase transitions during synthesis and electrochemical cycling and on the interplay of O‐ versus TM‐redox is revealed via various advanced techniques, including semi‐simultaneous ...
Mingfeng Xu +5 more
wiley +1 more source