Results 41 to 50 of about 227,550
DARTS-ASR: Differentiable Architecture Search for Multilingual Speech Recognition and Adaptation
In previous works, only the parameter weights of ASR models were optimized under a fixed-topology architecture. However, the design of a successful model architecture has always relied on human experience and intuition.
Chen, Yi-Chen +3 more
core +1 more source
SLoRA: Federated Parameter Efficient Fine-Tuning of Language Models
Transfer learning via fine-tuning pre-trained transformer models has gained significant success in delivering state-of-the-art results across various NLP tasks. In the absence of centralized data, Federated Learning (FL) can benefit from distributed and private data of the FL edge clients for fine-tuning.
Babakniya, Sara +6 more
openaire +2 more sources
Tandem VHHs targeting distinct EGFR epitopes were engineered into a monovalent bispecific antibody (7D12‐EGA1‐Fc) with more potent ADCC without increasing affinity to EGFR. Structural modeling of 7D12‐EGA1‐Fc showed cross‐linking of separate EGFR domains to enhance CD16a engagement on NK cells.
Yuqiang Xu +5 more
wiley +1 more source
Scalable Compression of Deep Neural Networks
Deep neural networks generally involve some layers with millions of parameters, making them difficult to deploy and update on devices with limited resources such as mobile phones and other smart embedded systems.
Chen W. +7 more
core +1 more source
GIST: Improving Parameter Efficient Fine Tuning via Knowledge Interaction
17 pages, 8 figures, 22 tables, Work in ...
Ruan, Jiacheng +6 more
openaire +2 more sources
CRISPRi‐mediated gene silencing and phenotypic exploration in nontuberculous mycobacteria. In this Research Protocol, we describe approaches to control, monitor, and quantitatively assess CRISPRi‐mediated gene silencing in the model organisms M. smegmatis and M. abscessus.
Vanessa Point +7 more
wiley +1 more source
Parameter-Efficient Sparsity for Large Language Models Fine-Tuning
With the dramatic increase in the number of parameters in language models, sparsity methods have received ever-increasing research attention as a way to compress and accelerate these models. While most research focuses on how to accurately retain appropriate weights while maintaining the performance of the compressed model, there are challenges in the computational ...
Li, Yuchao +6 more
openaire +2 more sources
ABSTRACT Objective Peripheral neuropathies contribute to patient disability but may be diagnosed late or missed altogether due to late referral, limitations of current diagnostic methods, and a lack of specialized testing facilities. To address this clinical gap, we developed NeuropathAI, an interpretable deep learning–based multiclass classification ...
Chaima Ben Rabah +7 more
wiley +1 more source
Large language models for PHM: a review of optimization techniques and applications
The rapid advancement of Large Language Models (LLMs) has created unprecedented opportunities for industrial automation, process optimization, and decision support systems.
Tingyi Yu +5 more
doaj +1 more source
Improving Tire Pattern Recognition Using Parameter-Efficient Fine-Tuning Techniques
Tire-tread classification plays a key role in forensic investigation and public safety. This work introduces a robust, efficient recognition system that integrates Discrete Wavelet Transform (DWT) with Weighted Local Gray-Level on Robust Local Binary ...
Parkpoom Chaisiriprasert +1 more
doaj +1 more source

