Results 41 to 50 of about 226,203 (262)
Disordered but rhythmic—the role of intrinsic protein disorder in eukaryotic circadian timing
Unstructured domains known as intrinsically disordered regions (IDRs) are present in nearly every part of the eukaryotic core circadian oscillator. IDRs enable many diverse inter‐ and intramolecular interactions that support clock function. IDR conformations are highly tunable by post‐translational modifications and environmental conditions, which ...
Emery T. Usher, Jacqueline F. Pelham
wiley +1 more source
Introduction: Generating physician letters is a time-consuming task in daily clinical practice. Methods: This study investigates local fine-tuning of large language models (LLMs), specifically LLaMA models, for physician letter generation in a privacy ...
Yihao Hou +40 more
doaj +1 more source
Time after time – circadian clocks through the lens of oscillator theory
Oscillator theory bridges physics and circadian biology. Damped oscillators require external drivers, while limit cycles emerge from delayed feedback and nonlinearities. Coupling enables tissue‐level coherence, and entrainment aligns internal clocks with environmental cues.
Marta del Olmo +2 more
wiley +1 more source
SLoRA: Federated Parameter Efficient Fine-Tuning of Language Models
Transfer learning via fine-tuning pre-trained transformer models has gained significant success in delivering state-of-the-art results across various NLP tasks. In the absence of centralized data, Federated Learning (FL) can benefit from distributed and private data of the FL edge clients for fine-tuning.
Babakniya, Sara +6 more
openaire +2 more sources
We reconstituted Synechocystis glycogen synthesis in vitro from purified enzymes and showed that two GlgA isoenzymes produce glycogen with different architectures: GlgA1 yields denser, highly branched glycogen, whereas GlgA2 synthesizes longer, less‐branched chains.
Kenric Lee +3 more
wiley +1 more source
PackNet: Adding Multiple Tasks to a Single Network by Iterative Pruning
This paper presents a method for adding multiple tasks to a single deep neural network while avoiding catastrophic forgetting. Inspired by network pruning techniques, we exploit redundancies in large deep networks to free up parameters that can then be ...
Lazebnik, Svetlana, Mallya, Arun
core +1 more source
GIST: Improving Parameter Efficient Fine Tuning via Knowledge Interaction
17 pages, 8 figures, 22 tables, Work in ...
Ruan, Jiacheng +6 more
openaire +2 more sources
Potential therapeutic targeting of BKCa channels in glioblastoma treatment
This review summarizes current insights into the role of BKCa and mitoBKCa channels in glioblastoma biology, their potential classification as oncochannels, and the emerging pharmacological strategies targeting these channels, emphasizing the translational challenges in developing BKCa‐directed therapies for glioblastoma treatment.
Kamila Maliszewska‐Olejniczak +4 more
wiley +1 more source
Improving Tire Pattern Recognition Using Parameter-Efficient Fine-Tuning Techniques
Tire-tread classification plays a key role in forensic investigation and public safety. This work introduces a robust, efficient recognition system that integrates Discrete Wavelet Transform (DWT) with Weighted Local Gray-Level on Robust Local Binary ...
Parkpoom Chaisiriprasert +1 more
doaj +1 more source
Scalable Compression of Deep Neural Networks
Deep neural networks generally involve some layers with millions of parameters, making them difficult to deploy and update on devices with limited resources such as mobile phones and other smart embedded systems.
Chen W. +7 more
core +1 more source