Results 61 to 70 of about 247,556 (284)
Design Improvement of Permanent Magnet Motor Using Single- and Multi-Objective Approaches
Optimisation, or optimal design, has become a fundamental aspect of engineering across various domains, including power devices, power systems, and industrial systems.
Cvetkovski Goga, Petkovska Lidija
doaj +1 more source
Adaptive Attention Span in Transformers
We propose a novel self-attention mechanism that can learn its optimal attention span. This allows us to significantly extend the maximum context size used in Transformers while maintaining control over their memory footprint and computational time.
Bojanowski, Piotr +3 more
core +1 more source
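The abstract above describes attention that learns how far back it may look. As a rough illustration only (not the authors' implementation; the ramp length R, the clamp parameterisation, and all names here are assumptions), one way to realise a learnable span is a soft mask over query-key distances, sketched in NumPy:

import numpy as np

def soft_span_mask(distances, z, R=32):
    # Weights in [0, 1]: 1 for keys within the learned span z,
    # 0 beyond z + R, with a linear ramp of width R in between.
    return np.clip((R + z - distances) / R, 0.0, 1.0)

def masked_attention(scores, z, R=32):
    # scores: (T, T) raw attention scores; z: learned span parameter,
    # trained jointly with the model and regularised so that shorter
    # spans (and hence less memory and compute) are preferred.
    T = scores.shape[0]
    dist = np.arange(T)[:, None] - np.arange(T)[None, :]          # query minus key position
    mask = np.where(dist >= 0, soft_span_mask(dist, z, R), 0.0)   # causal + span mask
    scores = scores - scores.max(axis=-1, keepdims=True)          # numerical stability
    weights = np.exp(scores) * mask
    return weights / np.maximum(weights.sum(axis=-1, keepdims=True), 1e-9)

Because the mask is differentiable in z, the span can be learned rather than fixed, which is how long contexts can coexist with a bounded memory footprint, as the abstract claims.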
The COVID-19 pandemic placed the field of vaccinology squarely at the center of global consciousness, emphasizing the vital role of vaccines as transformative public health tools. The impact of vaccines was recently acknowledged by the award of the 2023 Nobel Prize in Physiology or Medicine to Katalin Karikó and Drew Weissman for their seminal ...
Rino Rappuoli +2 more
openaire +3 more sources
Appell Transformation and Canonical Transforms [PDF]
The interpretation of the optical Appell transformation, as previously elaborated in relation to free-space paraxial propagation under both rectangular and circular cylindrical symmetry, is reviewed. Then, the caloric Appell transformation, well known in the theory of the heat equation, is shown to be amenable to a similar interpretation involving ...
openaire +5 more sources
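For context on the snippet above, the caloric Appell transformation can be stated compactly; normalisation and sign conventions differ between references, so the following should be read as one commonly quoted form rather than the paper's own statement. If u(x, t) solves the one-dimensional heat equation u_t = u_{xx} (and is defined where it is evaluated), then for t > 0 so does

\[
(\mathcal{A}u)(x,t) \;=\; \frac{1}{\sqrt{4\pi t}}\, e^{-x^{2}/(4t)}\, u\!\left(\frac{x}{t},\, -\frac{1}{t}\right).
\]

The prefactor is the one-dimensional heat kernel itself.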
This perspective highlights emerging insights into how the circadian transcription factor CLOCK:BMAL1 regulates chromatin architecture, cooperates with other transcription factors, and coordinates enhancer dynamics. We propose an updated framework for how circadian transcription factors operate within dynamic and multifactorial chromatin landscapes ...
Xinyu Y. Nie, Jerome S. Menet
wiley +1 more source
This paper investigates the application of heuristic optimization techniques for weight optimization in eco-design distribution transformers, with a focus on hermetically sealed transformers.
Mohammad Hassan Hashemi, Ulas Kilic
doaj +1 more source
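The entry above names heuristic optimisation for transformer weight but gives no detail. Purely as an illustrative sketch (the design variables, the weight and loss proxies, and every constant below are hypothetical placeholders, not taken from the paper), a simulated-annealing loop is one common shape such a heuristic takes:

import math
import random

def weight_kg(core_area_cm2, turns):
    # Hypothetical proxy: core mass grows with cross-section, copper mass with turn count.
    return 3.0 * core_area_cm2 + 0.4 * turns

def losses_w(core_area_cm2, turns):
    # Hypothetical proxy: a smaller core or fewer turns raises the losses.
    return 5_000_000.0 / (core_area_cm2 * turns)

def objective(x, loss_limit_w=1200.0):
    # Weight plus a penalty for violating the loss constraint.
    return weight_kg(*x) + 10.0 * max(0.0, losses_w(*x) - loss_limit_w)

def simulated_annealing(x0, steps=5000, t0=50.0):
    x = best = x0
    for k in range(steps):
        temp = t0 * (1.0 - k / steps) + 1e-6
        cand = (max(1.0, x[0] + random.uniform(-2.0, 2.0)),
                max(1.0, x[1] + random.uniform(-5.0, 5.0)))
        delta = objective(cand) - objective(x)
        if delta < 0 or random.random() < math.exp(-delta / temp):
            x = cand
        if objective(x) < objective(best):
            best = x
    return best

area, turns = simulated_annealing((50.0, 200.0))
print(f"core area {area:.1f} cm2, turns {turns:.0f}, weight {weight_kg(area, turns):.1f} kg")

A real eco-design formulation would swap the toy proxies for standards-based loss and mass models and tighter design bounds; the heuristic search loop itself changes little.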
Language Modeling with Deep Transformers
We explore deep autoregressive Transformer models in language modeling for speech recognition. We focus on two aspects. First, we revisit Transformer model configurations specifically for language modeling. We show that well-configured Transformer models ...
Irie, Kazuki +3 more
core +1 more source
A Solid State Transformer model for power flow calculations [PDF]
This paper presents the implementation of a Solid State Transformer (SST) model in OpenDSS. The goal is to develop an SST model that could be useful for assessing the impact that the replacement of the conventional iron-and-copper transformer with the SST ...
Guerra Sánchez, Luis Gerardo +1 more
core +2 more sources
Real‐time assay of ribonucleotide reductase activity with a fluorescent RNA aptamer
Ribonucleotide reductases (RNR) synthesize DNA building blocks de novo, making them crucial in DNA replication and drug targeting. FLARE introduces the first single‐tube real‐time coupled RNR assay, which enables isothermal tracking of RNR activity at nanomolar enzyme levels and allows the reconstruction of allosteric regulatory patterns and rapid ...
Jacopo De Capitani +4 more
wiley +1 more source

