Results 91 to 100 of about 677,995
Improving Transformer-Based Neural Machine Translation with Prior Alignments
The Transformer is a neural machine translation model that has revolutionized machine translation. Compared with traditional statistical machine translation models and other neural machine translation models, the recently proposed Transformer model radically ...
Thien Nguyen +3 more
doaj +1 more source
End-to-end Tracking with a Multi-query Transformer [PDF]
Bruno Korbar, Andrew Zisserman
openalex +1 more source
Monad transformers as monoid transformers
Mauro Jaskelioff, Eugenio Moggi
openaire +1 more source
Structural biology of ferritin nanocages
Ferritin is a conserved iron‐storage protein that sequesters iron as a ferric mineral core within a nanocage, protecting cells from oxidative damage and maintaining iron homeostasis. This review discusses ferritin biology, structure, and function, and highlights recent cryo‐EM studies revealing mechanisms of ferritinophagy, cellular iron uptake, and ...
Eloise Mastrangelo, Flavio Di Pisa
wiley +1 more source
Abstractive Text Summarization for Resumes With Cutting Edge NLP Transformers and LSTM [PDF]
Öykü Berfin Mercan +3 more
openalex +1 more source
Study on thermal model for calculating transformer hot Spot temperature [PDF]
A power transformer is a static piece of apparatus with two or more windings which, by electromagnetic induction, transforms a system of alternating voltage and current into another system of voltage and current usually of different values and at same ...
Ramadan Dofan, Jamal Ali
core
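The snippet above describes the basic induction principle: a two-winding transformer converts one voltage/current system into another. For an ideal (lossless) transformer this reduces to the standard turns-ratio relation; a minimal sketch with illustrative numbers, not values from the paper:

```python
def ideal_transformer(v_primary: float, i_primary: float,
                      n_primary: int, n_secondary: int) -> tuple[float, float]:
    """Ideal (lossless) two-winding transformer: voltage scales with the
    turns ratio N_s/N_p, current scales inversely, so power is conserved."""
    ratio = n_secondary / n_primary
    return v_primary * ratio, i_primary / ratio

# Step 240 V down to 120 V with a 2:1 turns ratio (illustrative numbers)
v_s, i_s = ideal_transformer(240.0, 2.0, 100, 50)
```

A real hot-spot temperature model, as studied in the paper, adds loss and thermal terms on top of this idealization.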
Efficient Document Re-Ranking for Transformers by Precomputing Term Representations
Deep pretrained transformer networks are effective at various ranking tasks, such as question answering and ad-hoc document ranking. However, their computational expense makes them cost-prohibitive in practice.
Ophir Frieder +5 more
core +1 more source
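The efficiency idea named in the title, computing term representations once and reusing them at query time, can be sketched as a cache in front of an expensive encoder. The `term_representation` and `rerank_score` functions below are hypothetical stand-ins, not the paper's model:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def term_representation(term: str) -> tuple[float, ...]:
    # Stand-in for an expensive transformer forward pass; a real system
    # would encode each vocabulary term once offline and store the vectors.
    return tuple(float(ord(c)) for c in term.lower())

def rerank_score(query_terms: list[str], doc_terms: list[str]) -> float:
    # At query time, per-term vectors are cache hits, not model calls.
    q_vecs = [term_representation(t) for t in query_terms]
    d_vecs = [term_representation(t) for t in doc_terms]
    # Toy similarity: sum of dot products over same-length term pairs.
    return sum(sum(a * b for a, b in zip(q, d))
               for q in q_vecs for d in d_vecs if len(q) == len(d))
```

The point of the precomputation is that the cost of the encoder is paid per term, offline, rather than per query-document pair.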
Structural and biochemical characterisations show that the planar cell polarity (PCP) protein Inturned harbours a unique PDZ‐like domain that does not bind canonical PDZ‐binding motifs (PBMs) like that of another PCP protein Vangl2. In contrast, the apical‐basal polarity protein Scribble contains four PDZ domains that bind Vangl2, but one PDZ domain ...
Stephan Wilmes +4 more
wiley +1 more source
Unguarded Recursion on Coinductive Resumptions
We study a model of side-effecting processes obtained by starting from a monad modelling base effects and adjoining free operations using a cofree coalgebra construction; one thus arrives at what one may think of as types of non-wellfounded side ...
Sergey Goncharov +3 more
core +1 more source
Segatron: Segment-Aware Transformer for Language Modeling and Understanding
Transformers are powerful for sequence modeling. Nearly all state-of-the-art language models and pre-trained language models are based on the Transformer architecture. However, the Transformer distinguishes sequential tokens only by their position index.
He Bai +7 more
core +2 more sources

