Results 71 to 80 of about 592,624
Banded transformer cores [PDF]
A banded transformer core is formed by positioning a pair of mated, similar core halves on a supporting pedestal. The core halves are encircled with a strap that selectively applies tension, whereby a compressive force is applied to the core edge for reducing ...
Mclyman, C. W. T.
core +1 more source
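A worked relation, not from the patent itself, sketching why strap tension clamps the core: for a thin strap of tension T and width w wrapped around a core of outer radius r (all three symbols are illustrative assumptions), static equilibrium of a strap element gives the compressive pressure on the core edge.

```latex
% Hedged sketch: thin-band equilibrium, not the patent's own analysis.
% A strap element subtending d\theta carries tension T; its inward resultant
% T\,d\theta acts on the core-edge area (r\,d\theta)\,w, so
p = \frac{T\,d\theta}{(r\,d\theta)\,w} = \frac{T}{r\,w}.
```

Tightening the strap raises T and hence the compressive pressure p, which is the selective-tension clamping the abstract describes.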
ERBIN limits epithelial cell plasticity via suppression of TGF‐β signaling
In breast and lung cancer patients, low ERBIN expression correlates with poor clinical outcomes. Here, we show that ERBIN inhibits TGF‐β‐induced epithelial‐to‐mesenchymal transition in NMuMG breast and A549 lung adenocarcinoma cell lines. ERBIN suppresses TGF‐β/SMAD signaling and reduces TGF‐β‐induced ERK phosphorylation.
Chao Li+3 more
wiley +1 more source
This paper presents a scheme for calculating the inrush current of a high-voltage built-in high-impedance transformer, taking into account the nonlinear changes of remanence and excitation inductance.
RONG Chunyan+5 more
doaj +1 more source
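Why remanence matters for inrush can be seen from the textbook flux-linkage argument; this is a generic sketch, not the paper's calculation scheme, and φ_m (peak steady-state flux), φ_r (remanent flux), and α (energization angle) are assumed notation.

```latex
% Energizing v(t) = V_m \sin(\omega t + \alpha) on an unloaded winding gives
\phi(t) = \phi_r + \phi_m \cos\alpha - \phi_m \cos(\omega t + \alpha),
% so in the worst case (\alpha = 0, aiding remanence) the flux peaks near
\phi_{\max} \approx 2\phi_m + \phi_r .
```

Beyond the saturation knee this excess flux collapses the excitation inductance, which is exactly the nonlinear remanence/inductance coupling the paper models.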
Improving Transformer-Based Neural Machine Translation with Prior Alignments
The Transformer is a neural machine translation model that has revolutionized machine translation. Compared with traditional statistical machine translation models and other neural machine translation models, the recently proposed Transformer model radically ...
Thien Nguyen+3 more
doaj +1 more source
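One common way to inject prior alignments into a Transformer, shown here as an illustration rather than as this paper's exact method, is to add a supervision term that pulls a cross-attention head toward a given source-target alignment matrix. All names below are hypothetical.

```python
import numpy as np

def alignment_loss(attn, prior, eps=1e-9):
    """Cross entropy between a cross-attention head and a prior alignment.

    attn:  (tgt_len, src_len) attention weights, rows sum to 1.
    prior: (tgt_len, src_len) prior alignment, rows sum to 1
           (e.g. one-hot links from an external word aligner).
    """
    return float(-(prior * np.log(attn + eps)).sum() / attn.shape[0])

# Toy example: 2 target tokens over 3 source tokens.
attn = np.array([[0.7, 0.2, 0.1],
                 [0.1, 0.8, 0.1]])
prior = np.array([[1.0, 0.0, 0.0],   # target 0 aligns to source 0
                  [0.0, 1.0, 0.0]])  # target 1 aligns to source 1
print(alignment_loss(attn, prior))   # small when attention matches the prior
```

In training, such a term would be added to the usual translation loss with a weighting coefficient.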
Efficient pressure-transformer for fluids [PDF]
A fluid transformer utilizes fluid under pressure at one level to drive a series of free pistons in a positive-displacement pump. The pump in turn delivers hydraulic fluid at a different pressure level to a load.
Morando, J. A.
core +1 more source
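The pressure-level change follows from a force balance on each free piston; this is the generic intensifier relation, assuming a stepped piston with face areas A_1 and A_2, not dimensions from this design.

```latex
% Static force balance on a stepped free piston:
p_1 A_1 = p_2 A_2
\quad\Longrightarrow\quad
p_2 = p_1 \frac{A_1}{A_2},
```

so a smaller delivery face (A_2 < A_1) steps the pressure up, and a larger one steps it down.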
Character-level Transformer-based Neural Machine Translation
Neural machine translation (NMT) is nowadays commonly applied at the subword level, using byte-pair encoding. A promising alternative approach focuses on character-level translation, which simplifies processing pipelines in NMT considerably.
Banar, Nikolay+2 more
core +1 more source
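A minimal illustration of the pipeline simplification the snippet alludes to: character-level tokenization needs no learned merge table, at the cost of longer sequences. The toy helpers are an assumption, not the paper's code.

```python
def char_tokenize(text):
    """Character-level tokenization: no BPE merge table to learn or ship."""
    return list(text)

def detokenize(tokens):
    """The inverse is trivial, another pipeline simplification."""
    return "".join(tokens)

sent = "neural translation"
tokens = char_tokenize(sent)
print(len(tokens), tokens[:6])     # 18 ['n', 'e', 'u', 'r', 'a', 'l']
assert detokenize(tokens) == sent  # lossless round trip
```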
Knowing how proteases recognise preferred substrates facilitates matching proteases to applications. The S1′ pocket of protease EA1 directs cleavage to the N‐terminal side of hydrophobic residues, particularly leucine. The S1′ pocket of thermolysin differs from EA1's at only one position (leucine in place of phenylalanine), which decreases cleavage ...
Grant R. Broomfield+3 more
wiley +1 more source
Language Modeling with Deep Transformers
We explore deep autoregressive Transformer models in language modeling for speech recognition. We focus on two aspects. First, we revisit Transformer model configurations specifically for language modeling. We show that well-configured Transformer models ...
Irie, Kazuki+3 more
core +1 more source
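A minimal sketch of a deep autoregressive Transformer language model of the kind the abstract describes, assuming PyTorch; the layer sizes are illustrative, not the configurations studied in the paper.

```python
import torch
import torch.nn as nn

class TransformerLM(nn.Module):
    """Deep autoregressive Transformer LM; sizes are illustrative only.
    Positional encoding is omitted here for brevity."""

    def __init__(self, vocab_size, d_model=256, nhead=4, num_layers=6):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        self.out = nn.Linear(d_model, vocab_size)

    def forward(self, tokens):
        seq_len = tokens.size(1)
        # Causal mask: position i attends only to positions <= i.
        mask = torch.triu(
            torch.full((seq_len, seq_len), float("-inf")), diagonal=1
        )
        hidden = self.encoder(self.embed(tokens), mask=mask)
        return self.out(hidden)  # next-token logits at every position

model = TransformerLM(vocab_size=1000)
logits = model(torch.randint(0, 1000, (2, 16)))  # batch of 2, length 16
print(logits.shape)  # torch.Size([2, 16, 1000])
```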
Riesz transforms for Dunkl transform
In this paper we obtain the $L^p$-boundedness of Riesz transforms for the Dunkl transform for all ...
Mohamed Sifi, Béchir Amri
openaire +3 more sources
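For context, the Riesz transforms in question are the natural Dunkl analogues of the classical ones, with partial derivatives replaced by Dunkl operators; the following standard formulation uses assumed notation (T_j the Dunkl operators, Δ_k = Σ_j T_j² the Dunkl Laplacian, μ_k the Dunkl weight measure) rather than the paper's own.

```latex
% Dunkl Riesz transforms and the L^p bound they satisfy:
R_j = T_j\,(-\Delta_k)^{-1/2}, \qquad j = 1,\dots,d,
\qquad
\|R_j f\|_{L^p(\mu_k)} \le C_p\,\|f\|_{L^p(\mu_k)}, \quad 1 < p < \infty.
```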
In this work, we reveal how different enzyme binding configurations influence the fluorescence decay of NAD(P)H in live cells using time‐resolved anisotropy imaging and fluorescence lifetime imaging microscopy (FLIM). Mathematical modelling shows that the redox states of the NAD and NADP pools govern these configurations, shaping their fluorescence ...
Thomas S. Blacker+8 more
wiley +1 more source