Results 121 to 130 of about 947,558
Code Prediction by Feeding Trees to Transformers
We advance the state of the art in the accuracy of code prediction (next-token prediction) used in autocomplete systems. First, we report that the recently proposed Transformer architecture, even out of the box, outperforms previous neural and non ...
Chandra, Satish +3 more
core
ST-GAN: Spatial Transformer Generative Adversarial Networks for Image Compositing
We address the problem of finding realistic geometric corrections to a foreground object such that it appears natural when composited into a background image.
Lin, Chen-Hsuan +4 more
core +1 more source
Trainable Transformer in Transformer
Code base: https://github.com/abhishekpanigrahi1996 ...
Panigrahi, Abhishek +3 more
openaire +2 more sources
Bridging the gap: Multi‐stakeholder perspectives of molecular diagnostics in oncology
Although molecular diagnostics is transforming cancer care, implementing novel technologies remains challenging. This study identifies unmet needs and technology requirements through a two‐step stakeholder involvement process. Liquid biopsies for monitoring applications and predictive biomarker testing emerge as key unmet needs. Technology requirements vary by
Jorine Arnouts +8 more
wiley +1 more source
Partial Discharge Analysis on Isolator of Transformer
Transformers play an important role in electric power systems, especially at high voltage. The selection of transformer insulation and routine maintenance must go hand in hand to maintain the reliability of transformer performance and work safety.
Antonov Bachtiar, Thomas Thomas
doaj +1 more source
Study of Transformer Lifetime Due to Loading Process on 20 kV Distribution Line
Power transformers are essential in electric power systems because they step voltage up or down as required. On the generation side, the power transformer steps up the voltage for transmission over the transmission line.
A.A.N. Amrita +2 more
doaj +1 more source
Language Modeling with Deep Transformers
We explore deep autoregressive Transformer models in language modeling for speech recognition. We focus on two aspects. First, we revisit Transformer model configurations specifically for language modeling. We show that well configured Transformer models
Irie, Kazuki +3 more
core +1 more source
A‐to‐I editing of miRNAs, particularly miR‐200b‐3p, contributes to HGSOC progression by enhancing cancer cell proliferation, migration and 3D growth. The edited form is linked to poorer patient survival and the identification of novel molecular targets.
Magdalena Niemira +14 more
wiley +1 more source
Gegenbauer Transforms via the Radon Transform [PDF]
Use is made of the Radon transform on even-dimensional spaces and Gegenbauer functions of the second kind to obtain a general Gegenbauer transform pair. In the two-dimensional limit the pair reduces to a Chebyshev transform pair.
openaire +2 more sources
This study indicates that Merkel cell carcinoma (MCC) does not originate from Merkel cells, and identifies gene, protein and cellular expression of immune‐linked and neuroendocrine markers in primary and metastatic MCC tumor samples, linked to Merkel cell polyomavirus (MCPyV) status, with enrichment of B‐cell and other immune cell
Richie Jeremian +10 more
wiley +1 more source