Results 1 to 10 of about 4,721,082
A model for implementing international networking within the pandemic conditions [PDF]
Introducing a network-based form of interaction among participants in the educational process, which offers a number of advantages in delivering educational programs, received a strongly positive response from both universities and students.
Kotsyubinskaya Liubov Vyacheslavovna+2 more
doaj +1 more source
Zero-shot Image-to-Image Translation [PDF]
Large-scale text-to-image generative models have shown their remarkable ability to synthesize diverse, high-quality images. However, directly applying these models for real image editing remains challenging for two reasons. First, it is hard for users to ...
Gaurav Parmar+5 more
semanticscholar +1 more source
when we were astronauts in training / we spun around at a breakneck speed / in a shining sphere in the dark / until our eyes ended up / on the other side of everything when we were ...
Ivica Prtenjača+1 more
doaj +1 more source
Implicit Cross-Lingual Word Embedding Alignment for Reference-Free Machine Translation Evaluation
Cross-lingual word embedding alignment is critically important for reference-free machine translation evaluation, where source texts are directly compared with system translations.
Min Zhang+7 more
doaj +1 more source
How Good Are GPT Models at Machine Translation? A Comprehensive Evaluation [PDF]
Generative Pre-trained Transformer (GPT) models have shown remarkable capabilities for natural language generation, but their performance for machine translation has not been thoroughly investigated.
Amr Hendy+8 more
semanticscholar +1 more source
No Language Left Behind: Scaling Human-Centered Machine Translation [PDF]
Driven by the goal of eradicating language barriers on a global scale, machine translation has solidified itself as a key focus of artificial intelligence research today.
NLLB Team+38 more
semanticscholar +1 more source
BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension [PDF]
We present BART, a denoising autoencoder for pretraining sequence-to-sequence models. BART is trained by (1) corrupting text with an arbitrary noising function, and (2) learning a model to reconstruct the original text.
M. Lewis+7 more
semanticscholar +1 more source
Large Language Models Are State-of-the-Art Evaluators of Translation Quality [PDF]
We describe GEMBA, a GPT-based metric for assessment of translation quality, which works both with a reference translation and without. In our evaluation, we focus on zero-shot prompting, comparing four prompt variants in two modes, based on the ...
Tom Kocmi, C. Federmann
semanticscholar +1 more source
Unpaired Image-to-Image Translation Using Cycle-Consistent Adversarial Networks [PDF]
Image-to-image translation is a class of vision and graphics problems where the goal is to learn the mapping between an input image and an output image using a training set of aligned image pairs. However, for many tasks, paired training data will not be ...
Jun-Yan Zhu+3 more
semanticscholar +1 more source
The use of collaborative health research approaches, such as integrated knowledge translation (IKT), was challenged during the COVID-19 pandemic due to physical distancing measures and the transition to virtual platforms. As IKT trainees (i.e. ...
Priscilla Medeiros+12 more
doaj +1 more source