No Language Left Behind: Scaling Human-Centered Machine Translation [PDF]
Driven by the goal of eradicating language barriers on a global scale, machine translation has solidified itself as a key focus of artificial intelligence research today.
NLLB Team +38 more
semanticscholar +1 more source
BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension [PDF]
We present BART, a denoising autoencoder for pretraining sequence-to-sequence models. BART is trained by (1) corrupting text with an arbitrary noising function, and (2) learning a model to reconstruct the original text.
M. Lewis +7 more
semanticscholar +1 more source
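The BART abstract above hinges on corrupting text with a noising function and training the model to reconstruct the original. A toy sketch of one such noise, text infilling, where a random span is replaced by a single mask token (illustrative only; not the paper's released code, and the span length and mask string here are arbitrary choices):

```python
import random

def text_infilling(tokens, mask_token="<mask>", span_len=2, seed=0):
    """Toy BART-style text infilling: replace one random span of
    `span_len` tokens with a single mask token. The training pair is
    (noised tokens, original tokens)."""
    rng = random.Random(seed)
    # Pick a span start so the span fits inside the sequence.
    start = rng.randrange(max(1, len(tokens) - span_len))
    return tokens[:start] + [mask_token] + tokens[start + span_len:]

noised = text_infilling(["the", "cat", "sat", "on", "mat"])
# The decoder would be trained to reconstruct the original 5 tokens.
```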
Unpaired Image-to-Image Translation Using Cycle-Consistent Adversarial Networks [PDF]
Image-to-image translation is a class of vision and graphics problems where the goal is to learn the mapping between an input image and an output image using a training set of aligned image pairs. However, for many tasks, paired training data will not be available.
Jun-Yan Zhu +3 more
semanticscholar +1 more source
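The entry above concerns learning translations without paired data; the key idea in that paper is a cycle-consistency constraint, |F(G(x)) - x|, where G maps domain X to Y and F maps back. A minimal numeric sketch of that L1 cycle loss, with the two generators reduced to plain functions for illustration:

```python
def cycle_consistency_loss(x_batch, G, F):
    """L1 cycle-consistency loss |F(G(x)) - x| averaged over a batch.
    G: X -> Y and F: Y -> X are stand-ins for the two generators;
    each x is a flat list of floats (a toy 'image')."""
    total = 0.0
    for x in x_batch:
        recon = F(G(x))  # translate forward, then back
        total += sum(abs(r - xi) for r, xi in zip(recon, x))
    return total / len(x_batch)

# With exact inverse maps the reconstruction is perfect and the loss is 0.
loss = cycle_consistency_loss(
    [[1.0, 2.0]],
    lambda x: [2 * v for v in x],   # G: double
    lambda y: [v / 2 for v in y],   # F: halve
)
# → 0.0
```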
Plug-and-Play Diffusion Features for Text-Driven Image-to-Image Translation [PDF]
Large-scale text-to-image generative models have been a revolutionary breakthrough in the evolution of generative AI, synthesizing diverse images with highly complex visual concepts.
Narek Tumanyan +3 more
semanticscholar +1 more source
Image-to-Image Translation with Conditional Adversarial Networks [PDF]
We investigate conditional adversarial networks as a general-purpose solution to image-to-image translation problems. These networks not only learn the mapping from input image to output image, but also learn a loss function to train this mapping.
Phillip Isola +3 more
semanticscholar +1 more source
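The pix2pix abstract above notes that the conditional adversarial network also learns a loss function for the mapping. A minimal sketch of the adversarial objective driving that learned loss, with the discriminator's outputs on real and generated (input, output) pairs reduced to two probabilities (illustrative shorthand, not the paper's full objective, which also includes an L1 term):

```python
import math

def conditional_gan_losses(d_real, d_fake):
    """Non-saturating conditional-GAN losses (sketch).
    d_real = D(x, y) on a real pair; d_fake = D(x, G(x)) on a fake pair.
    The discriminator pushes d_real up and d_fake down; the generator
    pushes d_fake up."""
    eps = 1e-12  # guard against log(0)
    d_loss = -(math.log(d_real + eps) + math.log(1.0 - d_fake + eps))
    g_loss = -math.log(d_fake + eps)
    return d_loss, g_loss

# As the generator fools the discriminator (d_fake rises), g_loss falls.
```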
Neural Machine Translation of Rare Words with Subword Units [PDF]
Neural machine translation (NMT) models typically operate with a fixed vocabulary, but translation is an open-vocabulary problem. Previous work addresses the translation of out-of-vocabulary words by backing off to a dictionary.
Rico Sennrich +2 more
semanticscholar +1 more source
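The subword-units paper above addresses the open-vocabulary problem with byte-pair encoding: repeatedly merging the most frequent adjacent symbol pair in a corpus. A self-contained sketch of the merge-learning loop (simplified; the paper's version also uses an end-of-word marker, omitted here):

```python
from collections import Counter

def learn_bpe(word_freqs, num_merges):
    """Learn BPE merge operations from a {word: frequency} dict (toy)."""
    # Start with each word as a tuple of single characters.
    vocab = {tuple(w): f for w, f in word_freqs.items()}
    merges = []
    for _ in range(num_merges):
        # Count adjacent symbol pairs, weighted by word frequency.
        pairs = Counter()
        for symbols, freq in vocab.items():
            for a, b in zip(symbols, symbols[1:]):
                pairs[(a, b)] += freq
        if not pairs:
            break
        best = max(pairs, key=pairs.get)
        merges.append(best)
        # Apply the merge everywhere it occurs.
        new_vocab = {}
        for symbols, freq in vocab.items():
            out, i = [], 0
            while i < len(symbols):
                if i < len(symbols) - 1 and (symbols[i], symbols[i + 1]) == best:
                    out.append(symbols[i] + symbols[i + 1])
                    i += 2
                else:
                    out.append(symbols[i])
                    i += 1
            new_vocab[tuple(out)] = freq
        vocab = new_vocab
    return merges

merges = learn_bpe({"low": 5, "lower": 2, "lowest": 2}, num_merges=3)
# → [('l', 'o'), ('lo', 'w'), ('low', 'e')]
```

Rare or unseen words are then segmented into these learned subword units instead of being mapped to an out-of-vocabulary token.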
Effective Approaches to Attention-based Neural Machine Translation [PDF]
An attentional mechanism has lately been used to improve neural machine translation (NMT) by selectively focusing on parts of the source sentence during translation.
Thang Luong +2 more
semanticscholar +1 more source
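The attention mechanism described in the entry above scores each source position against the current decoder state, normalizes the scores, and takes a weighted average of the encoder states. A minimal sketch of the paper's simplest "dot" scoring variant in plain Python (dimensions and inputs here are illustrative):

```python
import math

def softmax(xs):
    m = max(xs)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def dot_attention(query, keys):
    """Luong-style global attention with the 'dot' score.
    query: decoder hidden state (list of floats);
    keys: list of encoder hidden states (same dimension)."""
    scores = [sum(q * k for q, k in zip(query, h)) for h in keys]
    weights = softmax(scores)
    # Context vector: attention-weighted average of encoder states.
    context = [sum(w * h[d] for w, h in zip(weights, keys))
               for d in range(len(query))]
    return weights, context

# A query aligned with the first encoder state attends mostly to it.
weights, context = dot_attention([1.0, 0.0], [[1.0, 0.0], [0.0, 1.0]])
```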
Zero-shot Image-to-Image Translation [PDF]
Large-scale text-to-image generative models have shown their remarkable ability to synthesize diverse, high-quality images. However, directly applying these models for real image editing remains challenging for two reasons. First, it is hard for users to ...
Gaurav Parmar +5 more
semanticscholar +1 more source
Learning Phrase Representations using RNN Encoder–Decoder for Statistical Machine Translation [PDF]
In this paper, we propose a novel neural network model called RNN Encoder–Decoder that consists of two recurrent neural networks (RNN). One RNN encodes a sequence of symbols into a fixed-length vector representation, and the other decodes the ...
Kyunghyun Cho +6 more
semanticscholar +1 more source
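The encoder half described in the abstract above compresses a variable-length sequence into one fixed-length vector: the final hidden state of a recurrent network. A bare-bones sketch with a vanilla tanh RNN in place of the paper's gated unit (weights and inputs are arbitrary illustrations):

```python
import math

def rnn_encode(sequence, W_in, W_rec, hidden_size):
    """Encode a sequence of input vectors into one fixed-length vector.
    A vanilla RNN: h_t = tanh(W_in @ x_t + W_rec @ h_{t-1}); the final
    hidden state h_T is the fixed-length summary of the sequence."""
    h = [0.0] * hidden_size
    for x in sequence:
        pre = [sum(W_in[i][j] * x[j] for j in range(len(x))) +
               sum(W_rec[i][k] * h[k] for k in range(hidden_size))
               for i in range(hidden_size)]
        h = [math.tanh(p) for p in pre]
    return h

# Any-length input, same-size output: both calls yield a 2-d vector.
W_in, W_rec = [[0.5], [0.3]], [[0.1, 0.0], [0.0, 0.1]]
short = rnn_encode([[1.0]], W_in, W_rec, 2)
long = rnn_encode([[1.0], [0.5], [2.0]], W_in, W_rec, 2)
```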
Multilingual Denoising Pre-training for Neural Machine Translation [PDF]
This paper demonstrates that multilingual denoising pre-training produces significant performance gains across a wide variety of machine translation (MT) tasks.
Yinhan Liu +7 more
semanticscholar +1 more source

