Results 11 to 20 of about 5,227,080 (312)

No Language Left Behind: Scaling Human-Centered Machine Translation [PDF]

open access: yes · arXiv.org, 2022
Driven by the goal of eradicating language barriers on a global scale, machine translation has solidified itself as a key focus of artificial intelligence research today.
NLLB Team   +38 more
semanticscholar   +1 more source

BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension [PDF]

open access: yes · Annual Meeting of the Association for Computational Linguistics, 2019
We present BART, a denoising autoencoder for pretraining sequence-to-sequence models. BART is trained by (1) corrupting text with an arbitrary noising function, and (2) learning a model to reconstruct the original text.
M. Lewis   +7 more
semanticscholar   +1 more source
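The snippet above summarizes BART's recipe: corrupt text with an arbitrary noising function, then train a sequence-to-sequence model to reconstruct the original. A minimal sketch of one such noising function, token masking (the mask ratio, mask symbol, and seed here are illustrative assumptions, not the paper's settings):

```python
import random

def mask_tokens(tokens, mask_ratio=0.3, mask_token='<mask>', seed=0):
    """Corrupt a token sequence by replacing a random subset with a mask symbol.
    Token masking is one of several noising functions BART explores."""
    rng = random.Random(seed)
    n_mask = max(1, int(len(tokens) * mask_ratio))
    positions = set(rng.sample(range(len(tokens)), n_mask))
    return [mask_token if i in positions else t for i, t in enumerate(tokens)]

original = 'the cat sat on the mat'.split()
corrupted = mask_tokens(original)
# A seq2seq model would then be trained to map `corrupted` back to `original`.
print(corrupted)
```

The pretraining objective pairs each corrupted sequence with its clean original as the reconstruction target.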

Unpaired Image-to-Image Translation Using Cycle-Consistent Adversarial Networks [PDF]

open access: yesIEEE International Conference on Computer Vision, 2017
Image-to-image translation is a class of vision and graphics problems where the goal is to learn the mapping between an input image and an output image using a training set of aligned image pairs. However, for many tasks, paired training data will not be available.
Jun-Yan Zhu   +3 more
semanticscholar   +1 more source

Plug-and-Play Diffusion Features for Text-Driven Image-to-Image Translation [PDF]

open access: yes · Computer Vision and Pattern Recognition, 2022
Large-scale text-to-image generative models have been a revolutionary breakthrough in the evolution of generative AI, synthesizing diverse images with highly complex visual concepts.
Narek Tumanyan   +3 more
semanticscholar   +1 more source

Image-to-Image Translation with Conditional Adversarial Networks [PDF]

open access: yes · Computer Vision and Pattern Recognition, 2016
We investigate conditional adversarial networks as a general-purpose solution to image-to-image translation problems. These networks not only learn the mapping from input image to output image, but also learn a loss function to train this mapping.
Phillip Isola   +3 more
semanticscholar   +1 more source

Neural Machine Translation of Rare Words with Subword Units [PDF]

open access: yes · Annual Meeting of the Association for Computational Linguistics, 2015
Neural machine translation (NMT) models typically operate with a fixed vocabulary, but translation is an open-vocabulary problem. Previous work addresses the translation of out-of-vocabulary words by backing off to a dictionary.
Rico Sennrich   +2 more
semanticscholar   +1 more source
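This paper's open-vocabulary approach segments rare words into subword units learned by byte-pair encoding: starting from characters, it repeatedly merges the most frequent adjacent symbol pair. A toy sketch of that merge-learning loop, using an illustrative four-word vocabulary with an end-of-word marker:

```python
import re
import collections

def get_stats(vocab):
    """Count frequencies of adjacent symbol pairs across the vocabulary."""
    pairs = collections.defaultdict(int)
    for word, freq in vocab.items():
        symbols = word.split()
        for i in range(len(symbols) - 1):
            pairs[(symbols[i], symbols[i + 1])] += freq
    return pairs

def merge_vocab(pair, vocab):
    """Merge every occurrence of the given symbol pair into one symbol."""
    bigram = re.escape(' '.join(pair))
    pattern = re.compile(r'(?<!\S)' + bigram + r'(?!\S)')
    return {pattern.sub(''.join(pair), word): freq for word, freq in vocab.items()}

# Toy vocabulary: words split into characters, '</w>' marks end of word.
vocab = {'l o w </w>': 5, 'l o w e r </w>': 2,
         'n e w e s t </w>': 6, 'w i d e s t </w>': 3}

merges = []
for _ in range(3):
    pairs = get_stats(vocab)
    best = max(pairs, key=pairs.get)
    vocab = merge_vocab(best, vocab)
    merges.append(best)
print(merges)  # → [('e', 's'), ('es', 't'), ('est', '</w>')]
```

Frequent words thus collapse into single symbols after a few merges, while rare words remain decomposable into smaller known units.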

Effective Approaches to Attention-based Neural Machine Translation [PDF]

open access: yes · Conference on Empirical Methods in Natural Language Processing, 2015
An attentional mechanism has lately been used to improve neural machine translation (NMT) by selectively focusing on parts of the source sentence during translation.
Thang Luong   +2 more
semanticscholar   +1 more source
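Luong et al. compare several scoring functions for global attention; the simplest, the "dot" score, takes the inner product of the decoder state with each encoder state, softmaxes over source positions, and forms a weighted context vector. A self-contained sketch with toy two-dimensional states (pure Python, no learned parameters):

```python
import math

def dot_attention(decoder_state, encoder_states):
    """Global attention with the 'dot' score: score(h_t, h_s) = h_t . h_s,
    softmaxed over source positions to weight a context vector."""
    scores = [sum(d * e for d, e in zip(decoder_state, h)) for h in encoder_states]
    m = max(scores)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    weights = [e / total for e in exps]
    # Context vector: attention-weighted average of encoder states.
    context = [sum(w * h[i] for w, h in zip(weights, encoder_states))
               for i in range(len(decoder_state))]
    return weights, context

h_t = [1.0, 0.0]                                  # current decoder state
H = [[1.0, 0.0], [0.0, 1.0], [-1.0, 0.0]]         # encoder states, one per source token
weights, context = dot_attention(h_t, H)
print(weights)
```

Source positions whose encoder states align with the decoder state receive the largest weights, which is the selective focusing the snippet describes.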

Zero-shot Image-to-Image Translation [PDF]

open access: yes · International Conference on Computer Graphics and Interactive Techniques, 2023
Large-scale text-to-image generative models have shown their remarkable ability to synthesize diverse, high-quality images. However, directly applying these models for real image editing remains challenging for two reasons. First, it is hard for users to ...
Gaurav Parmar   +5 more
semanticscholar   +1 more source

Learning Phrase Representations using RNN Encoder–Decoder for Statistical Machine Translation [PDF]

open access: yes · Conference on Empirical Methods in Natural Language Processing, 2014
In this paper, we propose a novel neural network model called RNN Encoder–Decoder that consists of two recurrent neural networks (RNN). One RNN encodes a sequence of symbols into a fixed-length vector representation, and the other decodes the ...
Kyunghyun Cho   +6 more
semanticscholar   +1 more source

Multilingual Denoising Pre-training for Neural Machine Translation [PDF]

open access: yes · Transactions of the Association for Computational Linguistics, 2020
This paper demonstrates that multilingual denoising pre-training produces significant performance gains across a wide variety of machine translation (MT) tasks.
Yinhan Liu   +7 more
semanticscholar   +1 more source
