Unsupervised Cross-lingual Representation Learning at Scale [PDF]
This paper shows that pretraining multilingual language models at scale leads to significant performance gains for a wide range of cross-lingual transfer tasks.
Alexis Conneau +9 more
semanticscholar +1 more source
XNLI: Evaluating Cross-lingual Sentence Representations [PDF]
State-of-the-art natural language processing systems rely on supervision in the form of annotated data to learn competent models. These models are generally trained on data in a single language (usually English), and cannot be directly used beyond that ...
Alexis Conneau +6 more
semanticscholar +1 more source
Research of BERT Cross-Lingual Word Embedding Learning
With the development of multilingual information on the Internet, how to effectively represent the information contained in texts in different languages has become an important sub-task of natural language information processing.
WANG Yurong, LIN Min, LI Yanling
doaj +1 more source
Introduction: Rich neurovascular supply in the anterior mandible necessitates a preoperative radiological assessment of the lingual foramina/canal where cone beam computed tomography (CBCT) could produce promising results.
Sindhu Poovannan, T Sarumathi
doaj +1 more source
Cross-Corpus Multilingual Speech Emotion Recognition: Amharic vs. Other Languages
In a conventional speech emotion recognition (SER) task, a classifier for a given language is trained on a pre-existing dataset for that same language.
Ephrem Afele Retta +8 more
doaj +1 more source
Cross-Lingual Named Entity Recognition Based on Attention and Adversarial Training
Named entity recognition aims to extract entities with specific meaning from unstructured text. Currently, deep learning methods have been widely used for this task and have achieved remarkable results, but it is often difficult to achieve better results ...
Hao Wang +3 more
doaj +1 more source
Massively Multilingual Sentence Embeddings for Zero-Shot Cross-Lingual Transfer and Beyond [PDF]
We introduce an architecture to learn joint multilingual sentence representations for 93 languages, belonging to more than 30 different families and written in 28 different scripts. Our system uses a single BiLSTM encoder with a shared byte-pair encoding ...
Mikel Artetxe, Holger Schwenk
semanticscholar +1 more source
On the Cross-lingual Transferability of Monolingual Representations [PDF]
State-of-the-art unsupervised multilingual models (e.g., multilingual BERT) have been shown to generalize in a zero-shot cross-lingual setting. This generalization ability has been attributed to the use of a shared subword vocabulary and joint training ...
Mikel Artetxe +2 more
semanticscholar +1 more source
This paper proposes a non-parallel cross-lingual voice conversion (CLVC) model that can mimic voice while continuously controlling speaker individuality on the basis of the variational autoencoder (VAE) and star generative adversarial network (StarGAN ...
Tuan Vu Ho, Masato Akagi
doaj +1 more source
Unsupervised Cross-Lingual Representation Learning [PDF]
In this tutorial, we provide a comprehensive survey of the exciting recent work on cutting-edge weakly-supervised and unsupervised cross-lingual word representations. After providing a brief history of supervised cross-lingual word representations, we focus on: 1) how to induce weakly-supervised and unsupervised cross-lingual word representations in ...
Sebastian Ruder, Anders Søgaard, Ivan Vulić
openaire +2 more sources