Results 21 to 30 of about 4,624,323

Unsupervised Cross-lingual Representation Learning at Scale [PDF]

open access: yes, Annual Meeting of the Association for Computational Linguistics, 2019
This paper shows that pretraining multilingual language models at scale leads to significant performance gains for a wide range of cross-lingual transfer tasks.
Alexis Conneau   +9 more
semanticscholar   +1 more source

XNLI: Evaluating Cross-lingual Sentence Representations [PDF]

open access: yes, Conference on Empirical Methods in Natural Language Processing, 2018
State-of-the-art natural language processing systems rely on supervision in the form of annotated data to learn competent models. These models are generally trained on data in a single language (usually English), and cannot be directly used beyond that ...
Alexis Conneau   +6 more
semanticscholar   +1 more source

Research of BERT Cross-Lingual Word Embedding Learning

open access: yes, Jisuanji kexue yu tansuo, 2021
With the development of multilingual information on the Internet, how to effectively represent the information contained in different language texts has become an important sub-task of natural language information processing.
WANG Yurong, LIN Min, LI Yanling
doaj   +1 more source

Prevalence and anatomic variations of lingual foramina and lingual canal in anterior mandible using cone beam computed tomography – A cross-sectional study

open access: yes, Journal of Indian Academy of Oral Medicine and Radiology, 2022
Introduction: Rich neurovascular supply in the anterior mandible necessitates a preoperative radiological assessment of the lingual foramina/canal where cone beam computed tomography (CBCT) could produce promising results.
Sindhu Poovannan, T Sarumathi
doaj   +1 more source

Cross-Corpus Multilingual Speech Emotion Recognition: Amharic vs. Other Languages

open access: yes, Applied Sciences, 2023
In a conventional speech emotion recognition (SER) task, a classifier for a given language is trained on a pre-existing dataset for that same language.
Ephrem Afele Retta   +8 more
doaj   +1 more source

Cross-Lingual Named Entity Recognition Based on Attention and Adversarial Training

open access: yes, Applied Sciences, 2023
Named entity recognition aims to extract entities with specific meaning from unstructured text. Currently, deep learning methods have been widely used for this task and have achieved remarkable results, but it is often difficult to achieve better results
Hao Wang   +3 more
doaj   +1 more source

Massively Multilingual Sentence Embeddings for Zero-Shot Cross-Lingual Transfer and Beyond [PDF]

open access: yes, Transactions of the Association for Computational Linguistics, 2018
We introduce an architecture to learn joint multilingual sentence representations for 93 languages, belonging to more than 30 different families and written in 28 different scripts. Our system uses a single BiLSTM encoder with a shared byte-pair encoding
Mikel Artetxe, Holger Schwenk
semanticscholar   +1 more source

On the Cross-lingual Transferability of Monolingual Representations [PDF]

open access: yes, Annual Meeting of the Association for Computational Linguistics, 2019
State-of-the-art unsupervised multilingual models (e.g., multilingual BERT) have been shown to generalize in a zero-shot cross-lingual setting. This generalization ability has been attributed to the use of a shared subword vocabulary and joint training ...
Mikel Artetxe   +2 more
semanticscholar   +1 more source

Cross-Lingual Voice Conversion With Controllable Speaker Individuality Using Variational Autoencoder and Star Generative Adversarial Network

open access: yes, IEEE Access, 2021
This paper proposes a non-parallel cross-lingual voice conversion (CLVC) model that can mimic voice while continuously controlling speaker individuality on the basis of the variational autoencoder (VAE) and star generative adversarial network (StarGAN ...
Tuan Vu Ho, Masato Akagi
doaj   +1 more source

Unsupervised Cross-Lingual Representation Learning [PDF]

open access: yes, Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics: Tutorial Abstracts, 2019
In this tutorial, we provide a comprehensive survey of the exciting recent work on cutting-edge weakly-supervised and unsupervised cross-lingual word representations. After providing a brief history of supervised cross-lingual word representations, we focus on: 1) how to induce weakly-supervised and unsupervised cross-lingual word representations in ...
Sebastian Ruder, Anders Søgaard, Ivan Vulić
openaire   +2 more sources
