Results 51 to 60 of about 4,624,323 (348)

A Multi-Layer Network for Aspect-Based Cross-Lingual Sentiment Classification

open access: yes (IEEE Access, 2021)
In the recent era, advances in communication technology have provided a valuable source of interaction between people of different regions. Nowadays, many organizations adopt the latest approaches, i.e., sentiment analysis and aspect-oriented sentiment ...
Kalim Sattar   +5 more
doaj   +1 more source

Exploring the Data Efficiency of Cross-Lingual Post-Training in Pretrained Language Models

open access: yes (Applied Sciences, 2021)
Language model pretraining is an effective method for improving the performance of downstream natural language processing tasks. Even though language modeling is unsupervised and thus collecting data for it is relatively less expensive, it is still a ...
Chanhee Lee   +5 more
doaj   +1 more source

Chinese-Vietnamese Cross-Lingual Word-Embedding Combined with Word Cluster Constraints [PDF]

open access: yes (Jisuanji gongcheng, 2023)
To address the poor alignment of traditional cross-lingual word-embedding methods in low-resource language pairs such as Chinese-Vietnamese, this paper proposes a Chinese-Vietnamese cross-lingual word-embedding method with word cluster alignment ...
WU Zhaoyuan, YU Zhengtao, HUANG Yuxin
doaj   +1 more source

InfoXLM: An Information-Theoretic Framework for Cross-Lingual Language Model Pre-Training [PDF]

open access: yes (North American Chapter of the Association for Computational Linguistics, 2020)
In this work, we present an information-theoretic framework that formulates cross-lingual language model pre-training as maximizing mutual information between multilingual-multi-granularity texts.
Zewen Chi   +9 more
semanticscholar   +1 more source

Cross-lingual Adaptation Using Universal Dependencies [PDF]

open access: yes (ACM Transactions on Asian and Low-Resource Language Information Processing, 2021)
We describe a cross-lingual adaptation method based on syntactic parse trees obtained from the Universal Dependencies (UD), which are consistent across languages, to develop classifiers in low-resource languages. The idea of UD parsing is to capture similarities as well as idiosyncrasies among typologically different languages. In this article, we show
Nasrin Taghizadeh, Heshaam Faili
openaire   +2 more sources

Towards a Common Understanding of Contributing Factors for Cross-Lingual Transfer in Multilingual Language Models: A Review [PDF]

open access: yes (Annual Meeting of the Association for Computational Linguistics, 2023)
In recent years, pre-trained Multilingual Language Models (MLLMs) have shown a strong ability to transfer knowledge across different languages. However, given that the aspiration for such an ability has not been explicitly incorporated in the design of ...
Fred Philippy   +2 more
semanticscholar   +1 more source

Revisiting Machine Translation for Cross-lingual Classification [PDF]

open access: yes (Conference on Empirical Methods in Natural Language Processing, 2023)
Machine Translation (MT) has been widely used for cross-lingual classification, either by translating the test set into English and running inference with a monolingual model (translate-test), or translating the training set into the target languages and
Mikel Artetxe   +4 more
semanticscholar   +1 more source

Zero-shot cross-lingual transfer language selection using linguistic similarity [PDF]

open access: yes (Information Processing & Management, 2023)
We study the selection of transfer languages for different Natural Language Processing tasks, specifically sentiment analysis, named entity recognition and dependency parsing.
J. Eronen, M. Ptaszynski, Fumito Masui
semanticscholar   +1 more source

Empowering Cross-lingual Abilities of Instruction-tuned Large Language Models by Translation-following demonstrations [PDF]

open access: yes (Annual Meeting of the Association for Computational Linguistics, 2023)
The language ability of Large Language Models (LLMs) is often unbalanced towards English because of the imbalance in the distribution of the pre-training data. This disparity carries over into further fine-tuning, affecting the cross-lingual abilities of
Leonardo Ranaldi   +2 more
semanticscholar   +1 more source

Cross-Lingual Phrase Retrieval

open access: yes (Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), 2022)
Cross-lingual retrieval aims to retrieve relevant text across languages. Current methods typically achieve cross-lingual retrieval by learning language-agnostic text representations at the word or sentence level. However, how to learn phrase representations for cross-lingual phrase retrieval is still an open problem.
Heqi Zheng   +7 more
openaire   +2 more sources
