A Multi-Layer Network for Aspect-Based Cross-Lingual Sentiment Classification
Advances in communication technology provide a valuable means of interaction between people of different regions. Many organizations now adopt approaches such as sentiment analysis and aspect-oriented sentiment ...
Kalim Sattar et al.
Source: DOAJ
Exploring the Data Efficiency of Cross-Lingual Post-Training in Pretrained Language Models
Language model pretraining is an effective method for improving the performance of downstream natural language processing tasks. Even though language modeling is unsupervised and thus collecting data for it is relatively less expensive, it is still a ...
Chanhee Lee et al.
Source: DOAJ
Chinese-Vietnamese Cross-Lingual Word-Embedding Combined with Word Cluster Constraints [PDF]
To address the poor alignment achieved by traditional cross-lingual word-embedding methods in low-resource language pairs such as Chinese-Vietnamese, this paper proposes a Chinese-Vietnamese cross-lingual word-embedding method with word cluster alignment ...
WU Zhaoyuan, YU Zhengtao, HUANG Yuxin
Source: DOAJ
InfoXLM: An Information-Theoretic Framework for Cross-Lingual Language Model Pre-Training [PDF]
In this work, we present an information-theoretic framework that formulates cross-lingual language model pre-training as maximizing mutual information between multilingual-multi-granularity texts.
Zewen Chi et al.
Source: Semantic Scholar
Cross-lingual Adaptation Using Universal Dependencies [PDF]
We describe a cross-lingual adaptation method based on syntactic parse trees obtained from the Universal Dependencies (UD), which are consistent across languages, to develop classifiers in low-resource languages. The idea of UD parsing is to capture similarities as well as idiosyncrasies among typologically different languages. In this article, we show ...
Nasrin Taghizadeh, Heshaam Faili
Source: OpenAIRE
Towards a Common Understanding of Contributing Factors for Cross-Lingual Transfer in Multilingual Language Models: A Review [PDF]
In recent years, pre-trained Multilingual Language Models (MLLMs) have shown a strong ability to transfer knowledge across different languages. However, given that the aspiration for such an ability has not been explicitly incorporated in the design of ...
Fred Philippy et al.
Source: Semantic Scholar
Revisiting Machine Translation for Cross-lingual Classification [PDF]
Machine Translation (MT) has been widely used for cross-lingual classification, either by translating the test set into English and running inference with a monolingual model (translate-test), or by translating the training set into the target languages and ...
Mikel Artetxe et al.
Source: Semantic Scholar
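The translate-test strategy described in this entry can be sketched with a toy example. The word-level dictionary "translator" and the keyword classifier below are invented stand-ins for a real MT system and a trained monolingual English model:

```python
# Toy illustration of translate-test cross-lingual classification:
# translate target-language input into English, then run an
# English-only sentiment classifier on the translation.

ES_EN = {"la": "the", "película": "movie", "es": "is",
         "excelente": "excellent", "terrible": "terrible"}

POSITIVE = {"excellent", "great", "good"}
NEGATIVE = {"terrible", "bad", "awful"}

def translate_to_english(text: str) -> str:
    """Word-by-word dictionary lookup standing in for a real MT system."""
    return " ".join(ES_EN.get(w, w) for w in text.lower().split())

def english_sentiment(text: str) -> str:
    """Keyword matcher standing in for a trained monolingual model."""
    words = set(text.split())
    if words & POSITIVE:
        return "positive"
    if words & NEGATIVE:
        return "negative"
    return "neutral"

def translate_test(target_language_text: str) -> str:
    # Translate first, then classify in English.
    return english_sentiment(translate_to_english(target_language_text))

print(translate_test("La película es excelente"))  # positive
```

The complementary translate-train strategy would instead translate the English training set into the target language and train a target-language classifier directly.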
Zero-shot cross-lingual transfer language selection using linguistic similarity [PDF]
We study the selection of transfer languages for different Natural Language Processing tasks, specifically sentiment analysis, named entity recognition and dependency parsing.
J. Eronen, M. Ptaszynski, Fumito Masui
Source: Semantic Scholar
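Transfer-language selection of the kind studied in this entry can be sketched as a nearest-neighbor search over linguistic feature vectors. The binary "typological features" below are invented for illustration; real work would draw on typological databases rather than hand-picked vectors:

```python
# Toy sketch of zero-shot transfer-language selection: score candidate
# source languages by cosine similarity of linguistic feature vectors
# to the target language and pick the most similar one.
import math

FEATURES = {  # hypothetical binary typological feature vectors
    "spanish":  [1, 0, 1, 1, 0],
    "italian":  [1, 0, 1, 1, 1],
    "japanese": [0, 1, 0, 0, 1],
}

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def best_transfer_language(target, candidates):
    # Choose the candidate whose features are closest to the target's.
    return max(candidates, key=lambda c: cosine(FEATURES[target], FEATURES[c]))

print(best_transfer_language("spanish", ["italian", "japanese"]))  # italian
```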
Empowering Cross-lingual Abilities of Instruction-tuned Large Language Models by Translation-following demonstrations [PDF]
The language ability of Large Language Models (LLMs) is often unbalanced towards English because of the imbalance in the distribution of the pre-training data. This disparity carries over into further fine-tuning and affects the cross-lingual abilities of ...
Leonardo Ranaldi et al.
Source: Semantic Scholar
Cross-Lingual Phrase Retrieval
Cross-lingual retrieval aims to retrieve relevant text across languages. Current methods typically achieve cross-lingual retrieval by learning language-agnostic text representations at the word or sentence level. However, how to learn phrase representations for cross-lingual phrase retrieval is still an open problem.
Zheng, Heqi et al.
Source: OpenAIRE
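The retrieval setting in this entry, matching a query phrase against candidates in another language through a shared vector space, can be sketched as follows. The 2-d "embeddings" are invented for illustration; a real system would obtain them from a trained cross-lingual phrase encoder:

```python
# Toy sketch of cross-lingual phrase retrieval: given phrases embedded
# in a shared language-agnostic space, return the target-language
# phrase whose vector is nearest (by cosine similarity) to the query.
import math

FR_PHRASES = {  # hypothetical shared-space embeddings of French phrases
    "réseau de neurones": (0.9, 0.1),
    "traduction automatique": (0.1, 0.9),
}
QUERY_EN = {"neural network": (0.85, 0.15)}  # hypothetical English query

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

def retrieve(query_vec, candidates):
    # Rank candidate phrases by similarity to the query vector.
    return max(candidates, key=lambda p: cosine(query_vec, candidates[p]))

print(retrieve(QUERY_EN["neural network"], FR_PHRASES))  # réseau de neurones
```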