Results 311 to 320 of about 4,624,323
Some of the following articles may not be open access.
Teaching Llama a New Language Through Cross-Lingual Knowledge Transfer
NAACL-HLT
This paper explores cost-efficient methods to adapt pretrained Large Language Models (LLMs) to new lower-resource languages, with a specific focus on Estonian. Leveraging the Llama 2 model, we investigate the impact of combining cross-lingual instruction-
Hele-Andra Kuulmets +3 more
semanticscholar +1 more source
AutoCAP: Towards Automatic Cross-lingual Alignment Planning for Zero-shot Chain-of-Thought
Annual Meeting of the Association for Computational Linguistics
Cross-lingual chain-of-thought can effectively complete reasoning tasks across languages, which gains increasing attention. Recently, dominant approaches in the literature improve cross-lingual alignment capabilities by integrating reasoning knowledge ...
Yongheng Zhang +4 more
semanticscholar +1 more source
PreAlign: Boosting Cross-Lingual Transfer by Early Establishment of Multilingual Alignment
Conference on Empirical Methods in Natural Language Processing
Large language models demonstrate reasonable multilingual abilities, despite predominantly English-centric pretraining. However, the spontaneous multilingual alignment in these models is shown to be weak, leading to unsatisfactory cross-lingual transfer ...
Jiahuan Li +3 more
semanticscholar +1 more source
Cross-lingual query classification
Proceedings of the 2nd ACM workshop on Improving non english web searching, 2008
The non-English Web is growing at breakneck speed, but available language processing tools are mostly English based. Taxonomies are a case in point: while there are plenty of commercial and non-commercial taxonomies for the English Web, taxonomies for other languages are either not available or of very limited quality. Given that building taxonomies in
Xuerui Wang +4 more
openaire +1 more source
NAACL-HLT
Many pretrained multilingual models exhibit cross-lingual transfer ability, which is often attributed to a learned language-neutral representation during pretraining.
Tianze Hua, Tian Yun, Ellie Pavlick
semanticscholar +1 more source
Proceedings of the 2019 Conference of the North, 2019
Kevin Blissett, Heng Ji
openaire +2 more sources
Cross-lingual Transfer of Reward Models in Multilingual Alignment
North American Chapter of the Association for Computational Linguistics
Reinforcement learning with human feedback (RLHF) is shown to largely benefit from precise reward models (RMs). However, recent studies in reward modeling schemes are skewed towards English, limiting the applicability of RLHF in multilingual alignments ...
Jiwoo Hong +4 more
semanticscholar +1 more source
Cross-Lingual Propaganda Detection
2022 IEEE International Conference on Big Data (Big Data), 2022
Wenshan Zhang, Xi Zhang
openaire +1 more source
2016
Entity typing is an essential task for constructing a knowledge base. However, many non-English knowledge bases fail to type their entities due to the absence of a reasonable local hierarchical taxonomy. Since constructing a widely accepted taxonomy is a hard problem, we propose to type these non-English entities with some widely accepted taxonomies in
Bo Xu +5 more
openaire +1 more source
Cross-lingual Continual Learning
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), 2023
Meryem M'hamdi +2 more
openaire +1 more source