Some of the following articles may not be open access.

Cross-Lingual Consistency of Factual Knowledge in Multilingual Language Models

Conference on Empirical Methods in Natural Language Processing, 2023
Multilingual large-scale Pretrained Language Models (PLMs) have been shown to store considerable amounts of factual knowledge, but large variations are observed across languages.
Jirui Qi, R. Fernández, Arianna Bisazza
semanticscholar   +1 more source

Lost in Multilinguality: Dissecting Cross-lingual Factual Inconsistency in Transformer Language Models

Annual Meeting of the Association for Computational Linguistics
Multilingual language models (MLMs) store factual knowledge across languages but often struggle to provide consistent responses to semantically equivalent prompts in different languages.
Mingyang Wang   +6 more
semanticscholar   +1 more source

Middle-Layer Representation Alignment for Cross-Lingual Transfer in Fine-Tuned LLMs

Annual Meeting of the Association for Computational Linguistics
While large language models demonstrate remarkable capabilities at task-specific applications through fine-tuning, extending these benefits across diverse languages is essential for broad accessibility.
Danni Liu, Jan Niehues
semanticscholar   +1 more source

Cross-Lingual Sentiment Relation Capturing for Cross-Lingual Sentiment Analysis

2017
Sentiment connection is the basis of cross-lingual sentiment analysis (CLSA) solutions. Most existing work focuses on general semantic connections, so misleading information introduced by non-sentimental semantics can lead to relatively low efficiency.
Qiang Chen   +5 more
openaire   +1 more source

Better to Ask in English: Cross-Lingual Evaluation of Large Language Models for Healthcare Queries

The Web Conference, 2023
Large language models (LLMs) are transforming the ways the general public accesses and consumes information. Their influence is particularly pronounced in pivotal sectors like healthcare, where lay individuals are increasingly appropriating LLMs as ...
Yiqiao Jin   +5 more
semanticscholar   +1 more source

Continual Pre-Training for Cross-Lingual LLM Adaptation: Enhancing Japanese Language Capabilities

arXiv.org
Cross-lingual continual pre-training of large language models (LLMs) initially trained on an English corpus allows us to leverage the vast amount of English language resources and reduce the pre-training cost.
Kazuki Fujii   +9 more
semanticscholar   +1 more source

Cross-Lingual Word Embeddings

2019
The majority of natural language processing (NLP) is English language processing, and while there is good language technology support for (standard varieties of) English, support for Albanian, Burmese, or Cebuano-and most other languages-remains limited.
Anders Søgaard   +3 more
openaire   +2 more sources

Cross-lingual Search

2017
The Cross-lingual Search project aims to make it possible to find information across language barriers. There is an ongoing debate on whether scientific and professional communication should be based on a lingua franca like English or whether national languages should also play a role.
openaire   +1 more source

Cross-Lingual Information Retrieval

2004
Structural and semantic interoperability have been the focus of digital library research in the early 1990s. Many research works have been done on searching and retrieving objects across variations in protocols, formats, and disciplines. As the World Wide Web has become more popular in the last ten years, information is available in multiple languages ...
Christopher Yang, Kar W. Li
openaire   +1 more source

Evaluating Cross-Lingual Equating

International Journal of Testing, 2003
Adapting educational tests from one language to others requires equating across the different language versions to be able to compare examinees from different language groups. Such equating is usually based on translated items considered to have similar content and psychometric characteristics in both source and target languages.
Joel Rapp, Avi Allalouf
openaire   +1 more source
