Results 291 to 300 of about 339,627
Some of the following articles may not be open access.

Catastrophic Forgetting in LLMs: A Comparative Analysis Across Language Tasks

arXiv.org
Large Language Models (LLMs) have significantly advanced Natural Language Processing (NLP), particularly in Natural Language Understanding (NLU) tasks.
Naimul Haque
semanticscholar   +1 more source

Mitigating Catastrophic Forgetting in Language Transfer via Model Merging

Conference on Empirical Methods in Natural Language Processing
As open-weight large language models (LLMs) achieve ever more impressive performances across a wide range of tasks in English, practitioners aim to adapt these models to different languages.
Anton Alexandrov   +5 more
semanticscholar   +1 more source

Solving the Catastrophic Forgetting Problem in Generalized Category Discovery

Computer Vision and Pattern Recognition
Generalized Category Discovery (GCD) aims to identify a mix of known and novel categories within unlabeled data sets, providing a more realistic setting for image recognition.
Xinzi Cao   +7 more
semanticscholar   +1 more source

CURLoRA: Stable LLM Continual Fine-Tuning and Catastrophic Forgetting Mitigation

arXiv.org
This paper introduces CURLoRA, a novel approach to fine-tuning large language models (LLMs) that leverages CUR matrix decomposition in the context of Low-Rank Adaptation (LoRA).
Muhammad Fawi
semanticscholar   +1 more source

Overcoming Catastrophic Forgetting for Multi-Label Class-Incremental Learning

IEEE Workshop/Winter Conference on Applications of Computer Vision
Despite the recent progress of class-incremental learning (CIL) methods, their capabilities in real-world scenarios such as multi-label settings remain unexplored.
Xiang Song   +5 more
semanticscholar   +1 more source

Overcoming Spatial-Temporal Catastrophic Forgetting for Federated Class-Incremental Learning

ACM Multimedia
This paper delves into federated class-incremental learning (FCiL), where new classes appear continually, or even privately, at local clients. However, existing FCiL methods suffer from the problem of spatial-temporal catastrophic forgetting, i.e., ...
Hao Yu   +6 more
semanticscholar   +1 more source

CF-KAN: Kolmogorov-Arnold Network-based Collaborative Filtering to Mitigate Catastrophic Forgetting in Recommender Systems

arXiv.org
Collaborative filtering (CF) remains essential in recommender systems, leveraging user–item interactions to provide personalized recommendations. Meanwhile, a number of CF techniques have evolved into sophisticated model architectures based on multi ...
Jin-Duk Park   +2 more
semanticscholar   +1 more source

DoFIT: Domain-aware Federated Instruction Tuning with Alleviated Catastrophic Forgetting

Neural Information Processing Systems
Federated Instruction Tuning (FIT) advances collaborative training on decentralized data, crucially enhancing the model's capability while safeguarding data privacy.
Binqian Xu   +6 more
semanticscholar   +1 more source

Manifold learning to address catastrophic forgetting

Proceedings of the Twelfth Indian Conference on Computer Vision, Graphics and Image Processing, 2021
Prathyusha Akundi, Jayanthi Sivaswamy
openaire   +1 more source

Overcoming Catastrophic Forgetting with Self-adaptive Identifiers

2018
Catastrophic forgetting is a tough issue when an agent faces a sequential multi-task learning scenario without storing previous task information. It gradually becomes an obstacle to achieving artificial general intelligence, which is generally expected to learn continuously, as humans do.
Fangzhou Xiong, Zhiyong Liu, Xu Yang
openaire   +1 more source
