CAREC: Continual Wireless Action Recognition with Expansion-Compression Coordination. [PDF]
Zhang T, Fu Q, Ding H, Wang G, Wang F.
europepmc +1 more source
Uncertainty aware domain incremental learning for cross domain depression detection. [PDF]
Lifelo Z, Ding J, Ning H, Dhelim S.
europepmc +1 more source
Some of the following articles may not be open access.
Catastrophic forgetting in connectionist networks
Trends in Cognitive Sciences, 1999
All natural cognitive systems, and, in particular, our own, gradually forget previously learned information. Plausible models of human cognition should therefore exhibit similar patterns of gradual forgetting of old information as new information is acquired.
R. French
semanticscholar +5 more sources
An Empirical Study of Catastrophic Forgetting in Large Language Models During Continual Fine-Tuning
IEEE Transactions on Audio, Speech, and Language Processing, 2023
Catastrophic forgetting (CF) is a phenomenon that occurs in machine learning when a model forgets previously learned information while acquiring new knowledge for achieving satisfactory performance in downstream tasks.
Yun Luo +5 more
semanticscholar +1 more source
Avoiding Catastrophic Forgetting
Trends in Cognitive Sciences, 2017
Humans regularly perform new learning without losing memory for previous information, but neural network models suffer from the phenomenon of catastrophic forgetting in which new learning impairs prior function. A recent article presents an algorithm that spares learning at synapses important for previously learned function, reducing catastrophic ...
openaire +2 more sources
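The algorithm described above penalizes changes to parameters that were important for earlier tasks. A minimal sketch of that idea, assuming an EWC-style quadratic penalty weighted by per-parameter importance estimates (the function name, toy values, and use of a Fisher-information proxy are illustrative, not taken from the article):

```python
import numpy as np

def ewc_penalty(theta, theta_old, fisher, lam=1.0):
    """Quadratic penalty discouraging movement of parameters that were
    important (high estimated Fisher information) for the previous task:
    penalty = (lam / 2) * sum_i F_i * (theta_i - theta_old_i)^2
    """
    return 0.5 * lam * np.sum(fisher * (theta - theta_old) ** 2)

# Hypothetical two-parameter example: the first parameter mattered for task A.
theta_old = np.array([1.0, -0.5])   # parameters after task A
fisher = np.array([10.0, 0.1])      # importance estimates from task A
theta = np.array([1.2, 0.5])        # parameters while training on task B

# Moving the important parameter is costly; moving the unimportant one is cheap.
print(ewc_penalty(theta, theta_old, fisher))
```

Added to the task-B loss, this term lets unimportant parameters adapt freely while anchoring the ones that encode the previous task.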
Catastrophic Forgetting in Deep Learning: A Comprehensive Taxonomy
Journal of the Brazilian Computer Society, 2023
Deep Learning models have achieved remarkable performance in tasks such as image classification or generation, often surpassing human accuracy. However, they can struggle to learn new tasks and update their knowledge without access to previous data ...
Everton L. Aleixo +3 more
semanticscholar +1 more source
Overcoming Catastrophic Forgetting in Continual Learning by Exploring Eigenvalues of Hessian Matrix
IEEE Transactions on Neural Networks and Learning Systems, 2023
Neural networks tend to suffer performance deterioration on previous tasks when they are applied to multiple tasks sequentially without access to previous data.
Yajing Kong +4 more
semanticscholar +1 more source
Actions as Language: Fine-Tuning VLMs into VLAs Without Catastrophic Forgetting
arXiv.org
Fine-tuning vision-language models (VLMs) on robot teleoperation data to create vision-language-action (VLA) models is a promising paradigm for training generalist policies, but it suffers from a fundamental tradeoff: learning to produce actions often ...
Asher Hancock +4 more
semanticscholar +1 more source
Interspeech
We introduce Speech-IFeval, an evaluation framework designed to assess instruction-following capabilities and quantify catastrophic forgetting in speech-aware language models (SLMs).
Ke-Han Lu, Chun-Yi Kuan, Hung-yi Lee
semanticscholar +1 more source

