Results 211 to 220 of about 33,100

FM-LoRA: Factorized Low-Rank Meta-Prompting for Continual Learning. [PDF]

open access: yes
Conf Comput Vis Pattern Recognit Workshops
Yu X, Yang J, Wu X, Qiu P, Liu X.
europepmc   +1 more source

Catastrophic Forgetting, Rehearsal and Pseudorehearsal [PDF]

open access: yes
Connection Science, 1995
This paper reviews the problem of catastrophic forgetting (the loss or disruption of previously learned information when new information is learned) in neural networks, and explores rehearsal mechanisms (the retraining of some of the previously learned information as the new information is added) as a potential solution.
Anthony Robins
exaly   +2 more sources

Catastrophic forgetting in connectionist networks

Trends in Cognitive Sciences, 1999
All natural cognitive systems, and, in particular, our own, gradually forget previously learned information. Plausible models of human cognition should therefore exhibit similar patterns of gradual forgetting of old information as new information is acquired.
exaly   +3 more sources

Catastrophic forgetting and mode collapse in GANs

2020 International Joint Conference on Neural Networks (IJCNN), 2020
In this paper, we show that Generative Adversarial Networks (GANs) suffer from catastrophic forgetting even when they are trained to approximate a single target distribution. We show that GAN training is a continual learning problem in which the sequence of changing model distributions is the sequence of tasks to the discriminator.
Hoang Thanh-Tung, Truyen Tran
openaire   +1 more source

Avoiding Catastrophic Forgetting

Trends in Cognitive Sciences, 2017
Humans regularly perform new learning without losing memory for previous information, but neural network models suffer from the phenomenon of catastrophic forgetting in which new learning impairs prior function. A recent article presents an algorithm that spares learning at synapses important for previously learned function, reducing catastrophic ...
openaire   +2 more sources

Measuring Catastrophic Forgetting in Visual Question Answering

2021
Catastrophic forgetting is a ubiquitous problem for the current generation of Artificial Neural Networks: when a network is asked to learn multiple tasks in sequence, it fails dramatically as it tends to forget past knowledge. Little is known about the extent to which multimodal conversational agents suffer from this phenomenon.
Claudio Greco   +3 more
openaire   +2 more sources
