Overcoming Catastrophic Forgetting With Unlabeled Data in the Wild
ICCV 2019
Kibok Lee +3 more
How catastrophic can catastrophic forgetting be in linear regression?
To better understand catastrophic forgetting, we study fitting an overparameterized linear model to a sequence of tasks with different input distributions. We analyze how much the model forgets the true labels of earlier tasks after training on subsequent tasks, obtaining exact expressions and bounds. We establish connections between continual learning ...
Itay Evron +4 more
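The setting in that abstract — an overparameterized linear model fit to a sequence of tasks with different input distributions — is easy to reproduce in a toy experiment. The sketch below is my own illustration, not the paper's analysis; the task construction (per-feature input scalings) and all names are assumptions. Each task is fit with a minimum-norm interpolating update, after which the loss on the first task is re-measured.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n = 50, 20                         # overparameterized: d features > n samples per task
w_true = rng.normal(size=d)           # shared ground-truth linear labeling

def make_task(scale):
    # Tasks differ only in their input distribution, via per-feature scaling.
    X = rng.normal(size=(n, d)) * scale
    return X, X @ w_true

def fit_min_norm(w, X, y):
    # Orthogonal projection of w onto the interpolating set {v : Xv = y}.
    return w + np.linalg.pinv(X) @ (y - X @ w)

scale1 = np.ones(d); scale1[d // 2:] = 0.1   # task 1 emphasizes the first half of the features
scale2 = np.ones(d); scale2[:d // 2] = 0.1   # task 2 emphasizes the second half
X1, y1 = make_task(scale1)
X2, y2 = make_task(scale2)

w = np.zeros(d)
w = fit_min_norm(w, X1, y1)
loss1_before = np.mean((X1 @ w - y1) ** 2)   # ~0: task 1 is interpolated exactly
w = fit_min_norm(w, X2, y2)
loss1_after = np.mean((X1 @ w - y1) ** 2)    # > 0: task-2 training disturbed the task-1 fit

print(f"task-1 loss before task 2: {loss1_before:.2e}")
print(f"task-1 loss after  task 2: {loss1_after:.2e}")
```

Even in this linear toy case the task-1 loss jumps from numerically zero to a nonzero value after task 2, which is the forgetting quantity the abstract says the paper bounds exactly.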
Unified Probabilistic Deep Continual Learning through Generative Replay and Open Set Recognition
We introduce a probabilistic approach to unify open set recognition with the prevention of catastrophic forgetting in deep continual learning, based on variational Bayesian inference.
Yong Won Hong +4 more
Incremental Learning With Adaptive Model Search and a Nominal Loss Model
This paper addresses an incremental learning problem, in which tasks are learned sequentially without access to the previously trained dataset. Catastrophic forgetting is a significant bottleneck to incremental learning as the network performs poorly on ...
Chanho Ahn, Eunwoo Kim, Songhwai Oh
Continual Learning Objective for Analyzing Complex Knowledge Representations
Human beings tend to learn incrementally from a rapidly changing environment without compromising or forgetting already learned representations. Although deep learning also has the potential to mimic such human behavior to some extent, it suffers ...
Asad Mansoor Khan +4 more
On the role of neurogenesis in overcoming catastrophic forgetting
Lifelong learning capabilities are crucial for artificial autonomous agents operating on real-world data, which is typically non-stationary and temporally correlated. In this work, we demonstrate that dynamically grown networks outperform static networks in incremental learning scenarios, even when bounded by the same amount of memory in both cases ...
German Ignacio Parisi +2 more
Investigating Catastrophic Forgetting of Deep Learning Models Within Office 31 Dataset
Deep learning models have shown impressive performance on various tasks. However, they are prone to a phenomenon called catastrophic forgetting: they fail to retain previously learned knowledge when trained on new tasks. In this research paper, we ...
Hidayaturrahman +3 more
Catastrophic Importance of Catastrophic Forgetting
This paper describes some of the possibilities of artificial neural networks that open up after solving the problem of catastrophic forgetting. A simple model and reinforcement learning applications of existing methods are also proposed.
Overcoming Catastrophic Forgetting in Graph Neural Networks
Catastrophic forgetting refers to the tendency of a neural network to ``forget'' previously learned knowledge upon learning new tasks. Prior methods have focused on overcoming this problem in convolutional neural networks (CNNs), where input samples such as images lie on a grid domain, but have largely overlooked graph neural networks (GNNs) ...
Huihui Liu, Yiding Yang, Xinchao Wang
Neural modularity helps organisms evolve to learn new skills without forgetting old skills.
A long-standing goal in artificial intelligence is creating agents that can learn a variety of different skills for different problems. In the artificial intelligence subfield of neural networks, a barrier to that goal is that when agents learn a new ...
Kai Olav Ellefsen +2 more

