Results 21 to 30 of about 339,627

Catastrophic Forgetting, Rehearsal and Pseudorehearsal [PDF]

open access: yes · Connection Science, 1995
This paper reviews the problem of catastrophic forgetting (the loss or disruption of previously learned information when new information is learned) in neural networks, and explores rehearsal mechanisms (the retraining of some of the previously learned information as the new information is added) as a potential solution.
Anthony Robins
openaire   +3 more sources
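The rehearsal mechanism this entry describes (retraining on a sample of previously learned items while new items are added) can be sketched minimally. The buffer class and names below are illustrative, not from the paper; reservoir sampling is one common way to keep a bounded store of old examples.

```python
import random

class RehearsalBuffer:
    """Fixed-size store of previously seen examples (hypothetical helper):
    mixing these into each new batch approximates rehearsal."""
    def __init__(self, capacity, seed=0):
        self.capacity = capacity
        self.buffer = []
        self.seen = 0
        self.rng = random.Random(seed)

    def add(self, example):
        # Reservoir sampling keeps a uniform sample of everything seen so far.
        self.seen += 1
        if len(self.buffer) < self.capacity:
            self.buffer.append(example)
        else:
            j = self.rng.randrange(self.seen)
            if j < self.capacity:
                self.buffer[j] = example

    def mixed_batch(self, new_batch, k):
        # Train on the new data plus k rehearsed old examples.
        k = min(k, len(self.buffer))
        return list(new_batch) + self.rng.sample(self.buffer, k)

buf = RehearsalBuffer(capacity=3)
for x in ["a", "b", "c", "d", "e"]:
    buf.add(x)
batch = buf.mixed_batch(["f", "g"], k=2)
print(len(batch))  # 4: two new examples plus two rehearsed old ones
```

Pseudorehearsal, also covered by the paper, replaces the stored examples with items generated by the network itself, so no old data need be retained.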

EXACFS -- A CIL Method to mitigate Catastrophic Forgetting [PDF]

open access: green · Proceedings of the Fifteenth Indian Conference on Computer Vision Graphics and Image Processing
Deep neural networks (DNNs) excel at learning from static datasets but struggle with continual learning, where data arrives sequentially. Catastrophic forgetting, the phenomenon of forgetting previously learned knowledge, is a primary challenge. This paper introduces EXponentially Averaged Class-wise Feature Significance (EXACFS) to mitigate this issue
Sundaram Balasubramanian   +6 more
openalex   +4 more sources

The Joint Effect of Task Similarity and Overparameterization on Catastrophic Forgetting -- An Analytical Model [PDF]

open access: green · International Conference on Learning Representations
In continual learning, catastrophic forgetting is affected by multiple aspects of the tasks. Previous works have analyzed separately how forgetting is affected by either task similarity or overparameterization.
Daniel Goldfarb   +4 more
openalex   +2 more sources

Bayesian continual learning and forgetting in neural networks [PDF]

open access: yes · Nature Communications
Biological synapses effortlessly balance memory retention and flexibility, yet artificial neural networks still struggle with the extremes of catastrophic forgetting and catastrophic remembering.
Djohan Bonnet   +6 more
doaj   +2 more sources

MuseumMaker: Continual Style Customization Without Catastrophic Forgetting

open access: yes · IEEE Transactions on Image Processing
Pre-trained large text-to-image (T2I) models with an appropriate text prompt have attracted growing interest in the customized image generation field. However, the catastrophic forgetting issue makes it hard to continually synthesize new user-provided styles while retaining satisfying results on learned styles. In this paper, we propose MuseumMaker, a
Chenxi Liu   +5 more
openaire   +3 more sources

Mitigating Catastrophic Forgetting in Pest Detection Through Adaptive Response Distillation [PDF]

open access: gold · Agriculture
Pest detection in agriculture faces the challenge of adapting to new pest species while preserving the ability to recognize previously learned ones. Traditional model fine-tuning approaches often result in catastrophic forgetting, where the acquisition ...
Hongjun Zhang   +3 more
doaj   +2 more sources

ZeroFlow: Overcoming Catastrophic Forgetting is Easier than You Think [PDF]

open access: green · International Conference on Machine Learning
Backpropagation provides a generalized configuration for overcoming catastrophic forgetting. Optimizers such as SGD and Adam are commonly used for weight updates in continual learning and continual pre-training. However, access to gradient information is
Tao Feng   +6 more
openalex   +2 more sources

Overcoming catastrophic forgetting in neural networks

open access: yes · arXiv.org
Catastrophic forgetting is the primary challenge that hinders continual learning, which refers to a neural network's ability to sequentially learn multiple tasks while retaining previously acquired knowledge. Elastic Weight Consolidation, a regularization-based approach inspired by synaptic consolidation in biological neural systems, has been used to ...
Brandon Shuen Yi Loke   +4 more
openaire   +3 more sources
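The Elastic Weight Consolidation idea this entry mentions adds a quadratic penalty that anchors parameters important to old tasks near their previously learned values, weighted by a (here assumed precomputed) diagonal Fisher information estimate. A minimal sketch of the penalty term, with illustrative numbers:

```python
import numpy as np

def ewc_penalty(theta, theta_star, fisher, lam):
    """EWC regularizer: (lam/2) * sum_i F_i * (theta_i - theta*_i)^2.
    Weights important to the old task (large Fisher value F_i) are pulled
    back toward their old values theta*_i while the new task is learned."""
    return 0.5 * lam * np.sum(fisher * (theta - theta_star) ** 2)

theta_star = np.array([1.0, -2.0, 0.5])  # weights learned on the old task
fisher = np.array([10.0, 0.1, 1.0])      # diagonal Fisher estimate (assumed given)
theta = np.array([1.1, 0.0, 0.5])        # current weights during the new task

print(ewc_penalty(theta, theta_star, fisher, lam=1.0))  # 0.25
```

The high-Fisher first weight contributes most of the penalty despite moving only 0.1, while the low-Fisher second weight drifts 2.0 almost for free; this asymmetry is what lets the network stay plastic where the old task does not care.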

Zero-shot incremental learning using spatial-frequency feature representations [PDF]

open access: yes · Scientific Reports
Zero-shot incremental learning aims to enable a model to generalize to new classes without forgetting previously learned classes. However, the semantic gap between old and new sample classes can lead to catastrophic forgetting.
Jie Ren   +3 more
doaj   +2 more sources

SD-IDD: Selective Distillation for Incremental Defect Detection [PDF]

open access: yes · Sensors
Surface defects in industrial production are complex and diverse. Therefore, deep learning-based defect detection models must consistently adapt to newly emerging defect categories. The trained models generally suffer from catastrophic forgetting as they
Jing Li   +3 more
doaj   +2 more sources
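The distillation-based entries above (adaptive response distillation, selective distillation) both build on the same core idea: the old model's softened outputs serve as soft targets that keep the new model's responses on old classes from drifting. A generic sketch of such a distillation term, with illustrative logits (the specific weighting and selection schemes of these papers are not reproduced here):

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax; higher T softens the distribution.
    z = np.asarray(z, dtype=float) / T
    e = np.exp(z - z.max())
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """Cross-entropy between the old model's (teacher's) softened outputs
    and the new model's (student's): a generic distillation penalty of the
    kind incremental detectors use to preserve old-class responses."""
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    return -np.sum(p_t * np.log(p_s + 1e-12))

# The loss is minimized when the student matches the teacher exactly.
matched = distillation_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1])
drifted = distillation_loss([0.1, 1.0, 2.0], [2.0, 1.0, 0.1])
print(drifted > matched)  # True
```

In an incremental detector this term is added to the standard detection loss on the new classes, so the model learns new categories while being penalized for changing its answers on old ones.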
