Results 91 to 100 of about 33,100
Overcoming catastrophic forgetting in neural networks
Catastrophic forgetting is the primary challenge hindering continual learning, a neural network's ability to sequentially learn multiple tasks while retaining previously acquired knowledge. Elastic Weight Consolidation, a regularization-based approach inspired by synaptic consolidation in biological neural systems, has been used to ...
Brandon Shuen Yi Loke +4 more
openaire +2 more sources
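The Elastic Weight Consolidation penalty mentioned in this entry can be sketched as below; a minimal illustration under assumed names (`params`, `old_params`, `fisher`), not the paper's implementation.

```python
import numpy as np

def ewc_penalty(params, old_params, fisher, lam=0.5):
    """EWC regularizer: (lam / 2) * sum_i F_i * (theta_i - theta*_i)^2.

    Parameters with high Fisher information F_i (important for earlier
    tasks) are anchored to their previous-task values theta*_i, so new
    learning is steered away from overwriting them.
    """
    total = 0.0
    for name in params:
        diff = params[name] - old_params[name]
        total += np.sum(fisher[name] * diff ** 2)
    return 0.5 * lam * total
```

During training on a new task, this penalty is added to the task loss, so gradient descent trades off new-task accuracy against drift in weights that mattered before.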
AT‐AER: Adversarial Training With Adaptive Example Reuse
ABSTRACT Adversarial training (AT) is widely regarded as a crucial defense for deep neural networks against adversarial attacks. Most existing AT methods suffer from insufficient coverage of the perturbation space and from robust overfitting.
Meng Hu +5 more
wiley +1 more source
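The adversarial training loop this entry builds on can be sketched with a FGSM-style inner attack on a toy logistic-regression model; the model and names here are illustrative assumptions, not the AT-AER method itself.

```python
import numpy as np

def fgsm(x, grad_wrt_x, eps=0.1):
    """Fast Gradient Sign Method: step the input by eps in the sign of
    the loss gradient (an L-infinity-bounded perturbation)."""
    return x + eps * np.sign(grad_wrt_x)

def adversarial_training_step(w, x, y, lr=0.1, eps=0.1):
    """One adversarial training step for logistic regression:
    craft an FGSM example, then update w on the perturbed input."""
    def grads(w, x, y):
        p = 1.0 / (1.0 + np.exp(-x @ w))   # predicted probability
        return (p - y) * x, (p - y) * w    # dL/dw, dL/dx for BCE loss
    _, gx = grads(w, x, y)
    x_adv = fgsm(x, gx, eps)               # inner maximization (attack)
    gw, _ = grads(w, x_adv, y)
    return w - lr * gw                     # outer minimization (defense)
```

Methods like the one abstracted above then vary how perturbations are generated and which adversarial examples are reused across epochs.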
Remote sensing image semantic segmentation faces substantial challenges in training and transferring models across images with varying resolutions. This issue can be effectively mitigated by continuously learning knowledge derived from new resolutions ...
Penglong Li +3 more
doaj +1 more source
ABSTRACT Parameter‐efficient fine‐tuning (PEFT) has become a crucial paradigm for domain adaptation, achieving strong performance by updating only a small fraction of model parameters. Among various PEFT methods, low‐rank adaptation (LoRA) is widely adopted due to its structural simplicity and computational efficiency.
Xu Luo +4 more
wiley +1 more source
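The low-rank adaptation (LoRA) scheme this entry refers to can be sketched as follows; dimensions, variable names, and the zero-initialization of `B` are standard LoRA conventions, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_out, r = 8, 4, 2

W = rng.normal(size=(d_out, d_in))     # frozen pretrained weight
A = rng.normal(size=(r, d_in)) * 0.01  # trainable down-projection
B = np.zeros((d_out, r))               # trainable up-projection, init 0

def lora_forward(x, alpha=1.0):
    """h = W x + alpha * B (A x): a rank-r correction is added without
    touching W, so only r * (d_in + d_out) parameters are trained.
    With B = 0 at init, the adapted model matches the pretrained one."""
    return W @ x + alpha * (B @ (A @ x))
```

Because only `A` and `B` receive gradients, fine-tuning cost and checkpoint size scale with the rank `r` rather than with the full weight matrix.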
Continual Classification Learning Using Generative Models
Continual learning is the ability to learn sequentially over time, accommodating new knowledge while retaining previously learned experiences. Neural networks can learn multiple tasks when trained on them jointly, but cannot maintain performance on ...
Gregorova, Magda +3 more
core
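The generative-replay idea behind this entry can be sketched as batch construction that mixes current-task data with samples from a generator trained on earlier tasks; `generator` and `old_label_sampler` are hypothetical callables for illustration.

```python
import numpy as np

def build_replay_batch(new_x, new_y, generator, old_label_sampler, n_replay):
    """Generative replay: combine real data from the current task with
    generator samples standing in for earlier tasks, so the classifier
    keeps seeing 'old' inputs it would otherwise forget."""
    old_y = old_label_sampler(n_replay)    # labels of past classes
    old_x = generator(old_y)               # pseudo-samples of past classes
    return (np.concatenate([new_x, old_x]),
            np.concatenate([new_y, old_y]))
```

The classifier is then trained on the mixed batch as if all tasks' data were still available, at the cost of also maintaining the generative model.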
In recent decades, solid waste has proliferated worldwide, becoming a pressing global issue. This article explores the role of Indigenous people dwelling within and upon emerging waste scenarios, with a specific focus on involved forms of sociality and ontological contestation. Drawing on the case of a municipal landfill sited on a Guarani community in
Vanesa Martín Galán
wiley +1 more source
EXACFS - A CIL Method to Mitigate Catastrophic Forgetting
Deep neural networks (DNNs) excel at learning from static datasets but struggle with continual learning, where data arrives sequentially. Catastrophic forgetting, the phenomenon of forgetting previously learned knowledge, is a primary challenge. This paper introduces EXponentially Averaged Class-wise Feature Significance (EXACFS) to mitigate this issue.
S. Balasubramanian +6 more
openaire +3 more sources
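The exponential averaging in the method's name can be sketched as a standard exponential moving average over per-class feature statistics; the update rule below is a generic EMA under an assumed smoothing factor `beta`, not the paper's exact estimator.

```python
import numpy as np

def ema_update(running, new_value, beta=0.9):
    """Exponential moving average: a running estimate (e.g. a class-wise
    feature-significance score) is smoothed across incremental tasks, so
    statistics from older tasks decay gradually instead of abruptly."""
    return beta * running + (1.0 - beta) * new_value
```

Scores maintained this way can then weight a distillation or regularization term so that features significant for earlier classes are preserved.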
What does it take to turn a tool into a talking tool and that into an ultimate authority? Generative artificial intelligence (GenAI) in its diverse forms, such as large language models (LLMs), is celebrated as a useful tool. But LLM‐based conversational agents, or chatbots, the software applications through which ordinary users are likely to engage ...
Webb Keane
wiley +1 more source
Continual Test-Time Adaptation for Robust Remote Photoplethysmography Estimation
Remote photoplethysmography (rPPG) estimation has made considerable progress by leveraging deep learning, yet its performance remains highly susceptible to domain shifts caused by lighting, skin tone, and movement, particularly during inference ...
Hyunwoo Lee +3 more
doaj +1 more source
Abstract Based on an analysis of the Old Literary Tibetan corpus—a corpus of the oldest documented Tibetic language—the present study provides evidence that literary Tibetan v3 verb stems (commonly termed ‘future’) initially encoded passive voice. New arguments put forward in this article range from Trans‐Himalayan nominal morphology to early Tibetan ...
Joanna Bialek
wiley +1 more source

