Results 61 to 70 of about 33,100
Advances and Perspectives in Graphene‐Based Quantum Dots Enabled Neuromorphic Devices
Graphene‐based QDs are zero‐dimensional carbon nanomaterials with pronounced quantum confinement and tunable electronic structures. Herein, we summarize their synthesis strategies and functionalization methods, and highlight their functional roles and operating mechanisms in devices, as well as recent advances in neuromorphic electronics. We anticipate ...
Yulin Zhen +9 more
wiley +1 more source
Cosine Prompt-Based Class Incremental Semantic Segmentation for Point Clouds
Although current 3D semantic segmentation methods have achieved significant success, they suffer from catastrophic forgetting when confronted with dynamic, open environments.
Lei Guo +5 more
doaj +1 more source
PackNet: Adding Multiple Tasks to a Single Network by Iterative Pruning
This paper presents a method for adding multiple tasks to a single deep neural network while avoiding catastrophic forgetting. Inspired by network pruning techniques, we exploit redundancies in large deep networks to free up parameters that can then be ...
Arun Mallya, Svetlana Lazebnik
core +1 more source
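The PackNet snippet above describes freeing parameters via pruning so a single network can absorb new tasks: the smallest-magnitude weights not claimed by earlier tasks are released, and a per-task mask protects what earlier tasks learned. A minimal sketch of that masking step, assuming a single weight matrix; the function and variable names are illustrative, not PackNet's actual API:

```python
import numpy as np

def prune_for_new_task(weights, free_fraction=0.5, frozen_mask=None):
    """Free the smallest-magnitude weights not already claimed by earlier tasks.

    Returns (task_mask, free_mask): task_mask marks weights kept for the
    current task; free_mask marks weights released for future tasks.
    """
    if frozen_mask is None:
        frozen_mask = np.zeros(weights.shape, dtype=bool)
    # Only weights not owned by earlier tasks are candidates for pruning.
    candidates = ~frozen_mask
    magnitudes = np.abs(weights[candidates])
    k = int(free_fraction * magnitudes.size)
    # The k-th smallest candidate magnitude becomes the pruning threshold.
    threshold = np.sort(magnitudes)[k] if k < magnitudes.size else np.inf
    # Current task keeps candidate weights at or above the threshold.
    task_mask = candidates & (np.abs(weights) >= threshold)
    free_mask = candidates & ~task_mask
    return task_mask, free_mask

# After pruning, the freed weights are zeroed and the network is retrained
# on the current task with gradient updates applied only where task_mask holds.
rng = np.random.default_rng(0)
W = rng.standard_normal((4, 4))
keep, free = prune_for_new_task(W, free_fraction=0.5)
```

In the full method this prune-and-retrain cycle repeats per task, with each task's mask added to `frozen_mask` so later training cannot disturb it.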
Machine learning interatomic potentials bridge quantum accuracy and computational efficiency for materials discovery. Architectures from Gaussian process regression to equivariant graph neural networks, training strategies including active learning and foundation models, and applications in solid‐state electrolytes, batteries, electrocatalysts ...
In Kee Park +19 more
wiley +1 more source
This study presents a novel framework that enhances the reliability of DNS traffic monitoring using a hybrid long short‐term memory‐deep neural network (LSTM‐DNN) architecture, enabling robust detection of adversarial DNS tunneling. The proposed framework leverages feature extraction from DNS traffic patterns, including domain request sequences, query ...
Ahmad Almadhor +5 more
wiley +1 more source
Solutions to the Catastrophic Forgetting Problem
In this paper we review three kinds of proposed solutions to the catastrophic forgetting problem in neural networks. The solutions are based on reducing hidden unit overlap, rehearsal, and pseudorehearsal mechanisms. We compare the methods and identify some underlying similarities.
openaire +1 more source
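Of the three solution families the snippet above names, pseudorehearsal is the most mechanical: random inputs are passed through the old network, and its outputs become pseudo-targets that are interleaved with new training data, approximating rehearsal without access to the original data. A minimal sketch under that reading; the names and the stand-in "old network" are illustrative assumptions:

```python
import numpy as np

def make_pseudo_items(old_net, n_items, input_dim, rng):
    """Pseudorehearsal: label random inputs with the old network's own
    outputs, capturing its learned function without the original data."""
    x = rng.uniform(-1.0, 1.0, size=(n_items, input_dim))
    y = old_net(x)  # the old network's responses become pseudo-targets
    return x, y

# Illustrative "old network": a fixed nonlinear map standing in for a
# previously trained model.
rng = np.random.default_rng(0)
W_old = rng.standard_normal((3, 2))
old_net = lambda x: np.tanh(x @ W_old)

px, py = make_pseudo_items(old_net, n_items=32, input_dim=3, rng=rng)
# Training on new items interleaved with (px, py) penalizes drift away
# from the old function, which is what discourages catastrophic forgetting.
```

Plain rehearsal is the same loop with stored real examples in place of `(px, py)`; reducing hidden-unit overlap instead changes the representation so tasks interfere less in the first place.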
Comparing the Latent Features of Universal Machine‐Learning Interatomic Potentials
This study quantitatively assesses how universal machine‐learning interatomic potentials encode the chemical space into latent features, showing unique model‐specific representations with high cross‐model reconstruction errors. It explores how training datasets, protocols, and targets affect these encodings.
Sofiia Chorna +5 more
wiley +1 more source
An Appraisal of Incremental Learning Methods
As a special case of machine learning, incremental learning can acquire useful knowledge from incoming data continuously without needing to access the original data.
Yong Luo +3 more
doaj +1 more source
Wellbeing in higher education (HE) in the United Kingdom has been increasingly prioritised by many institutions, with a growing demand for student support requests. There are various determinants in life that can influence mental health. As such, protected characteristics, including race, can indicate that students who are Black or Asian ...
Amy Bywater, Helen Keane
wiley +1 more source
Continual learning and catastrophic forgetting
Preprint of a book chapter; 21 pages, 4 ...
Gido M. van de Ven +2 more
openaire +3 more sources

