Results 11 to 20 of about 28,537

Class-Wise Classifier Design Capable of Continual Learning Using Adaptive Resonance Theory-Based Topological Clustering

open access: yes (Applied Sciences, 2023)
This paper proposes a supervised classification algorithm capable of continual learning by utilizing an Adaptive Resonance Theory (ART)-based growing self-organizing clustering algorithm.
Naoki Masuyama   +3 more
doaj   +1 more source

Continual Barlow Twins: Continual Self-Supervised Learning for Remote Sensing Semantic Segmentation

open access: yes (IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 2023)
In the field of earth observation (EO), continual learning (CL) algorithms have been proposed to deal with large datasets by decomposing them into several subsets and processing them incrementally.
Valerio Marsocci, Simone Scardapane
doaj   +1 more source

CLRS: Continual Learning Benchmark for Remote Sensing Image Scene Classification

open access: yes (Sensors, 2020)
Remote sensing image scene classification has high application value in agricultural, military, and other fields. A large amount of remote sensing data is obtained every day. After learning the new batch data, scene classification algorithms …
Haifeng Li   +6 more
doaj   +1 more source

Extensible Steganalysis via Continual Learning

open access: yes (Fractal and Fractional, 2022)
To realize secure communication, steganography is usually implemented by embedding secret information into an image selected from a natural image dataset, in which fractal images occupy a considerable proportion.
Zhili Zhou   +3 more
doaj   +1 more source

Hebbian Continual Representation Learning

open access: yes (Proceedings of the Annual Hawaii International Conference on System Sciences, 2023)
Continual Learning aims to bring machine learning into a more realistic scenario, where tasks are learned sequentially and the i.i.d. assumption is not preserved. Although this setting is natural for biological systems, it proves very difficult for machine learning models such as artificial neural networks.
Morawiecki, Pawel   +3 more
openaire   +4 more sources

Homeostasis-Inspired Continual Learning: Learning to Control Structural Regularization

open access: yes (IEEE Access, 2021)
Learning continually without forgetting might be one of the ultimate goals for building artificial intelligence (AI). However, unless there are enough resources equipped, forgetting knowledge acquired in the past is inevitable.
Joonyoung Kim   +3 more
doaj   +1 more source

Visual Tracking by Adaptive Continual Meta-Learning

open access: yes (IEEE Access, 2022)
We formulate the visual tracking problem as a semi-supervised continual learning problem, where only an initial frame is labeled. In contrast to conventional meta-learning based approaches that regard visual tracking as an instance detection problem with …
Janghoon Choi   +4 more
doaj   +1 more source

Exploration of the continual learning ability that supports the application ecological evolution of the large-scale pretraining Peng Cheng series open source models

open access: yes (智能科学与技术学报 [Chinese Journal of Intelligent Science and Technology], 2022)
Large-scale pre-training models have achieved great success in the field of natural language processing by using large-scale corpora and pre-training tasks. With the gradual development of large models, the continual learning ability of large models has …
Yue YU   +5 more
doaj  

A review of continual learning for robotics

open access: yes (智能科学与技术学报 [Chinese Journal of Intelligent Science and Technology], 2022)
One of the limitations of robotics is that it is difficult for robots to adapt to fickle tasks. A robot will inevitably forget the knowledge from old environments or tasks when facing new environments or tasks. In order to summarize research in continual …
Chao ZHAO   +4 more
doaj  

Survey of Pre-training-based Continual Learning Methods (Invited) [PDF]

open access: yes (Jisuanji gongcheng [Computer Engineering])
Traditional machine learning algorithms perform well only when the training and testing sets are identically distributed; they cannot perform incremental learning for new categories or tasks that were not present in the original training set.
LU Yue, ZHOU Xiangyu, ZHANG Shizhou, LIANG Guoqiang, XING Yinghui, CHENG De, ZHANG Yanning
doaj   +1 more source
