Results 21 to 30 of about 546,017

Self-Supervised Self-Supervision by Combining Deep Learning and Probabilistic Logic

open access: yes · Proceedings of the AAAI Conference on Artificial Intelligence, 2021
Labeling training examples at scale is a perennial challenge in machine learning. Self-supervision methods compensate for the lack of direct supervision by leveraging prior knowledge to automatically generate noisy labeled examples. Deep probabilistic logic (DPL) is a unifying framework for self-supervised learning that represents unknown labels as ...
Lang, Hunter; Poon, Hoifung
openaire   +2 more sources

Mixup Feature: A Pretext Task Self-Supervised Learning Method for Enhanced Visual Feature Learning

open access: yes · IEEE Access, 2023
Self-supervised learning has emerged as an increasingly popular research topic within the field of computer vision. In this study, we propose a novel self-supervised learning approach based on Mixup features as pretext tasks.
Jiashu Xu, Sergii Stirenko
doaj   +1 more source
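
The snippet above names Mixup features as the pretext signal but the truncated abstract omits the construction. A minimal sketch of the standard Mixup interpolation the title refers to (the Beta-distributed mixing coefficient is the usual convention; the parameters here are illustrative, not the paper's settings):

```python
import numpy as np

def mixup(x1, x2, alpha=1.0, rng=None):
    """Convex combination of two inputs, as in standard Mixup.

    Draws lam ~ Beta(alpha, alpha) and returns lam*x1 + (1-lam)*x2
    together with the coefficient, so a pretext task can regress or
    classify against the mixing ratio.
    """
    rng = rng or np.random.default_rng(0)
    lam = rng.beta(alpha, alpha)
    return lam * x1 + (1.0 - lam) * x2, lam
```

A self-supervised pretext task can then, for example, train an encoder to predict `lam` from the mixed input alone.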

Enhancing IoT Network Security: Unveiling the Power of Self-Supervised Learning against DDoS Attacks

open access: yes · Sensors, 2023
The Internet of Things (IoT), projected to exceed 30 billion active device connections globally by 2025, presents an expansive attack surface. The frequent collection and dissemination of confidential data on these devices exposes them to significant ...
Josue Genaro Almaraz-Rivera   +2 more
doaj   +1 more source

Remote Sensing Image Scene Classification with Self-Supervised Learning Based on Partially Unlabeled Datasets

open access: yes · Remote Sensing, 2022
In recent years, supervised learning, represented by deep learning, has shown good performance in remote sensing image scene classification with its powerful feature learning ability. However, this method requires large-scale and high-quality handcrafted ...
Xiliang Chen, Guobin Zhu, Mingqing Liu
doaj   +1 more source

Biased Self-supervised Learning for ASR

open access: yes · INTERSPEECH, 2023
Self-supervised learning via masked prediction pre-training (MPPT) has shown impressive performance on a range of speech-processing tasks. This paper proposes a method to bias self-supervised learning towards a specific task. The core idea is to slightly finetune the model that is used to obtain the target sequence. This leads to better performance and ...
Kreyssig, Florian L.   +5 more
openaire   +2 more sources

Benchmarking the Semi-Supervised Naïve Bayes Classifier [PDF]

open access: yes, 2015
Semi-supervised learning involves constructing predictive models with both labelled and unlabelled training data. The need for semi-supervised learning is driven by the fact that unlabelled data are often easy and cheap to obtain, whereas labelling data ...
Bagnall, Anthony   +2 more
core   +1 more source

Homomorphic Self-Supervised Learning

open access: yes, 2022
In this work, we observe that many existing self-supervised learning algorithms can be both unified and generalized when seen through the lens of equivariant representations. Specifically, we introduce a general framework we call Homomorphic Self-Supervised Learning, and theoretically show how it may subsume the use of input-augmentations provided an ...
Keller, T. Anderson   +2 more
openaire   +2 more sources

SSDL: Self-Supervised Dictionary Learning [PDF]

open access: yes · 2021 IEEE International Conference on Multimedia and Expo (ICME), 2021
Label-embedded dictionary learning (DL) algorithms generate influential dictionaries by introducing discriminative information. However, a limitation remains: all label-embedded DL methods depend on labels, so they achieve ideal performance only in the supervised setting.
Shao, Shuai   +5 more
openaire   +2 more sources

Self-Supervised Node Classification with Strategy and Actively Selected Labeled Set

open access: yes · Entropy, 2022
To alleviate the impact of insufficient labels in less-labeled classification problems, self-supervised learning improves the performance of graph neural networks (GNNs) by focusing on the information of unlabeled nodes.
Yi Kang   +3 more
doaj   +1 more source

Credal Self-Supervised Learning

open access: yes, 2021
Self-training is an effective approach to semi-supervised learning. The key idea is to let the learner itself iteratively generate "pseudo-supervision" for unlabeled instances based on its current hypothesis. In combination with consistency regularization, pseudo-labeling has shown promising performance in various domains, for example in computer ...
Lienen, Julian; Hüllermeier, Eyke
openaire   +3 more sources
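
The self-training loop this abstract describes (fit a model, pseudo-label confident unlabeled instances, retrain) can be sketched generically. The nearest-centroid learner, softmax-style confidence, and 0.9 threshold below are illustrative stand-ins, not the paper's credal construction:

```python
import numpy as np

def self_train(X_lab, y_lab, X_unlab, threshold=0.9, rounds=5):
    """Generic self-training: repeatedly fit a simple learner, pseudo-label
    unlabeled points whose confidence exceeds `threshold`, and fold them
    into the labeled set. Returns the augmented (X, y)."""
    X, y = np.asarray(X_lab, float), np.asarray(y_lab)
    pool = np.asarray(X_unlab, float)
    for _ in range(rounds):
        if len(pool) == 0:
            break
        # Stand-in learner: one centroid per class.
        classes = np.unique(y)
        centroids = np.stack([X[y == c].mean(axis=0) for c in classes])
        # Distance to each centroid, turned into a softmax confidence.
        d = np.linalg.norm(pool[:, None, :] - centroids[None, :, :], axis=2)
        p = np.exp(-d) / np.exp(-d).sum(axis=1, keepdims=True)
        conf, pred = p.max(axis=1), classes[p.argmax(axis=1)]
        keep = conf >= threshold
        if not keep.any():
            break  # nothing confident enough to pseudo-label
        X = np.vstack([X, pool[keep]])
        y = np.concatenate([y, pred[keep]])
        pool = pool[~keep]
    return X, y
```

The credal variant discussed in the paper replaces the single pseudo-label with a set-valued (credal) target; the loop structure above is the common skeleton both share.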
