Results 301 to 310 of about 115,093
Some of the following articles may not be open access.

Masked Contrastive Representation Learning for Reinforcement Learning

IEEE Transactions on Pattern Analysis and Machine Intelligence, 2022
In pixel-based reinforcement learning (RL), the states are raw video frames, which are mapped into hidden representations before being fed to a policy network. To improve the sample efficiency of state representation learning, the most prominent recent work is based on contrastive unsupervised representation learning.
Jinhua Zhu   +7 more
openaire   +2 more sources
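For context, here is a minimal PyTorch sketch of the contrastive pre-training pattern this line of work builds on — two augmented views of the same frame form a positive pair under an InfoNCE loss, and the learned latent states would then feed a policy network. This is an illustrative sketch, not the paper's masked formulation; the network sizes, noise-based "augmentations", and batch shapes are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class FrameEncoder(nn.Module):
    """Maps raw pixel observations to a compact, L2-normalized latent state."""
    def __init__(self, latent_dim=50):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2), nn.ReLU(),
            nn.Conv2d(32, 32, 3, stride=2), nn.ReLU(),
            nn.Flatten(),
        )
        self.fc = nn.LazyLinear(latent_dim)  # infers the flattened size on first call

    def forward(self, x):
        return F.normalize(self.fc(self.conv(x)), dim=-1)

def info_nce(query, key, temperature=0.1):
    """InfoNCE loss: the positive for each query is the key at the same batch index."""
    logits = query @ key.t() / temperature      # (B, B) similarity logits
    labels = torch.arange(query.size(0))        # positives lie on the diagonal
    return F.cross_entropy(logits, labels)

encoder = FrameEncoder()
frames = torch.rand(8, 3, 84, 84)                   # a batch of raw video frames
view_a = frames + 0.01 * torch.randn_like(frames)   # stand-ins for two random augmentations
view_b = frames + 0.01 * torch.randn_like(frames)
loss = info_nce(encoder(view_a), encoder(view_b))
loss.backward()  # gradients flow into the encoder; a policy network would consume encoder(frames)
```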

Contrastive Multi-View Kernel Learning

IEEE Transactions on Pattern Analysis and Machine Intelligence, 2023
The kernel method is a proven technique in multi-view learning: it implicitly defines a Hilbert space in which samples can be linearly separated. Most kernel-based multi-view learning algorithms compute a kernel function that aggregates and compresses the views into a single kernel.
Jiyuan Liu   +4 more
openaire   +2 more sources
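As a point of reference for the single-kernel baseline the abstract describes (not the paper's contrastive kernel formulation), the sketch below computes one RBF Gram matrix per view and aggregates the views into a single precomputed kernel by a weighted sum. The view dimensions and uniform weights are illustrative assumptions.

```python
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel

rng = np.random.default_rng(0)
n_samples = 100
# Three views of the same 100 samples, with different feature dimensions.
views = [rng.normal(size=(n_samples, d)) for d in (16, 32, 8)]

per_view_kernels = [rbf_kernel(v) for v in views]   # one (n, n) Gram matrix per view
weights = np.full(len(views), 1.0 / len(views))     # uniform combination weights
combined = sum(w * K for w, K in zip(weights, per_view_kernels))

# `combined` can now be passed to any kernel method, e.g. an SVM with
# kernel="precomputed" or kernel k-means; a contrastive approach would instead
# learn how the per-view kernels should agree rather than simply compressing them.
```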

Relation-aware Graph Contrastive Learning

Parallel Processing Letters, 2023
Over the past few years, graph contrastive learning (GCL) has achieved great success in processing unlabeled graph-structured data, but most existing GCL methods are based on the instance discrimination task, which typically learns representations by minimizing the distance between two augmented versions of the same instance.
Bingshi Li, Jin Li, Yang-Geng Fu
openaire   +2 more sources
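To illustrate the instance-discrimination setup described above (not the relation-aware method itself), the NumPy sketch below builds two views of a small graph by dropping edges, embeds nodes with one mean-aggregation step, and forms the similarity matrix whose diagonal entries act as positives. The graph size, drop rate, and aggregation rule are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n_nodes = 6
upper = np.triu((rng.random((n_nodes, n_nodes)) < 0.5).astype(float), 1)
adj = upper + upper.T                       # symmetric adjacency, no self-loops
feats = rng.normal(size=(n_nodes, 8))       # node features

def drop_edges(a, rate=0.2):
    """Create one augmented view by randomly removing a fraction of edges."""
    keep = np.triu(rng.random(a.shape) > rate, 1)
    return a * (keep + keep.T)

def embed(a, x):
    """One mean-aggregation step over neighbors (plus self), then L2-normalize."""
    deg = a.sum(axis=1, keepdims=True) + 1.0
    h = (a @ x + x) / deg
    return h / np.linalg.norm(h, axis=1, keepdims=True)

z1 = embed(drop_edges(adj), feats)
z2 = embed(drop_edges(adj), feats)
sim = z1 @ z2.T                             # node i in view 1 vs. every node in view 2

# Instance discrimination pulls the diagonal (the same node seen in two views) up
# and pushes off-diagonal entries down; relation-aware variants additionally model
# how different instances relate, rather than treating every off-diagonal pair as
# a pure negative.
```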

Multitask Causal Contrastive Learning

IEEE Transactions on Neural Networks and Learning Systems
Multitask learning (MTL) aims to improve the performance of multiple tasks by sharing knowledge among them, and it has attracted increasing interest and shown success in various fields. However, MTL often suffers from negative transfer, since the model may utilize useless features and face interference among tasks' optimization ...
Chaoyang Li   +4 more
openaire   +2 more sources

Learning MRI Contrast-Agnostic Registration

2021 IEEE 18th International Symposium on Biomedical Imaging (ISBI), 2021
We introduce a strategy for learning image registration without acquired imaging data, producing powerful networks agnostic to magnetic resonance imaging (MRI) contrast. While classical methods accurately estimate the spatial correspondence between images, they solve an optimization problem for every new image pair.
Malte Hoffmann   +4 more
openaire   +2 more sources

Contrastive Dual Gating: Learning Sparse Features With Contrastive Learning

2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2022
Jian Meng   +4 more
openaire   +1 more source

Multiscale Subgraph Adversarial Contrastive Learning

IEEE Transactions on Neural Networks and Learning Systems
Graph contrastive learning (GCL), as a typical self-supervised learning paradigm, has been able to achieve promising performance without labels and has gradually attracted much attention. Graph-level methods aim to learn representations of each graph by contrasting two augmented graphs.
Yanbei Liu   +6 more
openaire   +2 more sources

Pyramid contrastive learning for clustering

Neural Networks
With its ability to perform joint representation learning and clustering via deep neural networks, deep clustering has gained significant attention in recent years. Despite the considerable progress, most previous deep clustering methods still suffer from three critical limitations.
Zi-Feng Zhou   +2 more
openaire   +2 more sources

Geometric Contrastive Learning

2023 IEEE/CVF International Conference on Computer Vision Workshops (ICCVW), 2023
Yeskendir Koishekenov   +3 more
openaire   +1 more source

Contrastive learning in brain imaging

Computerized Medical Imaging and Graphics
Contrastive learning is a deep learning technique that aims to classify data or examples without requiring data labeling. Instead, it learns the most representative features by contrasting positive and negative pairs of examples. In the literature on contrastive learning, the terms positive examples and negative examples do not refer to whether the ...
Xiaoyin Xu, Stephen T.C. Wong
openaire   +2 more sources
