Results 11 to 20 of about 115,093 (312)

Al-Takhlil al-Taqabuly fi Ta'lim al-Lughah al-'Arabiyyah (Contrastive Analysis in Teaching the Arabic Language)

open access: yes, Jurnal Al Bayan: Jurnal Jurusan Pendidikan Bahasa Arab, 2020
This article systematically explains the nature of contrastive analysis in language learning. Throughout the article the writer traces the development of contrastive analysis in the field of language learning, its main objectives, and some of its hypotheses ...
Ahmad Bukhari Muslim
doaj   +1 more source

Grouped Contrastive Learning of Self-Supervised Sentence Representation

open access: yes, Applied Sciences, 2023
This paper proposes a method called Grouped Contrastive Learning of self-supervised Sentence Representation (GCLSR), which can learn an effective and meaningful representation of sentences. Previous works maximize the similarity between two vectors to be ...
Qian Wang   +3 more
doaj   +1 more source
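
The GCLSR snippet above mentions the core contrastive idea of maximizing the similarity between two vectors that represent the same sentence. A minimal numpy sketch of an InfoNCE-style contrastive loss, in which row i of two batches forms a positive pair and all other rows act as negatives (function and parameter names are hypothetical, not from the paper):

```python
import numpy as np

def info_nce_loss(z1, z2, temperature=0.5):
    """InfoNCE-style contrastive loss over a batch of paired embeddings.

    z1, z2: (N, D) arrays; row i of z1 and row i of z2 are a positive
    pair, while every other row of z2 serves as a negative for z1[i].
    """
    # L2-normalise so the dot product is cosine similarity
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / temperature             # (N, N) similarity matrix
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))           # positives on the diagonal
```

Minimizing this loss pulls each positive pair together while pushing it away from the in-batch negatives; the temperature controls how sharply hard negatives are weighted.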

Efficient Learning for Undirected Topic Models [PDF]

open access: yes, 2015
Replicated Softmax model, a well-known undirected topic model, is powerful in extracting semantic representations of documents. Traditional learning strategies such as Contrastive Divergence are very inefficient.
Gu, Jiatao, Li, Victor O. K.
core   +2 more sources
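
For reference, the Contrastive Divergence strategy that the abstract above calls inefficient can be sketched in a few lines for a binary restricted Boltzmann machine. This is a generic CD-1 update in numpy, not the paper's Replicated Softmax variant; all names are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_step(v0, W, b, c, lr=0.1):
    """One Contrastive Divergence (CD-1) update for a binary RBM.

    v0: (N, V) batch of visible vectors; W: (V, H) weights;
    b: (V,) visible biases; c: (H,) hidden biases.
    """
    # positive phase: hidden activations driven by the data
    ph0 = sigmoid(v0 @ W + c)
    h0 = (rng.random(ph0.shape) < ph0).astype(float)
    # negative phase: a single step of Gibbs sampling
    pv1 = sigmoid(h0 @ W.T + b)
    v1 = (rng.random(pv1.shape) < pv1).astype(float)
    ph1 = sigmoid(v1 @ W + c)
    # update from the difference of data and model correlations
    n = v0.shape[0]
    W += lr * (v0.T @ ph0 - v1.T @ ph1) / n
    b += lr * (v0 - v1).mean(axis=0)
    c += lr * (ph0 - ph1).mean(axis=0)
    return W, b, c
```

The inefficiency the abstract refers to comes from the Gibbs-sampling negative phase, which must be rerun for every gradient step.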

Hyperbolic Contrastive Learning

open access: yes, 2023
Learning good image representations that are beneficial to downstream tasks is a challenging task in computer vision. As such, a wide variety of self-supervised learning approaches have been proposed. Among them, contrastive learning has shown competitive performance on several benchmark datasets.
Yue, Yun   +3 more
openaire   +2 more sources
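
Hyperbolic approaches like the one above typically replace Euclidean similarity with distance in the Poincaré ball. A sketch of the standard Poincaré-ball geodesic distance, which such methods commonly build their contrastive objective on (this is the textbook formula, not necessarily this paper's exact formulation):

```python
import numpy as np

def poincare_distance(u, v):
    """Geodesic distance between two points inside the unit Poincare ball
    (both ||u|| and ||v|| must be strictly less than 1)."""
    diff = np.linalg.norm(u - v) ** 2
    denom = (1 - np.linalg.norm(u) ** 2) * (1 - np.linalg.norm(v) ** 2)
    return np.arccosh(1 + 2 * diff / denom)
```

Distances blow up near the boundary of the ball, which is what lets hyperbolic embeddings represent tree-like hierarchies with low distortion.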

Self-supervised Dynamic Graph Representation Learning Approach Based on Contrastive Prediction [PDF]

open access: yes, Jisuanji kexue, 2023
In recent years, graph self-supervised learning, represented by graph contrastive learning, has become a hot research topic in the field of graph learning. This learning paradigm does not depend on node labels and has good generalization ability. However ...
JIANG Linpu, CHEN Kejia
doaj   +1 more source

Stochastic Contrastive Learning

open access: yes, 2021
While state-of-the-art contrastive Self-Supervised Learning (SSL) models produce results competitive with their supervised counterparts, they lack the ability to infer latent variables. In contrast, prescribed latent variable (LV) models enable attributing uncertainty, inducing task specific compression, and in general allow for more interpretable ...
Ramapuram, Jason   +3 more
openaire   +2 more sources

SC-FGCL: Self-Adaptive Cluster-Based Federal Graph Contrastive Learning

open access: yes, IEEE Open Journal of the Computer Society, 2023
As a self-supervised learning method, graph contrastive learning achieves admirable performance in graph pre-training tasks and can be fine-tuned for multiple downstream tasks such as protein structure prediction, social recommendation, etc.
Tingqi Wang   +4 more
doaj   +1 more source

Equivariant Contrastive Learning

open access: yes, 2021
In state-of-the-art self-supervised learning (SSL), pre-training produces semantically good representations by encouraging them to be invariant under meaningful transformations prescribed from human knowledge. In fact, the property of invariance is a trivial instance of a broader class called equivariance, which can be intuitively understood as the ...
Dangovski, Rumen   +7 more
openaire   +2 more sources
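
The invariance/equivariance distinction in the abstract above can be made concrete with 2-D rotations: an invariant feature is unchanged by the transformation, while an equivariant feature transforms along with the input. A toy numpy illustration (names are illustrative, not from the paper):

```python
import numpy as np

def rotate(x, theta):
    """The transformation T: rotate a 2-D point by angle theta."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]]) @ x

def invariant_feature(x):
    """Invariant: f(T(x)) == f(x); rotation leaves the norm unchanged."""
    return np.linalg.norm(x)

def equivariant_feature(x):
    """Equivariant: f(T(x)) == T'(f(x)); here f is the identity, so T' = T."""
    return x
```

Contrastive SSL usually enforces only the invariance property; equivariant variants additionally require the representation to track the transformation itself.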

Contrastive Fairness in Machine Learning [PDF]

open access: yes, IEEE Letters of the Computer Society, 2020
Was it fair that Harry was hired but not Barry? Was it fair that Pam was fired instead of Sam? How can one ensure fairness when an intelligent algorithm takes these decisions instead of a human? How can one ensure that the decisions were taken based on merit and not on protected attributes like race or sex? These are the questions that must be answered ...
Tapabrata Chakraborti   +2 more
openaire   +2 more sources

Signal Contrastive Enhanced Graph Collaborative Filtering for Recommendation

open access: yes, Data Science and Engineering, 2023
Graph collaborative filtering methods have shown great performance improvements compared with deep neural network-based models. However, these methods suffer from data sparsity and data noise problems.
Zhi-Yuan Li   +3 more
doaj   +1 more source
