Results 51 to 60 of about 115,093
Contrastive Hebbian Learning with Random Feedback Weights
Neural networks are commonly trained to make predictions using learning algorithms. Contrastive Hebbian learning, a powerful rule inspired by gradient backpropagation, builds on Hebb's rule and the contrastive divergence algorithm.
Bartley, Travis +2 more
core +1 more source
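To make the idea in this entry concrete, the following is a minimal numpy sketch of contrastive Hebbian learning in which the top-down feedback path uses a fixed random matrix rather than the transpose of the forward weights. The network sizes, settling schedule, learning rate, and names such as W1, W2, B, and settle are illustrative assumptions, not details taken from the paper.

import numpy as np

# Toy dimensions and hyperparameters (illustrative assumptions, not from the paper).
rng = np.random.default_rng(0)
n_in, n_hid, n_out = 4, 8, 2

W1 = rng.normal(0.0, 0.1, (n_in, n_hid))   # input -> hidden (learned)
W2 = rng.normal(0.0, 0.1, (n_hid, n_out))  # hidden -> output (learned)
B = rng.normal(0.0, 0.1, (n_out, n_hid))   # fixed random feedback weights

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def settle(x, y_clamp=None, steps=20, gamma=0.5):
    """Relax hidden/output activities; the top-down signal flows through B."""
    h = np.zeros(n_hid)
    y = np.zeros(n_out)
    for _ in range(steps):
        y = sigmoid(h @ W2) if y_clamp is None else y_clamp
        h = sigmoid(x @ W1 + gamma * (y @ B))
    return h, y

x = rng.random(n_in)            # toy input
target = np.array([1.0, 0.0])   # toy target
lr = 0.1

for _ in range(100):
    h_free, y_free = settle(x)                   # negative (free) phase
    h_plus, y_plus = settle(x, y_clamp=target)   # positive (clamped) phase
    # Hebbian difference of co-activations between the two phases.
    W1 += lr * (np.outer(x, h_plus) - np.outer(x, h_free))
    W2 += lr * (np.outer(h_plus, y_plus) - np.outer(h_free, y_free))
    # B is deliberately left untouched (random feedback).

print("free-phase output after training:", settle(x)[1])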
Mapping the evolution of mitochondrial complex I through structural variation
Respiratory complex I (CI) is crucial for bioenergetic metabolism in many prokaryotes and eukaryotes. It is composed of a conserved set of core subunits and additional accessory subunits that vary depending on the organism. Here, we categorize CI subunits from available structures to map the evolution of CI across eukaryotes.
Dong‐Woo Shin +2 more
wiley +1 more source
Conditional Restricted Boltzmann Machines for Structured Output Prediction [PDF]
Conditional Restricted Boltzmann Machines (CRBMs) are rich probabilistic models that have recently been applied to a wide range of problems, including collaborative filtering, classification, and modeling motion capture data. While much progress has been
Hinton, Geoffrey E. +2 more
core +1 more source
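As a rough illustration of the model family named in this entry, here is a minimal sketch of a conditional RBM trained with one step of contrastive divergence (CD-1), in which a conditioning vector shifts the visible and hidden biases. Layer sizes, the learning rate, and names such as cd1_step are assumed for the example and are not the paper's configuration.

import numpy as np

# Sizes and hyperparameters below are illustrative assumptions.
rng = np.random.default_rng(1)
n_x, n_v, n_h = 5, 6, 10              # conditioning, visible (output), hidden sizes

W = rng.normal(0, 0.01, (n_v, n_h))   # visible-hidden weights
A = rng.normal(0, 0.01, (n_x, n_v))   # conditioning -> visible bias
B = rng.normal(0, 0.01, (n_x, n_h))   # conditioning -> hidden bias
a = np.zeros(n_v)
b = np.zeros(n_h)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cd1_step(x, v0, lr=0.05):
    """One CD-1 update of a conditional RBM on a single (x, v0) pair."""
    global W, A, B, a, b
    a_x = a + x @ A                   # condition-dependent visible bias
    b_x = b + x @ B                   # condition-dependent hidden bias
    # Positive phase.
    ph0 = sigmoid(v0 @ W + b_x)
    h0 = (rng.random(n_h) < ph0).astype(float)
    # Negative phase: one Gibbs step.
    pv1 = sigmoid(h0 @ W.T + a_x)
    v1 = (rng.random(n_v) < pv1).astype(float)
    ph1 = sigmoid(v1 @ W + b_x)
    # Parameter updates from the difference of correlations.
    W += lr * (np.outer(v0, ph0) - np.outer(v1, ph1))
    A += lr * np.outer(x, v0 - v1)
    B += lr * np.outer(x, ph0 - ph1)
    a += lr * (v0 - v1)
    b += lr * (ph0 - ph1)

x = rng.random(n_x)
v = (rng.random(n_v) < 0.5).astype(float)
for _ in range(50):
    cd1_step(x, v)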
Phenotypic subtyping via contrastive learning
Defining and accounting for subphenotypic structure has the potential to increase statistical power and provide a deeper understanding of the heterogeneity in the molecular basis of complex disease. Existing phenotype subtyping methods primarily rely on clinically observed heterogeneity or metadata clustering.
Aditya Gorla +5 more
openaire +2 more sources
Adiabatic Persistent Contrastive Divergence learning [PDF]
22 pages, 2 ...
Jang, Hyeryung +3 more
openaire +2 more sources
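The title above refers to persistent contrastive divergence (PCD); since the snippet is cut off, the sketch below shows only the generic PCD idea for a small binary RBM, where the negative "fantasy" chain persists across parameter updates instead of restarting at the data. It is not the adiabatic variant the paper proposes, and the sizes, learning rate, and toy data are assumptions.

import numpy as np

# Generic PCD for a small binary RBM (sizes and learning rate are assumptions).
rng = np.random.default_rng(4)
n_v, n_h, lr = 6, 12, 0.05

W = rng.normal(0, 0.01, (n_v, n_h))
a = np.zeros(n_v)                                  # visible bias
b = np.zeros(n_h)                                  # hidden bias
v_chain = (rng.random(n_v) < 0.5).astype(float)    # persistent negative particle

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

data = (rng.random((20, n_v)) < 0.5).astype(float)  # toy binary training data

for epoch in range(10):
    for v0 in data:
        ph0 = sigmoid(v0 @ W + b)                   # positive statistics
        # Advance the persistent chain by one Gibbs step (negative statistics).
        ph_chain = sigmoid(v_chain @ W + b)
        h_chain = (rng.random(n_h) < ph_chain).astype(float)
        v_chain = (rng.random(n_v) < sigmoid(h_chain @ W.T + a)).astype(float)
        ph_neg = sigmoid(v_chain @ W + b)
        # Gradient step: data correlations minus persistent-chain correlations.
        W += lr * (np.outer(v0, ph0) - np.outer(v_chain, ph_neg))
        a += lr * (v0 - v_chain)
        b += lr * (ph0 - ph_neg)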
Disordered but rhythmic—the role of intrinsic protein disorder in eukaryotic circadian timing
Unstructured domains known as intrinsically disordered regions (IDRs) are present in nearly every part of the eukaryotic core circadian oscillator. IDRs enable many diverse inter‐ and intramolecular interactions that support clock function. IDR conformations are highly tunable by post‐translational modifications and environmental conditions, which ...
Emery T. Usher, Jacqueline F. Pelham
wiley +1 more source
In recent years, contrastive learning has been a highly favored method for self-supervised representation learning, which significantly improves the unsupervised training of deep image models. Self-supervised learning is a subset of unsupervised learning
Bihi Sabiri +3 more
doaj +1 more source
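For the self-supervised contrastive learning mentioned above, the sketch below shows a minimal numpy InfoNCE-style loss: two augmented views of the same image should produce similar embeddings, with the rest of the batch serving as negatives. The batch size, embedding dimension, temperature tau, and the random stand-ins for encoder outputs are illustrative assumptions.

import numpy as np

# Batch size, embedding size, and temperature are illustrative assumptions.
rng = np.random.default_rng(2)
batch, dim, tau = 8, 16, 0.1

def normalize(z):
    return z / np.linalg.norm(z, axis=1, keepdims=True)

# Stand-ins for encoder outputs of two augmentations of the same image batch.
z1 = normalize(rng.normal(size=(batch, dim)))
z2 = normalize(rng.normal(size=(batch, dim)))

def info_nce(z_a, z_b, tau):
    """Cross-entropy where the positive for row i in z_a is row i in z_b."""
    logits = (z_a @ z_b.T) / tau                   # cosine similarities / temperature
    logits -= logits.max(axis=1, keepdims=True)    # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))             # positives sit on the diagonal

print("contrastive loss on random embeddings:", info_nce(z1, z2, tau))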
Cross-modal Deep Metric Learning with Multi-task Regularization
DNN-based cross-modal retrieval has become a research hotspot, by which users can search results across various modalities like image and text. However, existing methods mainly focus on the pairwise correlation and reconstruction error of labeled data ...
Huang, Xin, Peng, Yuxin
core +1 more source
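To illustrate the pairwise cross-modal correlation this entry refers to, here is a minimal sketch of a triplet-style ranking loss that pushes an image embedding closer to its matching text embedding than to a non-matching one by a margin. The embedding dimension, margin value, and toy vectors are assumptions, not the paper's actual model.

import numpy as np

# Dimension and margin are illustrative assumptions.
rng = np.random.default_rng(3)
dim, margin = 32, 0.2

img = rng.normal(size=dim)                    # embedding of an image
txt_pos = img + 0.1 * rng.normal(size=dim)    # matching text embedding
txt_neg = rng.normal(size=dim)                # non-matching text embedding

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def ranking_loss(anchor, positive, negative, margin):
    """Hinge on the similarity gap between the matched and unmatched pair."""
    return max(0.0, margin - cosine(anchor, positive) + cosine(anchor, negative))

print("cross-modal ranking loss:", ranking_loss(img, txt_pos, txt_neg, margin))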
Liquid biopsy epigenetics: establishing a molecular profile based on cell‐free DNA
Cell‐free DNA (cfDNA) fragments in plasma from cancer patients carry epigenetic signatures reflecting their cells of origin. These epigenetic features include DNA methylation, nucleosome modifications, and variations in fragmentation. This review describes the biological properties of each feature and explores optimal strategies for harnessing cfDNA ...
Christoffer Trier Maansson +2 more
wiley +1 more source
With the continuous improvement in the volume and spatial resolution of remote sensing images, the self-supervised contrastive learning paradigm driven by a large amount of unlabeled data is expected to be a promising solution for large-scale land cover ...
Zhaoyang Zhang +4 more
doaj +1 more source

