Results 101 to 110 of about 5,885,991
Forward Models: Supervised Learning with a Distal Teacher [PDF]
Michael I. Jordan, David E. Rumelhart
openalex +2 more sources
A General Approach for Achieving Supervised Subspace Learning in Sparse Representation
Over the past few decades, a large family of subspace learning algorithms based on dictionary learning has been designed to provide different solutions for learning subspace features.
Jianshun Sang +2 more
doaj +1 more source
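For orientation only, below is a minimal sketch of generic dictionary learning and sparse coding using scikit-learn's DictionaryLearning. It illustrates the sparse-representation setting the snippet refers to, not the specific supervised subspace method the paper proposes; the data, dimensions, and parameters are all placeholders.

```python
# Hypothetical sketch of dictionary-based sparse representation (not the
# paper's supervised subspace method): learn a dictionary D and sparse
# codes so that X is approximated by codes @ D.
import numpy as np
from sklearn.decomposition import DictionaryLearning

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 64))           # 200 samples, 64-dim features

dico = DictionaryLearning(
    n_components=32,                          # number of dictionary atoms
    alpha=1.0,                                # sparsity penalty on the codes
    max_iter=50,
    transform_algorithm="lasso_lars",
    random_state=0,
)
codes = dico.fit_transform(X)                 # sparse codes, shape (200, 32)
D = dico.components_                          # dictionary atoms, shape (32, 64)

# Quality of the sparse approximation X ~ codes @ D
err = np.linalg.norm(X - codes @ D) / np.linalg.norm(X)
print(f"relative reconstruction error: {err:.3f}")
print(f"avg. nonzeros per code: {np.mean(np.count_nonzero(codes, axis=1)):.1f}")
```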
Parental Guidance and Supervised Learning [PDF]
We propose a simple theoretical model of supervised learning that is potentially useful to interpret a number of empirical phenomena relevant to the nature-nurture debate. The model captures a basic trade-off between sheltering the child from the consequences of his mistakes, and allowing him to learn from experience.
Alessandro Lizzeri, Marciano Siniscalchi
openaire +5 more sources
ICU‐EEG Pattern Detection by a Convolutional Neural Network
Objective Patients in the intensive care unit (ICU) often require continuous EEG (cEEG) monitoring due to the high risk of seizures and rhythmic and periodic patterns (RPPs). However, interpreting cEEG in real time is resource‐intensive and heavily relies on specialized expertise, which is not always available.
Giulio Degano +5 more
wiley +1 more source
Objective Cognitive impairment (CI) is common in patients with systemic lupus erythematosus (SLE). Despite its prevalence, the immune mechanisms are not well understood. We previously reported elevated serum levels of S100A8/A9 and matrix metalloproteinase 9 (MMP‐9) in patients with SLE and CI.
Carolina Muñoz‐Grajales +18 more
wiley +1 more source
Adversarial Dropout for Supervised and Semi-Supervised Learning
Recently, training with adversarial examples, which are generated by adding a small but worst-case perturbation to input examples, has improved the generalization performance of neural networks. In contrast to these biased individual inputs used to enhance generality, this paper introduces adversarial dropout, a minimal set of ...
Park, Sungrae +3 more
openaire +2 more sources
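As a point of reference, here is a minimal PyTorch sketch of the adversarial-example step the snippet mentions (an FGSM-style worst-case input perturbation). Adversarial dropout itself perturbs dropout masks rather than inputs and is not reproduced here; the model, batch shapes, and epsilon are illustrative assumptions.

```python
# Hedged sketch of the adversarial-example inner step (FGSM-style), not the
# paper's adversarial dropout: take a worst-case gradient-sign step on the
# input and re-evaluate the loss on the perturbed batch.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 2))
loss_fn = nn.CrossEntropyLoss()

x = torch.randn(8, 20, requires_grad=True)   # batch of inputs
y = torch.randint(0, 2, (8,))                # labels

loss = loss_fn(model(x), y)
loss.backward()                               # populates x.grad

epsilon = 0.1                                 # perturbation budget (assumed)
x_adv = (x + epsilon * x.grad.sign()).detach()  # worst-case step on the input

adv_loss = loss_fn(model(x_adv), y)           # trained on in the outer loop
print(float(loss), float(adv_loss))
```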
Is Self-Supervised Learning More Robust Than Supervised Learning?
Self-supervised contrastive learning is a powerful tool to learn visual representation without labels. Prior work has primarily focused on evaluating the recognition accuracy of various pre-training algorithms, but has overlooked other behavioral aspects.
Zhong, Yuanyi +4 more
openaire +2 more sources
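As context for the self-supervised contrastive learning mentioned above, a minimal sketch of a SimCLR-style InfoNCE objective on paired embeddings follows. The paper's actual pre-training pipelines and robustness evaluations are not shown; the function name, batch size, and embedding width are assumptions.

```python
# Hedged sketch of a contrastive (InfoNCE) loss: embeddings of two augmented
# views of the same images should match on the diagonal of the similarity
# matrix and mismatch everywhere else.
import torch
import torch.nn.functional as F

def info_nce(z1, z2, temperature=0.5):
    """z1, z2: (N, d) embeddings of two views of the same N images."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature        # (N, N) cosine-similarity logits
    targets = torch.arange(z1.size(0))        # positives sit on the diagonal
    return F.cross_entropy(logits, targets)

z1, z2 = torch.randn(16, 128), torch.randn(16, 128)
print(info_nce(z1, z2).item())
```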
Objectives This study aims to develop hip morphology‐based radiographic hip osteoarthritis (RHOA) risk prediction models and investigates the added predictive value of hip morphology measurements and the generalizability to different populations. Methods We combined data from nine prospective cohort studies participating in the World COACH consortium ...
Myrthe A. van den Berg +26 more
wiley +1 more source
Using Genetic Algorithms for Supervised Concept Learning [PDF]
William M. Spears, Kenneth De Jong
openalex +2 more sources
Self-supervised pretraining has been observed to be effective at improving feature representations for transfer learning, leveraging large amounts of unlabelled data.
Blake VanBerlo +2 more
doaj +1 more source