Results 1 to 10 of about 11,928,790

Active paper for active learning [PDF]

open access: hybrid. Research in Learning Technology, 1998
Paper documents have great advantages in readability, portability and familiarity, but are necessarily static and slow to update. Much recent research has concentrated on the dynamic demonstrations, immediate feedback, and easy updating that can be ...
Heather Brown   +5 more
doaj   +6 more sources

Deep Bayesian Active Learning with Image Data [PDF]

open access: yes. arXiv, 2017
Even though active learning forms an important pillar of machine learning, deep learning tools are not prevalent within it. Deep learning poses several difficulties when used in an active learning setting. First, active learning (AL) methods generally rely on being able to learn and update models from small amounts of data.
Y. Gal, Riashat Islam, Zoubin Ghahramani
arxiv   +2 more sources
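
As a point of reference for the snippet above: a minimal sketch of an uncertainty acquisition built on Monte Carlo dropout, in the spirit of Bayesian deep active learning (the model, the pool tensor, and the batch size below are illustrative assumptions, not the paper's code):

```python
# Minimal sketch (assumption): entropy-based acquisition under Monte Carlo dropout.
# `model` is any dropout-equipped classifier; `pool_x` is a tensor of unlabeled inputs.
import torch
import torch.nn.functional as F

def mc_dropout_entropy(model, pool_x, n_samples=20):
    """Predictive entropy with dropout kept active at inference time."""
    model.train()  # keep dropout layers stochastic
    with torch.no_grad():
        probs = torch.stack(
            [F.softmax(model(pool_x), dim=1) for _ in range(n_samples)]
        ).mean(dim=0)  # average class probabilities over stochastic forward passes
    return -(probs * probs.clamp_min(1e-12).log()).sum(dim=1)  # entropy per pool point

def acquire(model, pool_x, batch_size=10):
    """Select the pool points with the highest predictive entropy for labeling."""
    scores = mc_dropout_entropy(model, pool_x)
    return torch.topk(scores, k=batch_size).indices
```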

Mind Your Outliers! Investigating the Negative Impact of Outliers on Active Learning for Visual Question Answering [PDF]

open access: yes. arXiv, 2021
Active learning promises to alleviate the massive data needs of supervised machine learning: it has successfully improved sample efficiency by an order of magnitude on traditional tasks like topic classification and object recognition. However, we uncover a striking contrast to this promise: across 5 models and 4 datasets on the task of visual question answering ...
Siddharth Karamcheti   +3 more
arxiv   +3 more sources

Learning a Policy for Opportunistic Active Learning [PDF]

open access: yes. EMNLP, 2018
Active learning identifies data points to label that are expected to be the most useful in improving a supervised model. Opportunistic active learning incorporates active learning into interactive tasks that constrain possible queries during interactions.
Raymond J. Mooney   +2 more
arxiv   +3 more sources
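
For context on the generic setting the abstract refers to, here is a bare-bones pool-based active learning loop with least-confidence sampling; the dataset split, classifier, and query size are illustrative assumptions rather than anything from the paper:

```python
# Illustrative sketch only: a generic pool-based active learning loop with
# least-confidence sampling. Classifier choice, pool split, and sizes are assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

def least_confident(model, X_pool, k):
    """Indices of the k pool points on which the model is least confident."""
    confidence = model.predict_proba(X_pool).max(axis=1)
    return np.argsort(confidence)[:k]

def active_learning_loop(X_lab, y_lab, X_pool, y_pool, rounds=10, k=10):
    for _ in range(rounds):
        model = LogisticRegression(max_iter=1000).fit(X_lab, y_lab)
        idx = least_confident(model, X_pool, k)
        # "Label" the queried points by revealing their held-out labels.
        X_lab = np.vstack([X_lab, X_pool[idx]])
        y_lab = np.concatenate([y_lab, y_pool[idx]])
        X_pool = np.delete(X_pool, idx, axis=0)
        y_pool = np.delete(y_pool, idx)
    return LogisticRegression(max_iter=1000).fit(X_lab, y_lab)
```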

A Bayesian Approach toward Active Learning for Collaborative Filtering [PDF]

open access: yes. arXiv, 2012
Collaborative filtering is a useful technique for exploiting the preference patterns of a group of users to predict the utility of items for the active user. In general, the performance of collaborative filtering depends on the number of rated examples given by the active user.
Rong Jin, Luo Si
arxiv   +2 more sources
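
To make the abstract's point concrete (prediction quality hinges on how many ratings the active user has provided), here is a toy user-based collaborative filtering predictor; it is purely illustrative and is not the paper's Bayesian active learning method:

```python
# Toy user-based collaborative filtering, for illustration only; the paper's
# contribution is a Bayesian active-learning strategy built on top of such models.
import numpy as np

def predict_rating(ratings, active_user, item):
    """Predict the active user's rating of `item` as a similarity-weighted
    average of other users' ratings. `ratings` is (users x items), NaN = unrated."""
    filled = np.nan_to_num(ratings, nan=0.0)
    # Cosine similarity between the active user's row and every other user's row.
    norms = np.linalg.norm(filled, axis=1) * np.linalg.norm(filled[active_user]) + 1e-12
    sims = filled @ filled[active_user] / norms
    neighbors = np.flatnonzero(~np.isnan(ratings[:, item]))
    neighbors = neighbors[neighbors != active_user]
    if neighbors.size == 0:
        return float(np.nanmean(ratings))    # nobody rated this item; use global mean
    w = np.clip(sims[neighbors], 0.0, None) + 1e-12
    return float(w @ ratings[neighbors, item] / w.sum())
```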

Active learning for data streams: a survey [PDF]

open access: yes. Machine-mediated learning, 2023
Online active learning is a paradigm in machine learning that aims to select the most informative data points to label from a data stream. The problem of minimizing the cost associated with collecting labeled observations has gained a lot of attention in ...
Davide Cacciarelli, M. Kulahci
semanticscholar   +1 more source
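
As a simple illustration of the stream setting described above, the following sketch queries a label only when the current model is uncertain; the classifier, threshold, and labeling budget are assumed for the example and are not taken from the survey:

```python
# Sketch under assumptions: fixed uncertainty-threshold querying on a data stream.
# The classifier, the stream iterable, and the threshold are illustrative choices.
import numpy as np
from sklearn.linear_model import SGDClassifier

def stream_active_learning(stream, classes, threshold=0.6, budget=100):
    """Query a label only when the model's confidence falls below `threshold`."""
    model = SGDClassifier(loss="log_loss")
    seeded, queried = False, 0
    for x, y in stream:                      # y is used only if we pay to query it
        x = np.asarray(x).reshape(1, -1)
        if not seeded:
            model.partial_fit(x, [y], classes=classes)
            seeded, queried = True, queried + 1
            continue
        confidence = model.predict_proba(x).max()
        if confidence < threshold and queried < budget:
            model.partial_fit(x, [y])        # spend one label from the budget
            queried += 1
    return model, queried
```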

Active Learning by Acquiring Contrastive Examples [PDF]

open access: yes. Conference on Empirical Methods in Natural Language Processing, 2021
Common acquisition functions for active learning use either uncertainty or diversity sampling, aiming to select difficult and diverse data points from the pool of unlabeled data, respectively.
Katerina Margatina   +3 more
semanticscholar   +1 more source
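
The abstract above names the two standard acquisition families; as a rough illustration, uncertainty and diversity scoring can be sketched as follows (the `probs` and `embeddings` arrays are assumed model outputs, not from the paper's code):

```python
# Illustrative sketch of the two acquisition families named in the abstract:
# entropy-based uncertainty sampling and greedy k-center diversity sampling.
import numpy as np

def uncertainty_sample(probs, k):
    """Pick the k pool points with the highest predictive entropy."""
    entropy = -(probs * np.log(probs + 1e-12)).sum(axis=1)
    return np.argsort(entropy)[-k:]

def diversity_sample(embeddings, k):
    """Greedy k-center: repeatedly pick the point farthest from the chosen set."""
    chosen = [0]
    dists = np.linalg.norm(embeddings - embeddings[0], axis=1)
    for _ in range(k - 1):
        nxt = int(np.argmax(dists))
        chosen.append(nxt)
        dists = np.minimum(dists, np.linalg.norm(embeddings - embeddings[nxt], axis=1))
    return np.array(chosen)
```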

A Comparative Survey of Deep Active Learning [PDF]

open access: yes. arXiv.org, 2022
While deep learning (DL) is data-hungry and usually relies on extensive labeled data to deliver good performance, Active Learning (AL) reduces labeling costs by selecting a small proportion of samples from unlabeled data for labeling and training ...
Xueying Zhan   +5 more
semanticscholar   +1 more source

Active Learning by Feature Mixing [PDF]

open access: yes. Computer Vision and Pattern Recognition, 2022
The promise of active learning (AL) is to reduce labelling costs by selecting the most valuable examples to annotate from a pool of unlabelled data. Identifying these examples is especially challenging with high-dimensional data (e.g. images, videos) and ...
Amin Parvaneh   +5 more
semanticscholar   +1 more source

A Survey of Deep Active Learning [PDF]

open access: yes. ACM Computing Surveys, 2020
Active learning (AL) attempts to maximize a model’s performance gain while annotating the fewest samples possible. Deep learning (DL) is greedy for data and requires a large amount of data supply to optimize a massive number of parameters if the model is ...
Pengzhen Ren   +6 more
semanticscholar   +1 more source
