Results 221 to 230 of about 41,355 (238)
2015
In this chapter, we give an overview of different approaches developed in semi-supervised learning, as well as the different assumptions leading to these methods. We particularly consider the margin as an indicator of confidence, which constitutes the working hypothesis of algorithms that search for the decision boundary in low-density regions.
Massih-Reza Amini, Nicolas Usunier
openaire +1 more source
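The margin-as-confidence idea from the abstract above can be illustrated with a toy self-training loop: examples far from the current decision boundary (large margin) are treated as confident, pseudo-labeled, and folded back into training, nudging the boundary toward a low-density region. This is a minimal sketch under assumed simplifications (a 1-D threshold classifier, a fixed `margin` cutoff), not the chapter's actual algorithm.

```python
# Illustrative sketch of margin-based self-training (hypothetical, not the
# authors' method). Points whose distance to the current decision threshold
# exceeds `margin` are pseudo-labeled and added to the training set.

def fit_threshold(xs, ys):
    """Fit a 1-D threshold classifier: midpoint between the two class means."""
    mean0 = sum(x for x, y in zip(xs, ys) if y == 0) / ys.count(0)
    mean1 = sum(x for x, y in zip(xs, ys) if y == 1) / ys.count(1)
    return (mean0 + mean1) / 2.0

def self_train(labeled_x, labeled_y, unlabeled_x, margin=1.0, rounds=5):
    """Iteratively pseudo-label high-margin unlabeled points, then refit."""
    xs, ys = list(labeled_x), list(labeled_y)
    pool = list(unlabeled_x)
    for _ in range(rounds):
        t = fit_threshold(xs, ys)
        # High-margin points are treated as confidently classified.
        confident = [x for x in pool if abs(x - t) > margin]
        if not confident:
            break
        for x in confident:
            xs.append(x)
            ys.append(1 if x > t else 0)
            pool.remove(x)
    return fit_threshold(xs, ys)

# Two labeled points, plus unlabeled points forming two clusters with a
# low-density gap between them; the final threshold lands in that gap.
t = self_train([0.0, 5.0], [0, 1], [0.5, 1.0, 4.0, 4.5, 6.0])
```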
2013
In traditional supervised learning, one uses "labeled" data to build a model. However, labeling the training data for real-world applications is difficult, expensive, or time-consuming, as it requires the effort of human annotators, sometimes with specific domain experience and training.
Mohamed Farouk Abdel Hady +1 more
openaire +2 more sources
Semi-Supervised Bilinear Subspace Learning
IEEE Transactions on Image Processing, 2009
Recent research has demonstrated the success of tensor-based subspace learning in both unsupervised and supervised configurations (e.g., 2-D PCA, 2-D LDA, and DATER). In this correspondence, we present a new semi-supervised subspace learning algorithm by integrating the tensor representation and the complementary information conveyed by unlabeled data.
Dong Xu, Shuicheng Yan
openaire +2 more sources
2018 24th International Conference on Pattern Recognition (ICPR), 2018
Convolutional neural networks (CNNs) attain state-of-the-art performance on various classification tasks, assuming a sufficiently large number of labeled training examples. Unfortunately, curating a sufficiently large labeled training dataset requires human involvement, which is expensive and time-consuming.
Zeyad Hailat +2 more
openaire +1 more source
2020
This chapter is concerned with advanced supervised learning, including semi-supervised learning. We start with the three versions of the Kohonen Networks: the unsupervised version, the supervised version, and the semi-supervised version.
openaire +1 more source
Privileged Semi-Supervised Learning
2018 25th IEEE International Conference on Image Processing (ICIP), 2018
Semi-Supervised Learning (SSL) aims to leverage unlabeled data to improve performance. Due to the lack of supervised information, previous works mainly focus on how to utilize the available unlabeled data to improve the training quality. However, the estimation of the data distribution revealed by the unlabeled examples might not be accurate as their ...
Xingyu Chen +4 more
openaire +1 more source
When Semi-supervised Learning Meets Ensemble Learning
Frontiers of Electrical and Electronic Engineering in China, 2009
Semi-supervised learning and ensemble learning are two important machine learning paradigms. The former attempts to achieve strong generalization by exploiting unlabeled data; the latter attempts to achieve strong generalization by using multiple learners.
openaire +1 more source
Budget Semi-supervised Learning
2009
In this paper we propose to study budget semi-supervised learning, i.e., semi-supervised learning with a resource budget, such as a limited memory insufficient to accommodate and/or process all available unlabeled data. This setting is of practical importance because in most real scenarios, although there may exist abundant unlabeled data, the ...
Zhi-Hua Zhou +3 more
openaire +1 more source
2005
For many classification problems, unlabeled training data are inexpensive and readily available, whereas labeling training data imposes costs. Semi-supervised classification algorithms aim at utilizing information contained in unlabeled data in addition to the (few) labeled data.
openaire +1 more source

