2018
Traditional supervised machine learning methods involve learning a mapping function that accurately maps input data to output labels. However, real-world datasets are complex, and we often encounter situations where multiple tasks (classes) are related to each other.
Azad Naik, Huzefa Rangwala
Calibrated Multi-Task Learning
Proceedings of the 24th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, 2018
This paper proposes a novel algorithm, named Non-Convex Calibrated Multi-Task Learning (NC-CMTL), for learning multiple related regression tasks jointly. Instead of utilizing the nuclear norm, NC-CMTL adopts a non-convex low rank regularizer to explore the shared information among different tasks.
Feiping Nie, Zhanxuan Hu, Xuelong Li
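The NC-CMTL abstract contrasts its non-convex regularizer with the nuclear norm. For context, here is a minimal sketch of the standard nuclear-norm proximal step (singular-value soft-thresholding) that convex low-rank multi-task solvers rely on; the weight matrix and threshold below are made-up illustrations, not values from the paper.

```python
import numpy as np

def prox_nuclear(W, t):
    # Proximal operator of t * ||W||_*: soft-threshold the singular values.
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    return U @ np.diag(np.maximum(s - t, 0.0)) @ Vt

# Hypothetical tasks-by-features weight matrix: nearly rank-1 plus small noise.
rng = np.random.default_rng(2)
W = np.outer([1.0, 2.0], [1.0, 0.5, 0.0]) + 0.01 * rng.normal(size=(2, 3))
W_low = prox_nuclear(W, 0.5)  # the small (noise) singular value is zeroed out
```

Soft-thresholding shrinks every singular value uniformly, which over-penalizes the large, informative ones; that uniform shrinkage is precisely what non-convex low-rank regularizers such as NC-CMTL's aim to avoid.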
Regularized multi-task learning
Proceedings of the tenth ACM SIGKDD international conference on Knowledge discovery and data mining, 2004
Past empirical work has shown that learning multiple related tasks from data simultaneously can be advantageous in terms of predictive performance relative to learning these tasks independently. In this paper we present an approach to multi-task learning based on the minimization of regularization functionals similar to existing ones, such as the one ...
Theodoros Evgeniou, Massimiliano Pontil
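A common formulation in this regularized multi-task line of work models each task's weights as a shared part plus a small per-task offset, w_t = w0 + v_t, penalizing both. The sketch below fits that model on synthetic data with plain gradient descent; the data, penalties, and learning rate are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n, T = 5, 40, 3
w_shared = rng.normal(size=d)                  # common component across tasks
tasks = []
for _ in range(T):
    w_t = w_shared + 0.1 * rng.normal(size=d)  # each task deviates slightly
    X = rng.normal(size=(n, d))
    tasks.append((X, X @ w_t + 0.01 * rng.normal(size=n)))

# Penalize the offsets v_t strongly (tasks stay close to each other) and the
# shared part w0 lightly; optimize everything jointly by gradient descent.
lam_v, lam_0, lr = 1.0, 0.1, 0.01
w0, V = np.zeros(d), np.zeros((T, d))
for _ in range(2000):
    g0 = lam_0 * w0
    for t, (X, y) in enumerate(tasks):
        g = X.T @ (X @ (w0 + V[t]) - y) / n    # gradient of task t's squared loss
        V[t] -= lr * (g + lam_v * V[t])
        g0 += g
    w0 -= lr * g0
```

With few samples per task, the shared w0 pools evidence across all T tasks, which is the advantage over fitting each task independently.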
Federated Multi-task Graph Learning
ACM Transactions on Intelligent Systems and Technology, 2022
Distributed processing and analysis of large-scale graph data remain challenging because of the high-level discrepancy among graphs. This study investigates a novel subproblem: the distributed multi-task learning on the graph, which jointly learns multiple analysis tasks from decentralized graphs.
Yijing Liu +5 more
2015 IEEE International Conference on Data Mining, 2015
In this paper, we develop parallel algorithms for a family of regularized multi-task methods which can model task relations under the regularization framework. Since those multi-task methods cannot be parallelized directly, we use the FISTA algorithm, which in each iteration constructs a surrogate function of the original problem by utilizing the ...
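The FISTA step the abstract refers to alternates a gradient step on a smooth surrogate with a proximal step, plus a momentum extrapolation. A minimal single-task sketch on a lasso problem (data, regularization weight, and iteration count are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
n, d = 50, 20
X = rng.normal(size=(n, d))
w_true = np.zeros(d)
w_true[:3] = [2.0, -1.0, 0.5]                 # sparse ground truth
y = X @ w_true + 0.01 * rng.normal(size=n)

lam = 0.1
L = np.linalg.norm(X, 2) ** 2 / n             # Lipschitz constant of the gradient

def prox_l1(v, t):
    # Soft-thresholding: proximal operator of t * ||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

w = z = np.zeros(d)
s = 1.0
for _ in range(300):
    grad = X.T @ (X @ z - y) / n              # gradient of the smooth part at z
    w_next = prox_l1(z - grad / L, lam / L)   # minimize the surrogate
    s_next = (1 + np.sqrt(1 + 4 * s * s)) / 2
    z = w_next + (s - 1) / s_next * (w_next - w)  # momentum extrapolation
    w, s = w_next, s_next
```

In the multi-task setting described above, the same pattern applies with the proximal operator taken with respect to the task-coupling regularizer instead of the L1 norm, and the per-task gradients are what the paper computes in parallel.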
Manifold Regularized Multi-Task Learning
2012
Multi-task learning (MTL) has drawn a lot of attention in machine learning. By training multiple tasks simultaneously, information can be better shared across tasks. This leads to significant performance improvements in many problems. However, most existing methods assume that all tasks are related or that their relationship follows a simple and specified ...
Peipei Yang +3 more
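Task relationships of the kind this abstract discusses are often encoded as a graph penalty tr(W L Wᵀ), where L is the Laplacian of a task-similarity graph: related tasks are pushed toward similar weight vectors, unrelated ones are left free. A tiny sketch with a hypothetical similarity matrix (all numbers invented for illustration):

```python
import numpy as np

# Hypothetical similarity graph over 3 tasks: tasks 0 and 1 are related.
A = np.array([[0.0, 1.0, 0.0],
              [1.0, 0.0, 0.0],
              [0.0, 0.0, 0.0]])
L = np.diag(A.sum(axis=1)) - A          # graph Laplacian

# Columns of W are task weight vectors; tasks 0 and 1 are nearly identical.
W = np.array([[1.0, 1.1, 5.0],
              [2.0, 2.1, -3.0]])

# tr(W L W^T) = (1/2) * sum_{s,t} A[s,t] * ||w_s - w_t||^2
penalty = np.trace(W @ L @ W.T)
```

Because A links only tasks 0 and 1, the penalty reduces to their squared weight distance, and task 2's very different weights cost nothing; assuming a fixed, fully related graph (as many earlier methods do) would instead penalize task 2 heavily.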
A Survey on Multi-Task Learning
IEEE Transactions on Knowledge and Data Engineering, 2022
Yu Zhang, Qiang Yang
Multi-Task Learning for Dense Prediction Tasks: A Survey
IEEE Transactions on Pattern Analysis and Machine Intelligence, 2021
Simon Vandenhende +2 more

