Results 31 to 40 of about 1,489,144 (280)
Boosted multi-task learning [PDF]
In this paper we propose a novel algorithm for multi-task learning with boosted decision trees. We learn several different learning tasks with a joint model, explicitly addressing their commonalities through shared parameters and their differences with task-specific ones. This enables implicit data sharing and regularization.
Olivier Chapelle +5 more
openaire +1 more source
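The shared-plus-task-specific parameterization described in this abstract can be illustrated with a minimal sketch. This is not Chapelle et al.'s boosted-tree algorithm; it is a toy linear version under assumed names (`w_shared`, `w_task`), showing how a shared weight vector pools data across tasks while task-specific weights, penalized toward zero, capture their differences:

```python
import numpy as np

# Toy illustration (not the paper's boosted-tree method): each task t predicts
# with w_shared + w_task[t]. Updating w_shared from every task's gradient gives
# implicit data sharing; penalizing w_task regularizes tasks toward each other.
rng = np.random.default_rng(0)
n_tasks, n_features, n_samples = 3, 4, 50

w_shared = np.zeros(n_features)
w_task = np.zeros((n_tasks, n_features))

X = rng.normal(size=(n_tasks, n_samples, n_features))
true_w = rng.normal(size=n_features)  # all tasks share one underlying model here
y = np.einsum("tnf,f->tn", X, true_w) + 0.1 * rng.normal(size=(n_tasks, n_samples))

lr, lam = 0.05, 0.1  # learning rate; penalty on task-specific deviation
for _ in range(200):
    for t in range(n_tasks):
        pred = X[t] @ (w_shared + w_task[t])
        grad = X[t].T @ (pred - y[t]) / n_samples
        w_shared -= lr * grad                        # shared update: data sharing
        w_task[t] -= lr * (grad + lam * w_task[t])   # shrunk toward shared model

mse = np.mean([(X[t] @ (w_shared + w_task[t]) - y[t]) ** 2 for t in range(n_tasks)])
```

Because the toy tasks share one true model, the shared weights absorb most of the signal and the per-task deviations stay small.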
An efficient deep multi‐task learning structure for covid‐19 disease
COVID‐19 has had a profound global impact, necessitating the development of infection detection systems based on machine learning. This paper presents a Multi‐task architecture that addresses the classification and segmentation tasks for COVID‐19 ...
Shirin Kordnoori +3 more
doaj +1 more source
Understanding and Application of Multi-Task Learning in Medical Artificial Intelligence
In the medical field, artificial intelligence has been used in various ways with many developments. However, most artificial intelligence technologies are developed so that one model can perform only one task, which is a limitation in designing the ...
Young Jae Kim, Kwang Gi Kim
doaj +1 more source
Multiple object tracking based on multi‐task learning with strip attention
The multiple object tracking (MOT) framework based on a bifurcated strategy is usually challenged by data association across different model paths, which handle object localisation and appearance embedding independently. By incorporating the re‐identification (re‐

Yaoye Song +5 more
doaj +1 more source
Resource-Efficient Multi-Task Deep Learning Using a Multi-Path Network
Multi-task learning (MTL) improves learning efficiency over its single-task counterpart by performing multiple tasks at the same time. By its nature, it can achieve generalized performance as well as alleviate overfitting.
Soyeon Park, Jiho Lee, Eunwoo Kim
doaj +1 more source
Learning Output Kernels for Multi-Task Problems [PDF]
Simultaneously solving multiple related learning tasks is beneficial under a variety of circumstances, but the prior knowledge necessary to correctly model task relationships is rarely available in practice. In this paper, we develop a novel kernel-based
Dinuzzo, Francesco
core +1 more source
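The output-kernel idea in this abstract can be made concrete with a small sketch. Dinuzzo's paper learns the task-relationship (output) kernel from data; here, purely for illustration, the output kernel `L` is fixed by hand and combined with an RBF input kernel in kernel ridge regression. All names (`rbf`, `L`, `predict`) are illustrative assumptions, not the paper's API:

```python
import numpy as np

def rbf(A, B, gamma=0.5):
    # Gaussian (RBF) kernel between row vectors of A and B
    d = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d)

rng = np.random.default_rng(1)
X = rng.normal(size=(20, 2))            # training inputs
tasks = rng.integers(0, 2, size=20)     # task index for each sample
y = np.sin(X[:, 0]) + 0.1 * tasks       # two closely related tasks

# Hand-set output kernel: off-diagonal 0.8 encodes strong task similarity.
# (The paper's contribution is learning this matrix rather than fixing it.)
L = np.array([[1.0, 0.8], [0.8, 1.0]])

Kx = rbf(X, X)
K = Kx * L[np.ix_(tasks, tasks)]        # joint kernel = input kernel x output kernel
alpha = np.linalg.solve(K + 0.1 * np.eye(20), y)  # kernel ridge regression

def predict(x_new, t_new):
    k = rbf(x_new[None, :], X)[0] * L[t_new, tasks]
    return k @ alpha
```

Coupling tasks through `L` lets predictions for one task borrow strength from the other's samples; with `L` set to the identity the model decouples into independent single-task regressors.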
Multi-task learning (MTL) has achieved great success in various research domains, such as CV, NLP, and IR. Due to complex and competing task correlations, naively training all tasks together may lead to inequitable learning, i.e. some tasks are learned well while others are overlooked.
Yuan, Jun, Zhang, Rui
openaire +2 more sources
Adaptive and robust multi-task learning
We study the multi-task learning problem that aims to simultaneously analyze multiple datasets collected from different sources and learn one model for each of them. We propose a family of adaptive methods that automatically utilize possible similarities among those tasks while carefully handling their differences.
Duan, Yaqi, Wang, Kaizheng
openaire +3 more sources
Multi-Task Multi-Sample Learning [PDF]
In the exemplar SVM (E-SVM) approach of Malisiewicz et al., ICCV 2011, an ensemble of SVMs is learnt, with each SVM trained independently using only a single positive sample and all negative samples for the class. In this paper we develop a multi-sample learning (MSL) model which enables joint regularization of the E-SVMs without any additional cost ...
Aytar, Y, Zisserman, A
openaire +1 more source
Geometry preserving multi-task metric learning [PDF]
zbMATH Open Web Interface contents unavailable due to conflicting licenses.
Yang, Peipei +2 more
openaire +1 more source