Results 21 to 30 of about 5,803,394

Media multitasking and implicit learning [PDF]

open access: yesAttention, Perception, & Psychophysics, 2017
Media multitasking refers to the simultaneous use of different forms of media. Previous research comparing heavy media multitaskers and light media multitaskers suggests that heavy media multitaskers have a broader scope of attention. The present study explored whether these differences in attentional scope would lead to a greater degree of implicit ...
Kathleen S. Edwards, Myoungju Shin
openaire   +2 more sources

Identification of Negative Transfers in Multitask Learning Using Surrogate Models [PDF]

open access: yesTrans. Mach. Learn. Res., 2023
Multitask learning is widely used in practice to train a low-resource target task by augmenting it with multiple related source tasks. Yet, naively combining all the source tasks with a target task does not always improve the prediction performance for ...
Dongyue Li   +2 more
semanticscholar   +1 more source

Embodied Multimodal Multitask Learning [PDF]

open access: yesProceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, 2020
Visually-grounded embodied language learning models have recently shown to be effective at learning multiple multimodal tasks such as following navigational instructions and answering questions. In this paper, we address two key limitations of these models, (a) the inability to transfer the grounded knowledge across different tasks and (b) the ...
Chaplot, Devendra Singh   +4 more
openaire   +2 more sources

Outcome and biomarker supervised deep learning for survival prediction in two multicenter breast cancer series

open access: yesJournal of Pathology Informatics, 2022
Background: Prediction of clinical outcomes for individual cancer patients is an important step in disease diagnosis and subsequently guides treatment and patient counseling.
Dmitrii Bychkov   +12 more
doaj   +3 more sources

Bioinspired Architecture Selection for Multitask Learning [PDF]

open access: yesFrontiers in Neuroinformatics, 2017
Faced with a new concept to learn, our brain does not work in isolation. It uses all previously learned knowledge. In addition, the brain is able to isolate the knowledge that does not benefit us, and to use what is actually useful. In machine learning, we do not usually benefit from the knowledge of other learned tasks. However, there is a methodology ...
Bueno Crespo, Andrés   +3 more
openaire   +4 more sources

Electricity, Heat, and Gas Load Forecasting Based on Deep Multitask Learning in Industrial-Park Integrated Energy System

open access: yesEntropy, 2020
Different energy systems are closely connected with each other in an industrial-park integrated energy system (IES). Energy demand forecasting has an important impact on IES dispatching and planning.
Linjuan Zhang   +3 more
doaj   +1 more source

Inferring latent task structure for Multitask Learning by Multiple Kernel Learning

open access: yesBMC Bioinformatics, 2010
Background: The lack of sufficient training data is the limiting factor for many Machine Learning applications in Computational Biology. If data is available for several different but related problem domains, Multitask Learning algorithms can be used to ...
Altun, Yasemin   +3 more
doaj   +1 more source

Hyperspectral Image Target Detection via Weighted Joint K-Nearest Neighbor and Multitask Learning Sparse Representation

open access: yesIEEE Access, 2020
The multitask sparse representation method improves detection performance by constructing multiple associated sub-sparse representation tasks and learning them jointly; this method can make use of the spectral ...
Xianfeng Ou   +6 more
doaj   +1 more source

Hierarchical Classification of Insects with Multitask Learning and Anomaly Detection

open access: yesbioRxiv, 2023
Cameras and computer vision are revolutionising the study of insects, creating new research opportunities within agriculture, epidemiology, evolution, ecology and monitoring of biodiversity.
K. Bjerge   +6 more
semanticscholar   +1 more source

UniT: Multimodal Multitask Learning with a Unified Transformer [PDF]

open access: yesIEEE International Conference on Computer Vision, 2021
We propose UniT, a Unified Transformer model to simultaneously learn the most prominent tasks across different domains, ranging from object detection to natural language understanding and multimodal reasoning.
Ronghang Hu, Amanpreet Singh
semanticscholar   +1 more source