SPARK decodes structure‐property relationships in anion exchange membranes (AEMs) via a chemically informed dual‐channel graph attention network (DEGAT) that explicitly captures microphase separation. It outputs five‐level grades for hydroxide conductivity and alkaline stability and highlights relevant key structural units, enabling robust pre ...
Wanting Chen +6 more
wiley +1 more source
Cost-Effective Multitask Active Learning in Wearable Sensor Systems
Multitask learning models reduce model complexity and improve accuracy by learning multiple related tasks concurrently with shared representations.
Asiful Arefeen, Hassan Ghasemzadeh
doaj +1 more source
Lifelong Personalization via Gaussian Process Modeling for Long-Term HRI
Across a wide variety of domains, artificial agents that can adapt and personalize to users have potential to improve and transform how social services are provided.
Samuel Spaulding +3 more
doaj +1 more source
GPTune: multitask learning for autotuning exascale applications
Multitask learning has proven to be useful in the field of machine learning when additional knowledge is available to help a prediction task. We adapt this paradigm to develop autotuning frameworks, where the objective is to find the optimal performance ...
Yang Liu +6 more
semanticscholar +1 more source
WS2‐based in‐memory sensing reservoir computing integrates sensing, memory, and computation in one compact device. It achieves ∼94% N‐MNIST, ∼93% eye motion perception, and ∼89% speech recognition with ultra‐low energy (∼25.5 fJ/spike). The system shows stability at 95% humidity, endurance over 1.5M cycles, and supports synaptic plasticity, enabling ...
Dayanand Kumar +9 more
wiley +1 more source
Multitask Learning Based on Least Squares Support Vector Regression for Stock Forecast
Various factors make stock market forecasting difficult. Single-task learning models fail to achieve good results because they ignore the correlation between multiple related tasks. Multitask learning methods can capture the cross-correlation ...
Heng-Chang Zhang +3 more
doaj +1 more source
Consistent Multitask Learning with Nonlinear Output Relations [PDF]
Key to multitask learning is exploiting relationships between different tasks to improve prediction performance. If the relations are linear, regularization approaches can be used successfully.
Ciliberto, Carlo +3 more
core +2 more sources
Adaptation and learning over networks for nonlinear system modeling
In this chapter, we analyze nonlinear filtering problems in distributed environments, e.g., sensor networks or peer-to-peer protocols. In these scenarios, the agents in the environment receive measurements in a streaming fashion, and they are required to ...
Argyriou +52 more
core +2 more sources
Physical reservoir computing (PRC) based on spin wave interference has demonstrated high computational performance, yet room for improvement remains. In this study, we fabricated such a PRC with eight detectors and evaluated the impact of the number of detectors using a chaotic time-series prediction task.
Sota Hikasa +6 more
wiley +1 more source
Geolocation with Attention-Based Multitask Learning Models [PDF]
Geolocation, predicting the location of a post based on text and other information, has huge potential for several social media applications. Typically, the problem is modeled as either multi-class classification or regression. In the first case, the classes are geographic areas previously identified; in the second, the models directly predict ...
Fornaciari, Tommaso, Hovy, Dirk
openaire +2 more sources

