Results 311 to 320 of about 14,062,463 (376)
An improved lightweight method based on EfficientNet for birdsong recognition. [PDF]
He H, Luo H.
europepmc +1 more source
Multimodal Deep Learning for Stage Classification of Head and Neck Cancer Using Masked Autoencoders and Vision Transformers with Attention-Based Fusion. [PDF]
Turki A, Alshabrawy O, Woo WL.
europepmc +1 more source
Attention residual network for medical ultrasound image segmentation. [PDF]
Liu H+7 more
europepmc +1 more source
Some of the following articles may not be open access.
Related searches:
SSRN Electronic Journal, 2023
Attention to the economy plays a key role in canonical macro models, yet its empirical properties are not well understood. We collect novel measures of attention to the economy based on open-ended survey questions. Our measures are included in tailored panel surveys of German firms and households, conducted before and during a large shock to inflation.
Link, Sebastian+3 more
openaire +5 more sources
CBAM: Convolutional Block Attention Module
European Conference on Computer Vision, 2018
We propose Convolutional Block Attention Module (CBAM), a simple yet effective attention module for feed-forward convolutional neural networks. Given an intermediate feature map, our module sequentially infers attention maps along two separate dimensions ...
Sanghyun Woo+3 more
semanticscholar +1 more source
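The snippet above describes CBAM's sequential two-stage attention: a channel attention map is inferred first, then a spatial one, and each is multiplied into the feature map. A minimal NumPy sketch of that flow, under stated assumptions: random weights stand in for learned parameters, and the paper's 7x7 convolution in the spatial branch is replaced here by a simple sum of the pooled maps for brevity.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cbam(feature, w1, w2):
    """Sketch of CBAM-style attention.

    feature: (C, H, W) intermediate feature map
    w1: (C, C // r), w2: (C // r, C) -- shared bottleneck MLP weights
    """
    # --- channel attention: pool over the spatial dims, run both pooled
    # vectors through a shared MLP, and combine with a sigmoid ---
    avg = feature.mean(axis=(1, 2))              # (C,)
    mx = feature.max(axis=(1, 2))                # (C,)
    mlp = lambda v: np.maximum(v @ w1, 0.0) @ w2  # ReLU bottleneck
    ch_att = sigmoid(mlp(avg) + mlp(mx))         # (C,)
    x = feature * ch_att[:, None, None]

    # --- spatial attention: pool over channels; the paper applies a 7x7
    # conv to the concatenated pools, simplified here to a sum (assumption) ---
    sp_att = sigmoid(x.mean(axis=0) + x.max(axis=0))  # (H, W)
    return x * sp_att[None, :, :]
```

The two stages are applied in sequence rather than in parallel, matching the "sequentially infers attention maps along two separate dimensions" phrasing in the abstract.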
Image Super-Resolution Using Very Deep Residual Channel Attention Networks
European Conference on Computer Vision, 2018
Convolutional neural network (CNN) depth is of crucial importance for image super-resolution (SR). However, we observe that deeper networks for image SR are more difficult to train.
Yulun Zhang+5 more
semanticscholar +1 more source
Graph Attention Networks
International Conference on Learning Representations, 2017
We present graph attention networks (GATs), novel neural network architectures that operate on graph-structured data, leveraging masked self-attentional layers to address the shortcomings of prior methods based on graph convolutions or their ...
Petar Velickovic+5 more
semanticscholar +1 more source
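The abstract describes masked self-attention over graph neighborhoods: each node attends only to its neighbors when aggregating features. A minimal single-head NumPy sketch of that mechanism (the function name `gat_layer`, the weight shapes, and the additive logit decomposition are illustrative assumptions, not the authors' code):

```python
import numpy as np

def gat_layer(H, A, W, a, slope=0.2):
    """One single-head GAT-style layer.

    H: (N, F) node features; A: (N, N) adjacency (1 = edge)
    W: (F, Fp) linear projection; a: (2 * Fp,) attention vector
    """
    Z = H @ W                              # project features: (N, Fp)
    N, Fp = Z.shape
    # e_ij = LeakyReLU(a^T [z_i || z_j]) decomposes into a source term
    # and a destination term, combined by broadcasting into (N, N)
    src = Z @ a[:Fp]                       # (N,)
    dst = Z @ a[Fp:]                       # (N,)
    e = src[:, None] + dst[None, :]
    e = np.where(e > 0, e, slope * e)      # LeakyReLU
    # mask: each node attends only over its neighbors plus itself
    mask = (A + np.eye(N)) > 0
    e = np.where(mask, e, -1e9)
    # row-wise softmax over the masked logits
    att = np.exp(e - e.max(axis=1, keepdims=True))
    att = att / att.sum(axis=1, keepdims=True)
    return att @ Z                         # weighted neighbor aggregation
```

The masking step is what distinguishes this from dense self-attention: non-edges receive a large negative logit, so the softmax assigns them (near-)zero weight.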
Attention deficit hyperactivity disorder.
British Journal of Hospital Medicine, 2018
Attention deficit hyperactivity disorder is a highly heritable medical condition which mainly affects school-aged children, although it is increasingly being recognized in adults.
Muhammed Ather, G. Salmon
semanticscholar +1 more source