Results 1 to 10 of about 3,042,107

A General Survey on Attention Mechanisms in Deep Learning [PDF]

open access: yes; IEEE Transactions on Knowledge and Data Engineering (TKDE), 2021, 2022
Attention is an important mechanism that can be employed for a variety of deep learning models across many different domains and tasks. This survey provides an overview of the most important attention mechanisms proposed in the literature. The various attention mechanisms are explained by means of a framework consisting of a general attention model ...
arxiv   +1 more source
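A minimal sketch of the kind of general attention model such surveys describe: a query is scored against keys, the scores are normalised with a softmax, and the values are combined using those weights. The scaled dot-product scoring function and array shapes are illustrative assumptions, not the survey's specific framework.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)  # subtract max for numerical stability
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def general_attention(query, keys, values):
    """query: (d,), keys: (n, d), values: (n, d_v) -> context vector (d_v,)."""
    scores = keys @ query / np.sqrt(query.shape[-1])  # alignment scores
    weights = softmax(scores)                         # attention distribution over the n items
    return weights @ values                           # weighted sum of values

rng = np.random.default_rng(0)
q, K, V = rng.normal(size=3), rng.normal(size=(5, 3)), rng.normal(size=(5, 4))
print(general_attention(q, K, V).shape)  # (4,)
```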

Attention Mechanisms in Computer Vision: A Survey [PDF]

open access: yes; Computational Visual Media, 2022, Vol. 8, No. 3, 331-368
Humans can naturally and effectively find salient regions in complex scenes. Motivated by this observation, attention mechanisms were introduced into computer vision with the aim of imitating this aspect of the human visual system. Such an attention mechanism can be regarded as a dynamic weight adjustment process based on features of the input image ...
arxiv   +1 more source
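The survey above characterises visual attention as a dynamic weight adjustment over input features. A minimal sketch of one such scheme, a spatial attention map that reweights locations in a feature map; the single learned projection vector is a hypothetical stand-in, not a specific model from the survey.

```python
import numpy as np

def softmax(x):
    x = x - x.max()
    e = np.exp(x)
    return e / e.sum()

def spatial_attention(feature_map, w):
    """feature_map: (C, H, W); w: (C,) projection giving one score per spatial location."""
    C, H, W = feature_map.shape
    scores = np.tensordot(w, feature_map, axes=1).reshape(-1)  # (H*W,) saliency scores
    weights = softmax(scores).reshape(H, W)                    # normalised spatial weights
    return feature_map * weights[None, :, :]                   # emphasise salient regions

rng = np.random.default_rng(0)
x = rng.normal(size=(8, 16, 16))
print(spatial_attention(x, rng.normal(size=8)).shape)  # (8, 16, 16)
```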

Tri-Attention: Explicit Context-Aware Attention Mechanism for Natural Language Processing [PDF]

open access: yes; arXiv, 2022
In natural language processing (NLP), the context of a word or sentence plays an essential role. Contextual information, such as the semantic representation of a passage or historical dialogue, forms an essential part of a conversation and of a precise understanding of the present phrase or sentence.
arxiv  

Attention mechanisms for physiological signal deep learning: which attention should we take? [PDF]

open access: yes; arXiv, 2022
Attention mechanisms are widely used to dramatically improve deep learning model performance in various fields. However, their general ability to improve the performance of deep learning models for physiological signals has not been established. In this study, we experimentally analyze four attention mechanisms (e.g., squeeze-and-excitation, non-local, convolutional ...
arxiv  
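The study above compares several attention modules, including squeeze-and-excitation (SE). A minimal sketch of an SE-style block for a 1-D physiological signal of shape (channels, time); the reduction ratio and random weights are illustrative assumptions only.

```python
import numpy as np

def se_block_1d(signal, w1, w2):
    """signal: (C, T); w1: (C, C//r); w2: (C//r, C) -> channel-recalibrated (C, T)."""
    squeezed = signal.mean(axis=1)                  # squeeze: global average pool over time
    excited = np.maximum(squeezed @ w1, 0.0)        # excitation bottleneck + ReLU
    gates = 1.0 / (1.0 + np.exp(-(excited @ w2)))   # per-channel sigmoid gates in (0, 1)
    return signal * gates[:, None]                  # rescale each channel

rng = np.random.default_rng(0)
x = rng.normal(size=(12, 500))                      # e.g. a 12-channel signal, 500 samples
out = se_block_1d(x, rng.normal(size=(12, 3)), rng.normal(size=(3, 12)))
print(out.shape)  # (12, 500)
```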

Attention in Attention Network for Image Super-Resolution [PDF]

open access: yes; arXiv, 2021
Convolutional neural networks have allowed remarkable advances in single image super-resolution (SISR) over the last decade. Among recent advances in SISR, attention mechanisms are crucial for high-performance SR models. However, it remains unclear why and how attention mechanisms work in SISR.
arxiv  

Adaptive Sparse and Monotonic Attention for Transformer-based Automatic Speech Recognition [PDF]

open access: yes; arXiv, 2022
The Transformer architecture, based on self-attention and multi-head attention, has achieved remarkable success in offline end-to-end Automatic Speech Recognition (ASR). However, self-attention and multi-head attention cannot be easily applied to streaming or online ASR.
arxiv  
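Standard self-attention attends over the whole utterance, which is why the entry above notes it does not transfer directly to streaming ASR. A minimal sketch of one common workaround, masking attention so each frame only sees past frames plus a small lookahead; this only illustrates the streaming constraint and is not the paper's adaptive sparse/monotonic method.

```python
import numpy as np

def masked_self_attention(x, lookahead=0):
    """x: (T, d) acoustic frames -> (T, d); frame t may attend to frames <= t + lookahead."""
    T, d = x.shape
    scores = x @ x.T / np.sqrt(d)
    future = np.triu(np.ones((T, T), dtype=bool), k=lookahead + 1)  # disallowed future frames
    scores[future] = -np.inf                                        # forbid attending ahead
    scores -= scores.max(axis=-1, keepdims=True)                    # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)                  # row-wise softmax
    return weights @ x

frames = np.random.default_rng(0).normal(size=(6, 4))
print(masked_self_attention(frames, lookahead=1).shape)  # (6, 4)
```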

Explaining the Attention Mechanism of End-to-End Speech Recognition Using Decision Trees [PDF]

open access: yes; arXiv, 2021
The attention mechanism has largely improved the performance of end-to-end speech recognition systems. However, the underlying behaviour of attention is not yet well understood. In this study, we use decision trees to explain how the attention mechanism behaves in speech recognition.
arxiv  

HAR-Net: Joint Learning of Hybrid Attention for Single-stage Object Detection [PDF]

open access: yes; 2019
Object detection has been a challenging task in computer vision. Although significant progress has been made in object detection with deep neural networks, the attention mechanism is far from fully developed. In this paper, we propose a hybrid attention mechanism for single-stage object detection.
arxiv   +1 more source

Linear Attention Mechanism: An Efficient Attention for Semantic Segmentation [PDF]

open access: yes; arXiv, 2020
In this paper, to remedy the heavy memory and computational cost of dot-product attention, we propose a Linear Attention Mechanism that approximates dot-product attention at much lower cost. The efficient design makes the incorporation of attention mechanisms into neural networks more flexible and versatile.
arxiv  
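The entry above replaces softmax dot-product attention with a linear approximation. A minimal sketch of the generic linear-attention trick: apply a non-negative feature map to queries and keys, then use associativity to compute K^T V once so the cost grows linearly rather than quadratically in sequence length. The elu(x)+1 feature map is a common choice assumed here, not necessarily the paper's exact formulation.

```python
import numpy as np

def feature_map(x):
    return np.where(x > 0, x + 1.0, np.exp(x))   # elu(x) + 1, keeps all values positive

def linear_attention(Q, K, V):
    """Q, K: (n, d); V: (n, d_v). Cost O(n * d * d_v) instead of O(n^2) in sequence length."""
    Qf, Kf = feature_map(Q), feature_map(K)
    kv = Kf.T @ V                                # (d, d_v), computed once for all queries
    z = Qf @ Kf.sum(axis=0)                      # per-query normaliser, (n,)
    return (Qf @ kv) / z[:, None]

rng = np.random.default_rng(0)
Q, K, V = rng.normal(size=(10, 4)), rng.normal(size=(10, 4)), rng.normal(size=(10, 6))
print(linear_attention(Q, K, V).shape)  # (10, 6)
```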

Improving Speech Emotion Recognition Through Focus and Calibration Attention Mechanisms [PDF]

open access: yes; arXiv, 2022
Attention has become one of the most commonly used mechanisms in deep learning approaches. The attention mechanism can help the system focus on critical regions of the feature space. For example, high-amplitude regions can play an important role in Speech Emotion Recognition (SER). In this paper, we identify misalignments between the attention and ...
arxiv  
