Results 11 to 20 of about 3,042,107
Fooling Vision and Language Models Despite Localization and Attention Mechanism [PDF]
Adversarial attacks are known to succeed on classifiers, but it has been an open question whether more complex vision systems are vulnerable. In this paper, we study adversarial examples for vision and language models, which incorporate natural language ...
Xu, Xiaojun+5 more
core +4 more sources
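To make "adversarial examples" in the entry above concrete, here is a minimal sketch of a standard untargeted FGSM perturbation against an image classifier. The attack choice, the toy model, and every parameter below are illustrative assumptions; the paper's own attack on vision-and-language systems is not reproduced here.

    import torch

    def fgsm_perturb(model, loss_fn, image, label, eps=0.03):
        # Take one signed-gradient step that increases the loss, then clip to [0, 1].
        image = image.clone().requires_grad_(True)
        loss_fn(model(image), label).backward()
        return (image + eps * image.grad.sign()).detach().clamp(0.0, 1.0)

    # Toy usage with a throwaway linear classifier on 32x32 RGB inputs.
    model = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(3 * 32 * 32, 10))
    adv = fgsm_perturb(model, torch.nn.CrossEntropyLoss(),
                       torch.rand(1, 3, 32, 32), torch.tensor([3]))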
Persistence pays off: Paying Attention to What the LSTM Gating Mechanism Persists [PDF]
Language Models (LMs) are important components in several Natural Language Processing systems. Recurrent Neural Network LMs composed of LSTM units, especially those augmented with an external memory, have achieved state-of-the-art results. However, these ...
Kelleher, John D., Salton, Giancarlo D.
core +3 more sources
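The "gating mechanism" named in the title above is the standard LSTM cell update; in the usual notation (not taken from the paper itself), the gates and the persisted cell state are:

    i_t = \sigma(W_i x_t + U_i h_{t-1} + b_i)            % input gate
    f_t = \sigma(W_f x_t + U_f h_{t-1} + b_f)            % forget gate
    o_t = \sigma(W_o x_t + U_o h_{t-1} + b_o)            % output gate
    \tilde{c}_t = \tanh(W_c x_t + U_c h_{t-1} + b_c)     % candidate cell content
    c_t = f_t \odot c_{t-1} + i_t \odot \tilde{c}_t      % persisted cell state
    h_t = o_t \odot \tanh(c_t)                           % hidden state

The cell state c_t is what persists across time steps: the forget gate f_t controls how much of it survives, and the input gate i_t controls how much new content is written into it.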
Attention mechanisms in the CHREST cognitive architecture [PDF]
In this paper, we describe the attention mechanisms in CHREST, a computational architecture of human visual expertise. CHREST organises information acquired by direct experience from the world in the form of chunks.
A. Newell+27 more
core +1 more source
Speech Emotion Recognition Using Multi-hop Attention Mechanism
In this paper, we are interested in exploiting textual and acoustic data of an utterance for the speech emotion classification task. The baseline approach models the information from audio and text independently using two deep neural networks (DNNs). The ...
Byun, Seokhyun+3 more
core +1 more source
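As an illustration of the baseline described in the entry above, two independent recurrent encoders over audio frames and word embeddings whose utterance-level vectors are fused for classification, here is a minimal PyTorch sketch. Layer types, dimensions, and the single dot-product attention hop are assumptions made for the sketch, not details from the paper.

    import torch
    import torch.nn as nn

    class TwoStreamBaseline(nn.Module):
        def __init__(self, audio_dim=40, text_dim=300, hidden=128, n_classes=4):
            super().__init__()
            self.audio_rnn = nn.GRU(audio_dim, hidden, batch_first=True)
            self.text_rnn = nn.GRU(text_dim, hidden, batch_first=True)
            self.classifier = nn.Linear(2 * hidden, n_classes)

        def forward(self, audio, text):
            # audio: (batch, frames, audio_dim); text: (batch, tokens, text_dim)
            audio_states, _ = self.audio_rnn(audio)       # per-frame hidden states
            _, text_last = self.text_rnn(text)            # final text hidden state
            text_vec = text_last.squeeze(0)               # (batch, hidden)

            # One illustrative attention hop: the text vector queries the audio frames.
            scores = torch.bmm(audio_states, text_vec.unsqueeze(2)).squeeze(2)
            weights = torch.softmax(scores, dim=1)        # (batch, frames)
            audio_vec = torch.bmm(weights.unsqueeze(1), audio_states).squeeze(1)

            return self.classifier(torch.cat([audio_vec, text_vec], dim=1))

    logits = TwoStreamBaseline()(torch.randn(2, 100, 40), torch.randn(2, 12, 300))

The returned logits have shape (batch, n_classes); the single hop in which the text vector attends over audio frames is only a stand-in for the multi-hop scheme named in the title.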
Neuropsychological evidence for three distinct motion mechanisms [PDF]
Published in final edited form as: Neurosci Lett. 2011 May 16; 495(2): 102–106. doi:10.1016/j.neulet.2011.03.048. We describe psychophysical performance of two stroke patients with lesions in distinct cortical regions in the left hemisphere.
Dumoulin, Serge O., Vaina, Lucia M.
core +1 more source
Satirical News Detection and Analysis using Attention Mechanism and Linguistic Features
Satirical news is considered to be entertainment, but it is potentially deceptive and harmful. Even though the genre is embedded in the article, not everyone can recognize the satirical cues, and some readers therefore believe the news to be true. We observe that satirical ...
Dragut, Eduard+2 more
core +1 more source
Most of the Neural Machine Translation (NMT) models are based on the sequence-to-sequence (Seq2Seq) model with an encoder-decoder framework equipped with the attention mechanism.
Li, Muyu+4 more
core +1 more source
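Several of the results above rest on the attention step inside a Seq2Seq encoder-decoder, so a minimal sketch of that step may help. Dot-product scoring and the tensor shapes are assumptions made for illustration; the models referenced may use additive (Bahdanau-style) scoring instead.

    import torch

    def attention_context(decoder_state, encoder_states):
        """decoder_state: (batch, hidden); encoder_states: (batch, src_len, hidden)."""
        # Score each source position against the current decoder state.
        scores = torch.bmm(encoder_states, decoder_state.unsqueeze(2)).squeeze(2)
        weights = torch.softmax(scores, dim=1)   # attention distribution over the source
        # Weighted sum of encoder states = context vector fed back into the decoder.
        context = torch.bmm(weights.unsqueeze(1), encoder_states).squeeze(1)
        return context, weights

    ctx, attn = attention_context(torch.randn(2, 128), torch.randn(2, 7, 128))

At each decoding step the context vector is a weighted average of encoder states, with weights reflecting how well each source position matches the current decoder state.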
The dual nature of TDC – bridging dendritic and T cells in immunity
TDC are hematopoietic cells combining dendritic and T cell features. They reach secondary lymphoid organs (SLOs) and peripheral organs (liver and lungs) after FLT3-dependent development in the bone marrow and maturation in the thymus. TDC are activated and enriched in SLOs upon viral infection, suggesting that they might play unique immune roles, since ...
Maria Nelli, Mirela Kuka
wiley +1 more source
Link Prediction with Mutual Attention for Text-Attributed Networks
In this extended abstract, we present an algorithm that learns a similarity measure between documents from the network topology of a structured corpus.
Brochier, Robin+2 more
core +2 more sources
In vivo IL-10 produced by tissue-resident tolDC is involved in maintaining/inducing tolerance. Depending on the agent used for ex vivo tolDC generation, cells acquire common features but prime T cells towards anergy, FOXP3+ Tregs, or Tr1 cells according to the levels of IL-10 produced. Ex vivo-induced tolDC were administered to patients to re-establish/ ...
Konstantina Morali+3 more
wiley +1 more source