Results 241 to 250 of about 106,956 (270)
Time course of functional and structural brain network changes after mild traumatic brain injury.
Kim E, Seo HG, Yoo RE, Oh BM.
europepmc +1 more source
Some of the following articles may not be open access.
Typologies of attentional networks
Nature Reviews Neuroscience, 2006
Attention is a central theme in cognitive science - it exemplifies the links between the brain and behaviour, and binds psychology to the techniques of neuroscience. A visionary model suggested by Michael Posner described attention as a set of independent control networks. This challenged the previously held view of attention as a uniform concept.
Amir Raz, Jason Buhle
openaire +4 more sources
Trends in Neurosciences, 1994
Recent brain-imaging and neurophysiological data indicate that attention is neither a property of a single brain area, nor of the entire brain. While attentional effects seem mediated by a relative amplification of blood flow and electrical activity in the cortical areas processing the attended computation, the details of how this is done through ...
M. I. Posner, S. Dehaene
openaire +2 more sources
Attention in Attention Networks for Person Retrieval
IEEE Transactions on Pattern Analysis and Machine Intelligence, 2021
This paper generalizes the Attention in Attention (AiA) mechanism of P. Fang et al., 2019 by employing explicit mapping in reproducing kernel Hilbert spaces to generate attention values of the input feature map. The AiA mechanism models the capacity of building inter-dependencies among the local and global features by the interaction of inner and ...
Pengfei Fang +5 more
openaire +2 more sources
Psychiatrische Praxis, 2004
In recent years it has been possible to treat attention as an organ system with its own anatomy, circuitry and set of functions. We view attention in terms of three interrelated neural networks in the human brain. These networks carry out the specific functions of developing and maintaining the alert state, orienting to sensory input, and executive ...
Jin Fan, Michael Posner
openaire +2 more sources
Guided Attention Inference Network
IEEE Transactions on Pattern Analysis and Machine Intelligence, 2020
With only coarse labels, weakly supervised learning typically uses top-down attention maps generated by back-propagating gradients as priors for tasks such as object localization and semantic segmentation. While these attention maps are intuitive and informative explanations of deep neural networks, there is no effective mechanism to manipulate the ...
Kunpeng Li +4 more
openaire +2 more sources
2020 IEEE 19th International Conference on Trust, Security and Privacy in Computing and Communications (TrustCom), 2020
Recently, graph neural networks have achieved great success in representation learning on graph-structured data. However, these networks consider only pairwise connections between nodes, which cannot model the complicated connections of real-world data. Thus, researchers have begun to pay attention to hypergraph modeling.
Chaofan Chen +3 more
openaire +1 more source
2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), 2019
Multi-band images beyond RGB are becoming popular in both commercial applications and research datasets, yet existing deep learning models were designed for academic RGB datasets. In this talk, we propose Channel Attention Networks (CAN), a deep learning model that uses soft attention on individual channels.
Alexei A. Bastidas, Hanlin Tang
openaire +1 more source
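The soft attention on individual channels that the CAN abstract describes can be sketched in a few lines. This is a minimal illustration of the general idea only, not the authors' architecture: the global average pooling and softmax weighting used here are assumptions, and CAN's learned layers and multi-band inputs are not reproduced.

```python
import numpy as np

def channel_attention(feature_map: np.ndarray) -> np.ndarray:
    """Soft attention over the channels of an (H, W, C) feature map.

    Each channel is globally average-pooled to a scalar, the scalars are
    passed through a softmax to form per-channel weights, and the map is
    rescaled channel-wise. A sketch of the idea only, not CAN itself.
    """
    pooled = feature_map.mean(axis=(0, 1))   # (C,) global average pool
    weights = np.exp(pooled - pooled.max())  # numerically stable softmax
    weights /= weights.sum()                 # per-channel weights sum to 1
    return feature_map * weights             # broadcast rescaling over H, W

# Example: a 4x4 map with 3 channels keeps its shape after reweighting.
x = np.random.rand(4, 4, 3)
y = channel_attention(x)
```

The softmax makes the channel weights compete: channels with larger average activation are amplified relative to the others, which is one simple reading of "soft attention on individual channels".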
Pathologies of brain attentional networks
Neuroscience & Biobehavioral Reviews, 2000
In the last decade, it has been possible to trace the areas of the human brain involved in a variety of cognitive and emotional processes by use of imaging technology. Brain networks that subserve attention have been described. It is now possible to use these networks as model systems for the exploration of symptoms arising from various forms of ...
A. Berger, M. I. Posner
openaire +2 more sources
Adaptive modes of attention: Evidence from attentional networks
Cortex
Posner and Petersen (1990) suggested that the attention system is composed of three networks: alerting, orienting, and executive functioning or control. Drawing on this theory, the Attentional Networks Test (ANT) was designed to quantify the functionality of the three attention networks.
Omer Linkovski +4 more
openaire +2 more sources
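The ANT quantifies each network with a simple reaction-time subtraction (the standard scoring from Fan et al., 2002). The reaction times below are made-up illustrative values, not data from the article above; only the three subtraction formulas are the established part.

```python
# Hypothetical mean reaction times (ms) per ANT condition.
rt = {
    "no_cue": 560.0, "double_cue": 525.0,
    "center_cue": 540.0, "spatial_cue": 495.0,
    "congruent": 480.0, "incongruent": 585.0,
}

# Standard ANT subtraction scores (Fan et al., 2002):
alerting = rt["no_cue"] - rt["double_cue"]        # benefit of a temporal warning cue
orienting = rt["center_cue"] - rt["spatial_cue"]  # benefit of spatial cue information
executive = rt["incongruent"] - rt["congruent"]   # cost of flanker conflict

print(alerting, orienting, executive)  # 35.0 45.0 105.0
```

Larger alerting and orienting scores indicate greater benefit from the cues, while a larger executive score indicates greater interference from conflicting flankers.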

