Results 51 to 60 of about 324,009

BA-Net: Bridge Attention for Deep Convolutional Neural Networks

open access: yes, 2022
In recent years, the channel attention mechanism has been widely investigated due to its great potential for improving the performance of deep convolutional neural networks (CNNs) in many vision tasks. However, in most existing methods, only the output of the adjacent convolution layer is fed into the attention layer for calculating the channel ...
Yue Zhao   +3 more
openaire   +2 more sources
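The snippet above contrasts bridge attention with channel attention that only sees the adjacent layer's output. As a rough illustration of that contrast, below is a minimal SE-style channel attention block in PyTorch that pools features from several earlier layers before producing the channel weights; the module name, reduction ratio, and fusion by concatenation are assumptions made for this sketch, not BA-Net's actual design.

```python
import torch
import torch.nn as nn

class BridgeChannelAttention(nn.Module):
    """SE-style channel attention that pools features from several preceding
    layers instead of only the adjacent one (hypothetical sketch)."""
    def __init__(self, channels_list, out_channels, reduction=16):
        super().__init__()
        total = sum(channels_list)
        self.fc = nn.Sequential(
            nn.Linear(total, out_channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(out_channels // reduction, out_channels),
            nn.Sigmoid(),
        )

    def forward(self, feats, x):
        # feats: list of feature maps from earlier layers; x: output of the current block
        pooled = [f.mean(dim=(2, 3)) for f in feats]     # global average pool each map
        weights = self.fc(torch.cat(pooled, dim=1))      # fuse and produce channel weights
        return x * weights.unsqueeze(-1).unsqueeze(-1)   # rescale the current output
```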

BitFlow-Net: Toward Fully Binarized Convolutional Neural Networks [PDF]

open access: yes, IEEE Access, 2019
Binarization can greatly compress and accelerate deep convolutional neural networks (CNNs) for real-time industrial applications. However, existing binarized CNNs (BCNNs) rely on scaling factor (SF) and batch normalization (BatchNorm) that still involve resource-consuming floating-point multiplication operations.
Lijun Wu   +6 more
openaire   +3 more sources
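For readers unfamiliar with the scaling-factor multiplication the abstract refers to, the following is a generic BNN-style sketch (not the paper's method): weights are binarized to ±1 and rescaled by a per-channel floating-point scaling factor, which is exactly the kind of floating-point operation BitFlow-Net aims to eliminate.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BinaryConv2d(nn.Conv2d):
    """Conv layer whose weights are binarized to {-1, +1} in the forward pass and
    rescaled by a per-output-channel scaling factor (generic BNN-style sketch;
    training would additionally need a straight-through estimator)."""
    def forward(self, x):
        w = self.weight
        # per-channel scaling factor: mean absolute value of the real-valued weights
        sf = w.abs().mean(dim=(1, 2, 3), keepdim=True)
        w_bin = torch.sign(w) * sf   # this floating-point multiply is what BitFlow-Net removes
        return F.conv2d(x, w_bin, self.bias, self.stride,
                        self.padding, self.dilation, self.groups)
```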

DeepID-Net: Deformable deep convolutional neural networks for object detection [PDF]

open access: yes, 2015 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2015
In this paper, we propose deformable deep convolutional neural networks for generic object detection. This new deep learning object detection framework has innovations in multiple aspects. In the proposed new deep architecture, a new deformation-constrained pooling (def-pooling) layer models the deformation of object parts with geometric constraint and ...
Shuo Yang   +10 more
openaire   +3 more sources
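The abstract describes a def-pooling layer that scores object parts under a geometric deformation constraint. A toy sketch of the underlying idea (pool a part's response map after subtracting a deformation penalty) is given below; the function name and tensor layout are assumptions, and the paper's actual layer is parameterized differently.

```python
import torch

def def_pool(part_scores, deform_cost):
    """Deformation-penalized pooling (rough sketch): for each part, keep the
    location that maximizes detection score minus a deformation penalty.
    part_scores, deform_cost: tensors of shape (num_parts, H, W)."""
    penalized = part_scores - deform_cost
    return penalized.flatten(1).max(dim=1).values   # one pooled score per part
```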

Novel multi‐domain attention for abstractive summarisation

open access: yes, CAAI Transactions on Intelligence Technology, 2023
The existing abstractive text summarisation models only consider the word-sequence correlations between the source document and the reference summary, and the summaries they generate fail to cover the subject of the source document because the models ...
Chunxia Qu   +4 more
doaj   +1 more source

A deep LSTM‐CNN based on self‐attention mechanism with input data reduction for short‐term load forecasting

open access: yes, IET Generation, Transmission & Distribution, 2023
Numerous studies on short‐term load forecasting (STLF) have used feature extraction methods to increase the model's accuracy by incorporating multidimensional features containing time, weather and distance information.
Shiyan Yi   +4 more
doaj   +1 more source
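To make the described architecture concrete, here is a hypothetical PyTorch sketch of a CNN + LSTM forecaster with a self-attention layer over the time dimension; the layer sizes, number of heads, and forecasting head are illustrative choices only, not the authors' model.

```python
import torch
import torch.nn as nn

class CNNLSTMAttention(nn.Module):
    """Illustrative CNN + LSTM forecaster with self-attention over time steps."""
    def __init__(self, n_features, hidden=64, horizon=1):
        super().__init__()
        self.conv = nn.Conv1d(n_features, hidden, kernel_size=3, padding=1)
        self.lstm = nn.LSTM(hidden, hidden, batch_first=True)
        self.attn = nn.MultiheadAttention(hidden, num_heads=4, batch_first=True)
        self.head = nn.Linear(hidden, horizon)

    def forward(self, x):                      # x: (batch, time, n_features)
        h = torch.relu(self.conv(x.transpose(1, 2))).transpose(1, 2)
        h, _ = self.lstm(h)                    # temporal encoding
        h, _ = self.attn(h, h, h)              # self-attention over time steps
        return self.head(h[:, -1])             # forecast from the last time step
```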

S-Net: a scalable convolutional neural network for JPEG compression artifact reduction [PDF]

open access: yes, Journal of Electronic Imaging, 2018
Accepted by the Journal of Electronic Imaging.
Yaowu Chen   +3 more
openaire   +4 more sources

Monocular Object Instance Segmentation and Depth Ordering with CNNs

open access: yes, 2015
In this paper we tackle the problem of instance-level segmentation and depth ordering from a single monocular image. Towards this goal, we take advantage of convolutional neural nets and train them to directly predict instance-level segmentations where ...
Fidler, Sanja   +3 more
core   +1 more source

Training Behavior of Sparse Neural Network Topologies

open access: yes, 2019
Improvements in the performance of deep neural networks have often come through the design of larger and more complex networks. As a result, fast memory is a significant limiting factor in our ability to improve network performance.
Alford, Simon   +3 more
core   +1 more source
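A common way to study sparse topologies of the kind examined in this work is to fix a random connectivity mask on a dense layer and keep it fixed throughout training. The helper below is an illustrative sketch under that assumption (the function name and density value are made up):

```python
import torch
import torch.nn as nn

def apply_sparse_topology(layer: nn.Linear, density: float = 0.1):
    """Fix a random sparse connectivity pattern on a linear layer by zeroing most
    of its weights and storing the mask as a buffer (illustrative sketch only)."""
    mask = (torch.rand_like(layer.weight) < density).float()
    layer.register_buffer("topology_mask", mask)
    with torch.no_grad():
        layer.weight.mul_(mask)   # prune connections outside the fixed topology
    # during training, re-apply the mask after each optimizer step to keep the topology fixed
    return layer
```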

Brain inspired neuronal silencing mechanism to enable reliable sequence identification

open access: yes, Scientific Reports, 2022
Real-time sequence identification is a core use-case of artificial neural networks (ANNs), ranging from recognizing temporal events to identifying verification codes.
Shiri Hodassman   +7 more
doaj   +1 more source

Preconditioned Stochastic Gradient Langevin Dynamics for Deep Neural Networks

open access: yes, 2015
Effective training of deep neural networks suffers from two main issues. The first is that the parameter spaces of these models exhibit pathological curvature.
Carin, Lawrence   +3 more
core   +1 more source
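Preconditioned SGLD addresses the pathological curvature mentioned above by rescaling both the gradient step and the injected noise with a diagonal preconditioner. A simplified single-step sketch (RMSProp-style preconditioner; the correction term arising from the changing preconditioner is omitted) is shown below; the function name and hyperparameter values are assumptions.

```python
import torch

def psgld_step(params, grads, state, lr=1e-3, alpha=0.99, eps=1e-5):
    """One preconditioned SGLD step with an RMSProp-style diagonal preconditioner.
    grads: gradients of the negative log posterior w.r.t. each parameter."""
    for p, g in zip(params, grads):
        v = state.setdefault(id(p), torch.zeros_like(p))
        v.mul_(alpha).addcmul_(g, g, value=1 - alpha)     # running second moment
        precond = 1.0 / (v.sqrt() + eps)                  # diagonal preconditioner
        noise = torch.randn_like(p) * torch.sqrt(lr * precond)
        p.data.add_(-0.5 * lr * precond * g + noise)      # preconditioned step + injected noise
    return state
```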
