Results 11 to 20 of about 806,463
Training deep quantum neural networks [PDF]
It is hard to design quantum neural networks able to work with quantum data. Here, the authors propose a noise-robust architecture for a feedforward quantum neural network, with qudits as neurons and arbitrary unitary operations as perceptrons, whose ...
Kerstin Beer +6 more
doaj +4 more sources
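The perceptron-as-unitary idea in the entry above can be sketched numerically. The following is a minimal NumPy illustration, not the authors' implementation: a perceptron is an arbitrary unitary acting on the input register together with a fresh output qudit initialized to |0><0|, after which the input register is traced out. The dimensions, the random-unitary construction, and the example input state are illustrative assumptions.

import numpy as np

def random_unitary(d, rng):
    # Haar-style random unitary from the QR decomposition of a complex Gaussian matrix
    z = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    q, r = np.linalg.qr(z)
    return q * (np.diagonal(r) / np.abs(np.diagonal(r)))    # fix column phases

def perceptron(rho_in, U, d_in, d_out):
    # Attach a fresh output qudit in |0><0|, apply the perceptron unitary U
    # (of dimension d_in * d_out), then trace out the input register.
    ket0 = np.zeros((d_out, 1)); ket0[0, 0] = 1.0
    rho = np.kron(rho_in, ket0 @ ket0.T)
    rho = U @ rho @ U.conj().T
    rho = rho.reshape(d_in, d_out, d_in, d_out)
    return np.einsum('iaib->ab', rho)                        # partial trace over the input

rng = np.random.default_rng(0)
d_in, d_out = 2, 2
rho_in = np.array([[1.0, 0.0], [0.0, 0.0]])                  # input qubit in state |0><0|
rho_out = perceptron(rho_in, random_unitary(d_in * d_out, rng), d_in, d_out)
print(np.trace(rho_out).real)                                # ≈ 1.0, i.e. a valid output state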
Evolutional deep neural network [PDF]
The notion of an Evolutional Deep Neural Network (EDNN) is introduced for the solution of partial differential equations (PDEs). The parameters of the network are trained to represent the initial state of the system only, and are subsequently updated dynamically, without any further training, to provide an accurate prediction of the evolution of the solution of the PDE.
Yifan Du, Tamer A. Zaki
openaire +3 more sources
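To make the "fit the initial condition once, then evolve the parameters without further training" idea concrete, here is a hedged NumPy sketch for the 1-D heat equation u_t = nu * u_xx. A fixed sine basis stands in for the deep network for brevity, and the explicit Euler time integrator, grid sizes, and variable names are illustrative assumptions rather than the paper's implementation; the parameter update is a least-squares projection of the PDE right-hand side onto the model's parameter sensitivities.

import numpy as np

nu, n_basis, n_pts, dt = 0.1, 8, 64, 1e-3
x = np.linspace(0.0, 1.0, n_pts)
k = np.arange(1, n_basis + 1)
Phi = np.sin(np.outer(x, k * np.pi))        # columns play the role of d u / d theta
Phi_xx = -(k * np.pi) ** 2 * Phi            # second spatial derivative of each basis column

# Fit the parameters to the initial condition u(x, 0) = sin(pi x): the only "training" step.
theta, *_ = np.linalg.lstsq(Phi, np.sin(np.pi * x), rcond=None)

for _ in range(1000):                       # advance to t = 1.0 with explicit Euler
    rhs = nu * (Phi_xx @ theta)             # N(u) evaluated at the collocation points
    theta_dot, *_ = np.linalg.lstsq(Phi, rhs, rcond=None)
    theta = theta + dt * theta_dot

print(theta[0])                             # ≈ exp(-nu * pi**2) ≈ 0.37 at t = 1.0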
Orthogonal Deep Neural Networks [PDF]
In this paper, we introduce algorithms for Orthogonal Deep Neural Networks (OrthDNNs), connecting with the recent interest in spectrally regularized deep learning methods. OrthDNNs are theoretically motivated by a generalization analysis of modern DNNs, with the aim of identifying properties of network weight solutions that guarantee better generalization.
Shuai Li +4 more
openaire +3 more sources
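A common concrete realization of spectral/orthogonality regularization (not necessarily the exact OrthDNN algorithms of the paper) is a soft penalty ||W W^T - I||_F^2 on each weight matrix, added to the task loss. A minimal PyTorch sketch, where model, task_loss, and lambda_orth are assumed names:

import torch

def orthogonality_penalty(weight):
    # Soft orthogonality term ||W W^T - I||_F^2; conv kernels are flattened to 2-D.
    w = weight.view(weight.size(0), -1)
    gram = w @ w.t()
    eye = torch.eye(gram.size(0), device=w.device, dtype=w.dtype)
    return ((gram - eye) ** 2).sum()

# Inside a training step:
# loss = task_loss + lambda_orth * sum(orthogonality_penalty(p)
#                                      for p in model.parameters() if p.dim() >= 2)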
Tweaking Deep Neural Networks [PDF]
Deep neural networks are trained to maximize a form of overall accuracy on the given training data. This makes it difficult to adjust a trained network to improve the accuracy of specific problematic classes, or of classes of interest that may be valuable to particular users or applications.
Kim, Jinwook +2 more
openaire +3 more sources
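One generic way to trade overall accuracy for accuracy on particular classes (a stand-in for illustration, not the paper's tweaking method) is a short fine-tuning pass with a class-weighted loss. A PyTorch sketch, where model, loader, and optimizer are assumed to exist and the class indices are hypothetical:

import torch
import torch.nn as nn

num_classes, boosted_class = 10, 3
weights = torch.ones(num_classes)
weights[boosted_class] = 5.0                 # emphasize the problematic class of interest
criterion = nn.CrossEntropyLoss(weight=weights)

# Short fine-tuning loop:
# for x, y in loader:
#     loss = criterion(model(x), y)
#     optimizer.zero_grad(); loss.backward(); optimizer.step()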
Deep Polynomial Neural Networks [PDF]
Deep Convolutional Neural Networks (DCNNs) are currently the method of choice for both generative and discriminative learning in computer vision and machine learning. The success of DCNNs can be attributed to the careful selection of their building blocks (e.g., residual blocks, rectifiers, sophisticated normalization schemes, to mention ...
Chrysos, G.G. +5 more
openaire +4 more sources
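The polynomial idea can be illustrated with multiplicative (Hadamard-product) interactions between linearly transformed copies of the input, in the spirit of Pi-nets; the block below is a simplified PyTorch sketch with arbitrary layer sizes, not the authors' implementation.

import torch
import torch.nn as nn

class PolyBlock(nn.Module):
    # Polynomial expansion of the input via repeated Hadamard products.
    def __init__(self, dim, hidden, degree=3):
        super().__init__()
        self.maps = nn.ModuleList(nn.Linear(dim, hidden) for _ in range(degree))
        self.out = nn.Linear(hidden, dim)

    def forward(self, z):
        h = self.maps[0](z)
        for lin in self.maps[1:]:
            h = h * lin(z) + h               # each product raises the polynomial degree
        return self.out(h)

x = torch.randn(4, 16)
print(PolyBlock(16, 32)(x).shape)            # torch.Size([4, 16])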
There are many articles teaching people how to build intelligent applications with frameworks such as TensorFlow and PyTorch. Aside from highly specialized research papers, however, very few articles give a comprehensive understanding of how such frameworks are developed in the first place.
Liang Wang, Jianxin Zhao
+4 more sources
Deep Neural Networks for Network Routing [PDF]
In this work, we propose a Deep Learning (DL) based solution to the problem of routing traffic flows in computer networks. Routing decisions can be made in different ways depending on the desired objective and, based on that objective function, optimal solutions can be computed using a variety of techniques, e.g.
Reis, João +5 more
openaire +2 more sources
Statistical physics of deep neural networks: Initialization toward optimal channels
In deep learning, neural networks serve as noisy channels between input data and its latent representation. This perspective naturally relates deep learning with the pursuit of constructing channels with optimal performance in information transmission ...
Kangyu Weng +4 more
doaj +1 more source
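As a generic point of reference only (the paper derives its own initialization scheme; this sketch is not it), orthogonal initialization is one standard way to start the layer-wise "channels" close to information-preserving in PyTorch:

import torch.nn as nn

def init_orthogonal(model, gain=1.0):
    # Orthogonal weights keep the linear part of each layer norm-preserving at initialization.
    for m in model.modules():
        if isinstance(m, nn.Linear):
            nn.init.orthogonal_(m.weight, gain=gain)
            nn.init.zeros_(m.bias)

mlp = nn.Sequential(nn.Linear(128, 128), nn.Tanh(), nn.Linear(128, 128))
init_orthogonal(mlp)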
Fast Sparse Deep Neural Networks: Theory and Performance Analysis
In this paper, fast sparse deep neural networks that aim to offer an alternative way of learning in a deep structure are proposed. We examine some optimization algorithms for traditional deep neural networks and find that deep neural networks suffer from
Jin Zhao, Licheng Jiao
doaj +1 more source
Evolving Deep Neural Networks [PDF]
The success of deep learning depends on finding an architecture to fit the task. As deep learning has scaled up to more challenging tasks, the architectures have become difficult to design by hand. This paper proposes an automated method, CoDeepNEAT, for optimizing deep learning architectures through evolution.
Miikkulainen, Risto +10 more
openaire +2 more sources
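CoDeepNEAT co-evolves modules and blueprints; the toy Python sketch below only illustrates the basic evolve-evaluate-select loop, with a made-up architecture encoding and a stand-in fitness function in place of actually training each candidate network.

import random

def random_arch():
    # An architecture is encoded here simply as a list of layer widths.
    return [random.choice([32, 64, 128]) for _ in range(random.randint(2, 5))]

def mutate(arch):
    arch = list(arch)
    arch[random.randrange(len(arch))] = random.choice([32, 64, 128])
    return arch

def fitness(arch):
    # Stand-in for "train the candidate network and return its validation accuracy".
    return -abs(sum(arch) - 256) / 256.0

population = [random_arch() for _ in range(20)]
for generation in range(10):
    population.sort(key=fitness, reverse=True)
    parents = population[:5]                       # keep the fittest, mutate to refill
    population = parents + [mutate(random.choice(parents)) for _ in range(15)]

print(max(population, key=fitness))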

