Results 41 to 50 of about 127,617 (265)

Cellular and Network Mechanisms for Temporal Signal Propagation in a Cortical Network Model

open access: yesFrontiers in Computational Neuroscience, 2019
The mechanisms underlying effective propagation of high-intensity information over a background of irregular firing, and the response latency in cognitive processes, remain unclear. Here we propose a SSCCPI circuit to address this issue. We hypothesize that
Zonglu He
doaj   +1 more source

Research on three-step accelerated gradient algorithm in deep learning

open access: yesStatistical Theory and Related Fields, 2022
The gradient descent (GD) algorithm is a widely used optimisation method for training machine learning and deep learning models. In this paper, based on GD, Polyak's momentum (PM), and Nesterov accelerated gradient (NAG), we give the convergence of the ...
Yongqiang Lian   +2 more
doaj   +1 more source
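The three baselines this abstract names (GD, Polyak's momentum, and Nesterov accelerated gradient) have standard update rules. As a minimal sketch on a toy quadratic — not the paper's three-step algorithm, whose details the snippet truncates — they might look like:

```python
# Minimise f(x) = (x - 3)^2, whose gradient is f'(x) = 2(x - 3).
def grad(x):
    return 2.0 * (x - 3.0)

def gd(x, lr=0.1, steps=500):
    """Plain gradient descent: step against the gradient."""
    for _ in range(steps):
        x -= lr * grad(x)
    return x

def polyak(x, lr=0.1, mu=0.9, steps=500):
    """Polyak's momentum (heavy ball): accumulate a velocity term."""
    v = 0.0
    for _ in range(steps):
        v = mu * v - lr * grad(x)
        x += v
    return x

def nag(x, lr=0.1, mu=0.9, steps=500):
    """Nesterov accelerated gradient: evaluate the gradient at the
    look-ahead point x + mu*v before updating."""
    v = 0.0
    for _ in range(steps):
        v = mu * v - lr * grad(x + mu * v)
        x += v
    return x
```

All three converge to the minimiser x = 3 on this example; the step size `lr` and momentum coefficient `mu` here are illustrative choices, not values from the paper.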

Recurrent backpropagation and the dynamical approach to adaptive neural computation [PDF]

open access: yes, 1989
Error backpropagation in feedforward neural network models is a popular learning algorithm that has its roots in nonlinear estimation and optimization.
Pineda, Fernando J.
core  

Spectrally Tunable 2D Material‐Based Infrared Photodetectors for Intelligent Optoelectronics

open access: yesAdvanced Functional Materials, EarlyView.
Intelligent optoelectronics through spectral engineering of 2D material‐based infrared photodetectors. Abstract The evolution of intelligent optoelectronic systems is driven by artificial intelligence (AI). However, their practical realization hinges on the ability to dynamically capture and process optical signals across a broad infrared (IR) spectrum.
Junheon Ha   +18 more
wiley   +1 more source

Backward Signal Propagation: A Symmetry-Based Training Method for Neural Networks

open access: yesAlgorithms
While backpropagation (BP) has long served as the cornerstone of training deep neural networks, it relies heavily on strict differentiation logic and global gradient information, lacking biological plausibility. In this paper, we systematically present a
Kun Jiang, Zhihong Fu
doaj   +1 more source

Integration Method with Backpropagation [PDF]

open access: yesAl-Rafidain Journal of Computer Sciences and Mathematics, 2005
In this research, a new method (a combined method) is proposed to accelerate the backpropagation network by using the expected values of source units to update the weights; by the expected value of a unit we mean the sum of the output of the unit and its ...
Nidhal AL-Assady   +2 more
doaj   +1 more source

Backpropagation training in adaptive quantum networks

open access: yes, 2009
We introduce a robust, error-tolerant adaptive training algorithm for generalized learning paradigms in high-dimensional superposed quantum networks, or adaptive quantum networks.
A. Ferreira   +17 more
core   +1 more source

Structure–Transport–Ion Retention Coupling for Enhanced Nonvolatile Artificial Synapses

open access: yesAdvanced Functional Materials, EarlyView.
Nitrogen incorporation into the conjugated backbone of donor–acceptor polymers enables efficient charge transfer and deep ion embedding in organic electrochemical synaptic transistors (OESTs). This molecular‐level design enhances non‐volatile synaptic properties, providing a new strategy for developing high‐performance and reliable neuromorphic devices.
Donghwa Lee   +5 more
wiley   +1 more source

Recomposable Layered Metasurfaces for Wavelength‐Multiplexed Optical Encryption via Modular Diffractive Deep Neural Networks

open access: yesAdvanced Functional Materials, EarlyView.
Modular diffractive deep neural network metasurfaces encode and reconstruct holograms across layer combinations and wavelengths, enabling secure, multifunctional operation. Each layer acts independently yet composes jointly, yielding up to m(2^N − 1) channels for m wavelengths and N layers.
Cherry Park   +4 more
wiley   +1 more source

Towards New Generation, Biologically Plausible Deep Neural Network Learning

open access: yesSci, 2022
Artificial neural networks in their various different forms convincingly dominate machine learning of the present day. Nevertheless, the manner in which these networks are trained, in particular by using end-to-end backpropagation, presents a major ...
Anirudh Apparaju, Ognjen Arandjelović
doaj   +1 more source