Results 231 to 240 of about 129,225
Thermodynamic analysis and intelligent modeling of statin drugs solubility in supercritical carbon dioxide. [PDF]
Amani M, Shahrabadi A, Ardestani NS.
europepmc +1 more source
Fine-Pruning: A biologically inspired algorithm for personalization of machine learning models. [PDF]
Bingham J, Zonouz S, Aran D.
europepmc +1 more source
Fast automated adjoints for spectral PDE solvers. [PDF]
Skene CS, Burns KJ.
europepmc +1 more source
Noise-aware training of neuromorphic dynamic device networks. [PDF]
Manneschi L +16 more
europepmc +1 more source
A Study on the Grip Force of Ski Gloves with Feature Data Fusion Based on GWO-BPNN Deep Learning. [PDF]
Ma X, Gao X, Zhang Y, Gao Y.
europepmc +1 more source
Astrocyte-gated multi-timescale plasticity for online continual learning in deep spiking neural networks. [PDF]
Dong Z, He W.
europepmc +1 more source
Some of the following articles may not be open access.
IEEE Transactions on Systems, Man, and Cybernetics: Systems, 2020
We extend backpropagation (BP) learning from ordinary unidirectional training to bidirectional training of deep multilayer neural networks. This gives a form of backward chaining or inverse inference from an observed network output to a candidate input that produced the output.
Olaoluwa Adigun, Bart Kosko
openaire +1 more source
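The inverse inference described in this abstract can be illustrated with a small sketch: holding a trained network's weights fixed and running gradient descent on the *input* until the network reproduces an observed output. The tiny fixed-weight network, step size, and iteration count below are illustrative assumptions, not the authors' exact method.

```python
# Sketch of inverse inference: recover a candidate input that reproduces
# an observed output, by gradient descent on the input with weights fixed.
# Architecture and hyperparameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = 0.5 * rng.normal(size=(4, 3)), np.zeros(4)  # 3 inputs -> 4 hidden
W2, b2 = 0.5 * rng.normal(size=(2, 4)), np.zeros(2)  # 4 hidden -> 2 outputs

def forward(x):
    return W2 @ np.tanh(W1 @ x + b1) + b2

y_target = forward(np.array([0.5, -0.2, 0.8]))  # the observed output

x = np.zeros(3)                                  # candidate input, to be learned
err0 = np.linalg.norm(forward(x) - y_target)     # starting output error
for _ in range(5000):
    h = np.tanh(W1 @ x + b1)
    grad_h = W2.T @ (W2 @ h + b2 - y_target)     # backprop error to hidden layer
    x -= 0.05 * (W1.T @ (grad_h * (1 - h**2)))   # ...and down to the input
err = np.linalg.norm(forward(x) - y_target)      # final output error
print(err0, err)
```

Because the input dimension exceeds the output dimension here, many inputs map to the same output; the descent finds one candidate, which is the sense in which the inference is an inverse rather than a unique reconstruction.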
Neural Computation, 1993
When training a feedforward neural network with backpropagation (Rumelhart et al. 1986), local minima are always a problem because of the nonlinearity of the system. There have been several ways to attack this problem: for example, to restart the training by selecting a new initial point, to perform the preprocessing of the input data or the neural ...
Liping Yang, Wanzhen Yu
openaire +1 more source
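The restart strategy this abstract mentions, retraining from a new initial point to escape a local minimum, can be sketched on a toy nonconvex objective standing in for a network's loss surface. The specific function, step size, and number of restarts are illustrative assumptions.

```python
# Sketch of the random-restart strategy: run plain gradient descent from
# several random initial points and keep the best result. The 1-D
# objective below is a stand-in for a network's nonconvex loss.
import numpy as np

def f(x):                       # nonconvex "loss" with two local minima
    return x**4 - 3 * x**2 + x

def grad(x):
    return 4 * x**3 - 6 * x + 1

def descend(x, lr=0.05, steps=500):
    for _ in range(steps):
        x -= lr * grad(x)       # plain gradient descent
    return x

rng = np.random.default_rng(1)
starts = rng.uniform(-2, 2, size=10)        # fresh initial points
finals = [descend(x0) for x0 in starts]
best_x = min(finals, key=f)                 # keep the best restart
print(best_x, f(best_x))
```

A single run from an unlucky start settles in the shallower basin near x ≈ 1.17; restarting from enough random points makes it very likely that at least one run reaches the deeper minimum near x ≈ -1.30, which is the whole point of the strategy.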

