Results 241 to 250 of about 129,225 (287)
Some of the following articles may not be open access.

Adaptability of the backpropagation procedure

IJCNN'99. International Joint Conference on Neural Networks. Proceedings (Cat. No.99CH36339), 1999
Possible paradigms for concept learning by feedforward neural networks include discrimination and recognition. An interesting aspect of this dichotomy is that the recognition-based implementation can learn certain domains much more efficiently than the discrimination-based one, despite the close structural relationship between the two systems.
Nathalie Japkowicz, Stephen Jose Hanson

The complex backpropagation algorithm

IEEE Transactions on Signal Processing, 1991
The backpropagation (BP) algorithm, a popular method for the design of multilayer neural networks, is extended to include complex coefficients and complex signals so that it can be applied to general radar signal processing and communications problems. It is shown that the network can classify complex signals. The generalization of the BP to deal with ...
Henry Leung, Simon Haykin 0001
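The complex-weight update at the heart of such algorithms can be sketched in miniature with a single complex linear neuron (essentially complex LMS) rather than the paper's full multilayer network; the toy channel `w_true`, learning rate, and variable names below are my own assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy channel: a complex linear map the neuron should learn.
w_true = np.array([0.5 - 0.2j, -0.3 + 0.8j])

w = np.zeros(2, dtype=complex)  # learned complex coefficients
lr = 0.05

for _ in range(500):
    x = rng.standard_normal(2) + 1j * rng.standard_normal(2)  # complex input
    d = w_true @ x          # desired complex output
    e = d - w @ x           # complex error
    # The gradient of |e|^2 with respect to conj(w) gives the complex update:
    w += lr * e * np.conj(x)
```

With noise-free desired outputs the learned coefficients converge to `w_true`; the same conjugate-gradient structure is what carries over to the multilayer complex case.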

Backpropagation

Kybernetes, 2001
The popular backpropagation algorithm for training neural nets is a special case of an earlier principle of significance feedback, which in turn has much in common with Selfridge’s “Pandemonium” and a connection with McCulloch’s “redundancy of potential command”.

Asymptotic Convergence of Backpropagation

Neural Computation, 1989
We calculate analytically the rate of convergence at long times in the backpropagation learning algorithm for networks with and without hidden units. For networks without hidden units using the standard quadratic error function and a sigmoidal transfer function, we find that the error decreases as 1/t for large t, and the output states approach their ...
Gerald Tesauro, Yu He, Subutai Ahmad
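The claimed 1/t decay is easy to check numerically for the no-hidden-unit case. A minimal sketch, assuming a single sigmoid unit with constant input 1, quadratic error, and a target at the sigmoid's asymptote (the regime the paper analyzes); the learning rate is arbitrary.

```python
import numpy as np

def sigmoid(w):
    return 1.0 / (1.0 + np.exp(-w))

w, lr, target = 0.0, 0.5, 1.0   # target 1.0 sits at the sigmoid's asymptote
errors = []
for t in range(1, 20001):
    y = sigmoid(w)
    e = target - y
    w += lr * e * y * (1 - y)   # gradient descent on E = 0.5 * e**2
    errors.append(0.5 * e * e)

# If E(t) ~ c/t at large t, then t * E(t) approaches a constant:
ratio = (10000 * errors[9999]) / (20000 * errors[19999])
```

The ratio staying near 1 is consistent with the 1/t law; for a target strictly inside (0, 1) the unit can match it exactly, and the error decays much faster.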

On the complex backpropagation algorithm

IEEE Transactions on Signal Processing, 1992
A recursive algorithm for updating the coefficients of a neural network structure for complex signals is presented. Various complex activation functions are considered and a practical definition is proposed. The method, associated with a mean-square-error criterion, yields the complex form of the conventional backpropagation algorithm.
Nevio Benvenuto, F. Piazza

Neocognitron learning by backpropagation

Systems and Computers in Japan, 1995
In a neural network for pattern recognition, when the selectivity of the feature‐extracting cells is lowered to enhance the generalizing power, patterns with similar shapes but belonging to different categories tend to be confused.
Michihiro Ohno   +2 more

Backpropagation for Parametric STL

2019 IEEE Intelligent Vehicles Symposium (IV), 2019
This paper proposes a method to evaluate Signal Temporal Logic (STL) robustness formulas using computation graphs. This method results in efficient computations and enables the use of backpropagation for optimizing over STL parameters. Inferring STL formulas from behavior traces can provide powerful insights into complex systems, such as long-term ...
Karen Leung   +2 more
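The core trick can be sketched: express robustness as a smooth computation graph (here a soft minimum) so it is differentiable in an STL parameter. The formula "always (signal > c)", the trace values, the smoothing constant `k`, and the fitting loop are illustrative assumptions, not the paper's implementation; finite differences stand in for automatic differentiation.

```python
import numpy as np

def soft_min(x, k=50.0):
    # Smooth, differentiable approximation of min(x): -(1/k) * logsumexp(-k*x)
    m = np.min(x)  # shift for numerical stability
    return m - np.log(np.sum(np.exp(-k * (x - m)))) / k

def robustness(signal, c):
    # Robustness of the STL formula "always (signal > c)":
    # the (smoothed) worst-case margin over the trace.
    return soft_min(signal - c)

# Infer the tightest threshold c from a behavior trace by driving the
# robustness to zero with gradient steps on r(c)**2.
signal = np.array([1.0, 0.8, 1.2, 0.9])
c, lr, eps = 0.0, 0.5, 1e-6
for _ in range(200):
    r = robustness(signal, c)
    dr_dc = (robustness(signal, c + eps) -
             robustness(signal, c - eps)) / (2 * eps)  # stand-in for autograd
    c -= lr * 2 * r * dr_dc
```

The fitted threshold lands just below the trace minimum (0.8), the tightest parameter for which the formula still holds on the trace.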

Contrast enhancement for backpropagation

IEEE Transactions on Neural Networks, 1996
This paper analyzes the effect of data contrast on a backpropagation (BP) network and introduces a data preprocessing algorithm that can improve the efficiency of standard BP learning. The basic idea is to transform input data to a range associated with the high-slope region of the sigmoid function, where a relatively large modification of weights ...
Taek Mu Kwon, Hui Cheng
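The preprocessing idea, mapping raw inputs into the region where the sigmoid's slope is large, can be sketched as a per-feature linear rescale. The target range [-2, 2] and the toy data are my own assumptions, not necessarily the paper's exact transform.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def rescale_to_high_slope(x, lo=-2.0, hi=2.0):
    # Map each feature linearly into [lo, hi], where the sigmoid derivative
    # sigma * (1 - sigma) is largest, so early weight updates are not
    # killed by saturation. The target range is an assumed choice.
    xmin, xmax = x.min(axis=0), x.max(axis=0)
    return lo + (hi - lo) * (x - xmin) / (xmax - xmin)

# Toy data with one low-contrast and one large-magnitude feature
raw = np.array([[0.0, 500.0], [1.0, 900.0], [0.5, 700.0]])
scaled = rescale_to_high_slope(raw)
```

On the raw second feature the sigmoid saturates and its slope is essentially zero; after rescaling, the average slope (and hence the size of the weight updates) is larger.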

Backpropagation: past and future

IEEE International Conference on Neural Networks, 1988
Some scientists have concluded that backpropagation is a specialized method for pattern classification, of little relevance to broader problems, to parallel computing, or to our understanding of the human brain. The author questions these beliefs and proposes development of a general theory of intelligence in which backpropagation and comparisons to ...

On Combining Backpropagation with Boosting

The 2006 IEEE International Joint Conference on Neural Network Proceedings, 2006
Boosting is a method for learning combined classifiers. In a boosting ensemble of classifiers trained by the backpropagation algorithm, the learning rate takes a much smaller value than when backpropagation is applied alone. We propose a method that overcomes this drawback and test it on neuro-fuzzy systems constituting a classifier ensemble ...
Marcin Korytkowski   +2 more
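A miniature version of the combination can be sketched: AdaBoost-style reweighting wrapped around base learners trained by backpropagation (here a single sigmoid neuron each, trained on weighted examples). The toy data, learning rate, and three-round ensemble are illustrative assumptions, not the paper's neuro-fuzzy setup.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_neuron(x, y, sample_w, lr=0.5, epochs=200):
    # One sigmoid neuron trained by weighted backpropagation:
    # each example's gradient is scaled by its boosting weight.
    w, b = 0.0, 0.0
    for _ in range(epochs):
        p = sigmoid(w * x + b)
        g = sample_w * (p - y) * p * (1 - p)   # weighted delta rule
        w -= lr * np.sum(g * x)
        b -= lr * np.sum(g)
    return lambda t: (sigmoid(w * t + b) > 0.5).astype(float)

# Toy 1-D data, labels in {0, 1}
x = np.array([-2.0, -1.0, -0.5, 0.5, 1.0, 2.0])
y = np.array([0.0, 0.0, 0.0, 1.0, 1.0, 1.0])

# AdaBoost-style loop over backprop-trained base learners
sample_w = np.full(len(x), 1.0 / len(x))
learners, alphas = [], []
for _ in range(3):
    h = train_neuron(x, y, sample_w)
    miss = (h(x) != y).astype(float)
    eps = np.clip(np.sum(sample_w * miss), 1e-10, 1 - 1e-10)
    alpha = 0.5 * np.log((1 - eps) / eps)   # learner's vote weight
    learners.append(h)
    alphas.append(alpha)
    # Re-weight: emphasize the examples this learner got wrong
    sample_w *= np.exp(alpha * np.where(miss == 1.0, 1.0, -1.0))
    sample_w /= sample_w.sum()

def ensemble(t):
    votes = sum(a * np.where(h(t) == 1.0, 1.0, -1.0)
                for a, h in zip(alphas, learners))
    return (votes > 0).astype(float)
```

The reweighting step is where the interaction the abstract mentions shows up: later learners see a sharply non-uniform example distribution, which changes the effective step sizes the backpropagation inner loop needs.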
