Results 31 to 40 of about 21,160 (284)
Generalizing with perceptrons in case of structured phase- and pattern-spaces
We investigate the influence of different kinds of structure on the learning behaviour of a perceptron performing a classification task defined by a teacher rule. The underlying pattern distribution is permitted to have spatial correlations.
Anlauf J K +24 more
core +2 more sources
For the widely used multilayer perceptrons (MLPs), there exist singularities in the parameter space: subspaces on which the Fisher information matrix (FIM) degenerates.
Weili Guo, Guangyu Li, Jianfeng Lu
doaj +1 more source
Error correcting code using tree-like multilayer perceptron
An error correcting code using a tree-like multilayer perceptron is proposed. An original message $\mathbf{s}^0$ is encoded into a codeword $\mathbf{y}_0$ using a tree-like committee machine (committee tree) or a tree-like parity machine (parity tree ...
A. Engel +8 more
core +1 more source
On the capabilities of multilayer perceptrons
What is the smallest multilayer perceptron able to compute arbitrary and random functions? Previous results show that a net with one hidden layer containing N-1 threshold units is capable of implementing an arbitrary dichotomy of N points. A construction is presented here for implementing an arbitrary dichotomy with one hidden layer containing ...
openaire +2 more sources
The ability to produce high-resolution climate maps is crucial for assessing climate change impacts and mitigating climate disasters and risks in developing countries.
Yue Han +2 more
doaj +1 more source
Properties of an invariant set of weights of perceptrons [PDF]
In this paper, the dynamics of weights of perceptrons are investigated based on the perceptron training algorithm. In particular, the condition that the system map is not injective is derived. Based on the derived condition, an invariant set that results ...
Ho, Charlotte Yuk-Fan +4 more
core +1 more source
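The entry above studies weight dynamics under the perceptron training algorithm. As context, a minimal sketch of that classic update rule (a hypothetical NumPy formulation with ±1 labels; not code from the paper) is:

```python
import numpy as np

def perceptron_step(w, x, y, lr=1.0):
    """One step of the classic perceptron training rule.

    w: weight vector, x: input pattern, y: target label in {-1, +1}.
    The weights change only when the current prediction is wrong.
    """
    pred = 1 if np.dot(w, x) >= 0 else -1
    if pred != y:
        w = w + lr * y * x
    return w
```

Iterating this map over a pattern set is exactly the dynamical system whose injectivity the paper analyses: a correctly classified pattern maps the weights to themselves.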
Artificial Neural Networks (ANNs) have been phenomenally successful on various pattern recognition tasks. However, the design of neural networks relies heavily on the experience and intuition of individual developers. In this article, the author introduces a mathematical structure called MLP algebra on the set of all Multilayer Perceptron Neural Networks ...
openaire +2 more sources
Sinusoidal Approximation Theorem for Kolmogorov–Arnold Networks
The Kolmogorov–Arnold representation theorem states that any continuous multivariable function can be exactly represented as a finite superposition of continuous single-variable functions.
Sergei Gleyzer +3 more
doaj +1 more source
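For reference, the Kolmogorov–Arnold representation the entry refers to states that any continuous $f$ on $[0,1]^n$ can be written as

```latex
f(x_1, \dots, x_n) = \sum_{q=0}^{2n} \Phi_q\!\left( \sum_{p=1}^{n} \phi_{q,p}(x_p) \right)
```

with continuous single-variable functions $\Phi_q$ and $\phi_{q,p}$; Kolmogorov–Arnold networks replace these fixed functions with learnable ones.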
Human-Centric AI for Trustworthy IoT Systems With Explainable Multilayer Perceptrons
Internet of Things (IoT) systems widely use artificial intelligence (AI) techniques to analyse data in order to learn from user actions, support decisions, track relevant aspects of the user, and issue notifications of certain events when appropriate.
Ivan Garcia-Magarino +2 more
doaj +1 more source
Statistical mechanics of lossy compression using multilayer perceptrons
Statistical mechanics is applied to lossy compression using multilayer perceptrons for unbiased Boolean messages. We utilize a tree-like committee machine (committee tree) and tree-like parity machine (parity tree) whose transfer functions are monotonic.
C. E. Shannon +5 more
core +1 more source
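Both the error-correction and the lossy-compression entries above build on tree-like committee and parity machines. A minimal sketch of these two tree architectures (a hypothetical NumPy illustration assuming K hidden units over disjoint receptive fields with ±1 outputs; not code from either paper):

```python
import numpy as np

def committee_tree(w, x):
    # Tree-like committee machine: the input is split into K disjoint
    # receptive fields (one row of x per hidden perceptron); the output
    # is the majority vote (sign of the sum) of the hidden sign units.
    hidden = np.sign(np.einsum('kn,kn->k', w, x))
    return np.sign(hidden.sum())

def parity_tree(w, x):
    # Tree-like parity machine: same tree structure, but the output is
    # the product of the hidden sign units.
    hidden = np.sign(np.einsum('kn,kn->k', w, x))
    return np.prod(hidden)
```

Using an odd number of hidden units keeps the committee vote from tying; the two machines differ only in how the hidden outputs are combined.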

