Results 51 to 60 of about 4,980,537 (374)
In this article, we investigate the Wong-Zakai approximations of a class of second order non-autonomous stochastic lattice systems with additive white noise.
Xintao Li
In this paper, we first establish some sufficient conditions for the existence and construction of a random exponential attractor for a continuous cocycle on a separable Banach space.
Zhaojuan Wang, Shengfan Zhou
Periodic attractors of random truncator maps [PDF]
8 pages, presented at ...
Ted Theodosopoulos, Robert P. Boyer
In this paper, we consider the backward asymptotically autonomous dynamical behavior of fractional non-autonomous nonclassical diffusion equations driven by a Wong–Zakai approximation process in H^s(ℝ^n) with s ∈ (0, 1).
Hong Li, Fuzhi Li
Additive noise destroys the random attractor close to bifurcation [PDF]
We provide an example of stabilization by noise. Due to the presence of higher-order differential operators, our approach does not rely on monotonicity arguments, i.e., on the preserved order of solutions.
L. Bianchi, D. Blömker, Meihua Yang
In this paper, we consider the asymptotic behavior of solutions to stochastic strongly damped wave equations with variable delays on unbounded domains, which is driven by both additive noise and deterministic non-autonomous forcing.
Li Yang
This paper investigates the existence of a random attractor for stochastic Boussinesq equations driven by multiplicative white noise in both the velocity and temperature equations, and estimates the Hausdorff dimension of the random attractor.
Yin Li, Ruiying Wei, Donghong Cai
This paper deals with the asymptotic behavior of solutions to non-autonomous, fractional, stochastic p-Laplacian equations driven by additive white noise and random terms defined on the unbounded domain ℝ^N.
Renhai Wang, Bixiang Wang
Data‐driven performance metrics for neural network learning
The effectiveness of data-driven neural learning is addressed in terms of both local-minima trapping and convergence rate. These issues are investigated in a case study involving the training of one-hidden-layer feedforward neural networks with the extended Kalman filter, which reduces the search for the optimal network parameters to a state ...
Angelo Alessandri +2 more
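The last abstract's core idea, treating network training as state estimation, can be illustrated with a minimal sketch: the weight vector is modeled as the state of a random-walk system, each training sample supplies one scalar measurement through the network, and an extended Kalman filter performs the sequential update. This is a generic EKF training sketch under assumed settings (tanh hidden layer, scalar output, the hyperparameters `P0`, `R`, `Q` chosen arbitrarily), not the specific formulation of the cited paper.

```python
import numpy as np

def ekf_train(X, y, n_hidden=8, P0=100.0, R=0.1, Q=1e-6, epochs=30, seed=0):
    """Train a one-hidden-layer tanh network with scalar output via the
    extended Kalman filter: weights are the state of a random-walk model,
    each sample (x, t) is one scalar measurement of the network output."""
    rng = np.random.default_rng(seed)
    n_in = X.shape[1]
    n_w = n_hidden * n_in + 2 * n_hidden + 1   # W1, b1, w2, b2 packed flat
    w = 0.1 * rng.standard_normal(n_w)
    P = P0 * np.eye(n_w)                       # weight-error covariance

    def unpack(w):
        i = n_hidden * n_in
        W1 = w[:i].reshape(n_hidden, n_in)
        b1 = w[i:i + n_hidden]
        w2 = w[i + n_hidden:i + 2 * n_hidden]
        return W1, b1, w2, w[-1]

    def forward(w, x):
        W1, b1, w2, b2 = unpack(w)
        h = np.tanh(W1 @ x + b1)
        return h @ w2 + b2, h

    def jacobian(w, x):
        # Measurement Jacobian H: gradient of the scalar output w.r.t.
        # every weight, in the same order as the packed weight vector.
        yhat, h = forward(w, x)
        _, _, w2, _ = unpack(w)
        dh = (1.0 - h ** 2) * w2               # back-prop through tanh
        H = np.concatenate([np.outer(dh, x).ravel(), dh, h, [1.0]])
        return H, yhat

    for _ in range(epochs):
        for x, t in zip(X, y):
            H, yhat = jacobian(w, x)
            S = H @ P @ H + R                  # innovation variance (scalar)
            K = (P @ H) / S                    # Kalman gain
            w = w + K * (t - yhat)             # measurement update
            P = P - np.outer(K, H @ P) + Q * np.eye(n_w)
    return w, forward
```

Because the filter folds curvature information from `P` into every update, such schemes typically converge in far fewer passes than plain gradient descent, at the cost of maintaining the full weight covariance.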