
Markov and Semi-Markov Processes

2018
This chapter is devoted to jump Markov processes and finite semi-Markov processes. In both cases, the index is interpreted as calendar time, counted continuously over the positive real line. Markov processes are continuous-time processes that share the Markov property with discrete-time Markov chains.
Valérie Girardin, Nikolaos Limnios
openaire   +2 more sources

Markov Functionals of an Ergodic Markov Process

Theory of Probability & Its Applications, 1995
Let \((X(t), t \geq 0)\) be a homogeneous Markov process. The author calls a random process \(\xi(t)\) a Markovian functional if the pair \((X(t), \xi(t))\) is a homogeneous Markov process. Let \(\xi_n(t)\) be a sequence of Markovian functionals with finite state space \(I = \{1, 2, \dots, d\}\) for all \(n \geq 1\) and such that the following ...
openaire   +3 more sources

Markov Processes and Markov Families

2012
In this section we shall use intuitive arguments in order to find the distribution of \(M_T\). Rigorous arguments will be provided later in this chapter, after we introduce the notion of a strong Markov family. Thus, the problem at hand may serve as a simple example motivating the study of the strong Markov property.
Leonid Koralov, Yakov G. Sinai
openaire   +2 more sources

In situ learning using intrinsic memristor variability via Markov chain Monte Carlo sampling

Nature Electronics, 2021
Thomas Dalgaty   +2 more
exaly  

Markov Processes

2004
Scott L. Miller, Donald Childers
openaire   +3 more sources

Markov State Models: From an Art to a Science

Journal of the American Chemical Society, 2018
Brooke E Husic, Vijay S Pande
exaly  

Multi-scenario simulation of urban land change in Shanghai by random forest and CA-Markov model

Sustainable Cities and Society, 2020
Xuewei Dang, Shaohua Wang
exaly  
