Results 311 to 320 of about 460,152
Some of the following articles may not be open access.
1993
Publisher Summary This chapter focuses on Markov chains in probability theory. It presents a stochastic process {Xn, n = 0, 1, 2, …} that takes on a finite or countable number of possible values. This set of possible values of the process is denoted by the set of nonnegative integers {0, 1, 2, …}.
openaire +2 more sources
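For reference, the defining property of such a process {Xn} can be written as follows; this is the standard textbook formulation of the Markov property, not a quotation from the chapter:

P(X_{n+1} = j \mid X_n = i_n, X_{n-1} = i_{n-1}, \ldots, X_0 = i_0) = P(X_{n+1} = j \mid X_n = i_n)

for all states i_0, …, i_n, j and all n ≥ 0: the conditional distribution of the next state depends on the past only through the current state.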
2005
Abstract Useful models of the real world have to satisfy two conflicting requirements: they must be sufficiently complicated to describe complex systems, but they must also be sufficiently simple for us to analyse them. This chapter introduces Markov chains, which have successfully modelled a huge range of scientific and social phenomena,
openaire +1 more source
Nature Methods, 2019
You can look back there to explain things, but the explanation disappears. You’ll never find it there. Things are not explained by the past. They’re explained by what happens now.
Naomi Altman +2 more
openaire +2 more sources
1992
Abstract In Section 7.3, we briefly introduced models for Markov chains in the simple case where there were only two possible responses: an event occurs or not. However, such models have much wider application. In continuous time models, where each subject is in one of several possible states at any given point, they are often called ...
openaire +1 more source
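For the two-response case recalled above (an event occurs or it does not), the one-step behaviour of the chain is summarised by a 2 × 2 transition matrix; the notation below is a generic illustration, not taken from the text:

P = \begin{pmatrix} p_{00} & p_{01} \\ p_{10} & p_{11} \end{pmatrix}, \qquad p_{00} + p_{01} = 1, \quad p_{10} + p_{11} = 1,

where state 0 stands for "no event", state 1 for "event occurs", and p_{ij} = P(X_{n+1} = j \mid X_n = i).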
1968
Publisher Summary This chapter focuses on Markov chains. A discrete time Markov chain {Xn} is a Markov stochastic process whose state space is a countable or finite set, and for which T = (0, 1, 2, …). When one-step transition probabilities are independent of the time variable, that is, of the value of n, it is said that the Markov process has ...
Samuel Karlin, Howard M. Taylor
openaire +3 more sources
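To illustrate the time-homogeneous case described in this abstract (one-step transition probabilities that do not depend on n), here is a minimal Python sketch that simulates such a chain from a fixed transition matrix; the matrix entries are invented for the example:

import numpy as np

# One-step transition matrix: P[i, j] = P(X_{n+1} = j | X_n = i).
# The same matrix is used at every step (time homogeneity).
P = np.array([[0.9, 0.1, 0.0],
              [0.2, 0.5, 0.3],
              [0.0, 0.4, 0.6]])

def simulate(P, x0, n_steps, rng=np.random.default_rng(0)):
    """Simulate a discrete-time, time-homogeneous Markov chain."""
    path = [x0]
    for _ in range(n_steps):
        # The next state depends only on the current state, via row P[current].
        path.append(rng.choice(len(P), p=P[path[-1]]))
    return path

print(simulate(P, x0=0, n_steps=10))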
Markov Chains and Monte Carlo Markov Chains
2013
The theory of Markov chains is rooted in the work of Russian mathematician Andrey Markov, and has an extensive body of literature to establish its mathematical foundations. The availability of computing resources has recently made it possible to use Markov chains to analyze a variety of scientific data, and Monte Carlo Markov chains are now one of the ...
openaire +2 more sources
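As a concrete illustration of the Monte Carlo use of Markov chains mentioned in this abstract, the following random-walk Metropolis sketch in Python builds a chain whose stationary distribution is a chosen target density; the target and step size are arbitrary choices for the example, not anything taken from the source:

import numpy as np

def metropolis(log_target, x0, n_samples, step=1.0, rng=np.random.default_rng(0)):
    """Random-walk Metropolis: the samples form a Markov chain whose
    stationary distribution is the (possibly unnormalised) target density."""
    x = x0
    samples = []
    for _ in range(n_samples):
        proposal = x + step * rng.normal()
        # Accept with probability min(1, target(proposal) / target(x)).
        if np.log(rng.uniform()) < log_target(proposal) - log_target(x):
            x = proposal
        samples.append(x)
    return np.array(samples)

# Example: sample from a standard normal distribution (log density up to a constant).
samples = metropolis(lambda x: -0.5 * x * x, x0=0.0, n_samples=5000)
print(samples.mean(), samples.std())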
Markov chains and Markov chain Monte Carlo methods
2022
Final degree project in Mathematics, Facultat de Matemàtiques, Universitat de Barcelona, year: 2022, supervisor: Carles Rovira ...
openaire +1 more source
On Markov chains and filtrations [PDF]
In this paper we rederive some well known results for continuous time Markov processes that live on a finite state space. Martingale techniques are used throughout the paper. Special attention is paid to the construction of a continuous time Markov process, when we start from a discrete time Markov chain.
openaire +2 more sources
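A standard way to build a continuous-time chain from a discrete-time one, in the spirit of the construction this abstract refers to (though not necessarily the exact construction used in the paper), is to run the discrete chain at the jump times of an independent Poisson process:

X_t = Y_{N_t}, \qquad t \ge 0,

where (Y_n) is the discrete-time Markov chain with one-step transition matrix P and (N_t) is an independent Poisson process of rate \lambda; the resulting process (X_t) is a continuous-time Markov chain with generator Q = \lambda (P - I).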
2016
The goal of this project is to analyze Markov chains in detail: a stochastic process characterized by the fact that the outcome at a given stage depends only on the outcome at the previous stage. We will study one particular type of Markov chain, namely discrete chains.
openaire +1 more source
1990
Markovian processes (semi-Markov and Markov) belong to a wider class of processes that combine an explicit time dependence (the dynamic aspect) with a stochastic evolution of the states (the probabilistic aspect). They are part of dynamic probabilistic systems.
openaire +2 more sources