Results 331 to 340 of about 2,583,540
Some of the following articles may not be open access.
1992
Abstract In Section 7.3, we briefly introduced models for Markov chains in the simple case where there were only two possible responses: an event occurs or not. However, such models have much wider application. In continuous time models, where each subject is in one of several possible states at any given point, they are often called ...
openaire +1 more source
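To make the two-response case above concrete, here is a minimal sketch (in Python, with made-up transition probabilities, not taken from the cited chapter) of a chain whose two states are "event" and "no event":

import random

# Minimal sketch of the two-response case: states 0 = "no event", 1 = "event".
# The transition probabilities below are illustrative, not taken from the source.
P = {0: {0: 0.9, 1: 0.1},   # from "no event": stay, or an event occurs
     1: {0: 0.6, 1: 0.4}}   # from "event": back to "no event", or another event

def step(state, rng=random):
    """Draw the next state given only the current one (the Markov property)."""
    return 0 if rng.random() < P[state][0] else 1

state, events = 0, 0
for _ in range(10_000):
    state = step(state)
    events += state
print("long-run fraction of periods with an event:", events / 10_000)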
1968
Publisher Summary This chapter focuses on Markov chains. A discrete time Markov chain {Xn} is a Markov stochastic process whose state space is a countable or finite set, and for which T = (0, 1, 2, …). When one-step transition probabilities are independent of the time variable, that is, of the value of n, it is said that the Markov process has ...
SAMUEL KARLIN, HOWARD M. TAYLOR
openaire +3 more sources
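The definition above (a discrete time chain on a finite or countable state space, with one-step transition probabilities that do not depend on n) can be illustrated with a short simulation sketch; the transition matrix below is an arbitrary example, not taken from the book:

import numpy as np

# Sketch of a time-homogeneous discrete time Markov chain {X_n} on a finite
# state space: the one-step transition matrix P does not depend on n.
# The matrix below is an arbitrary example.
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.7, 0.2],
              [0.2, 0.3, 0.5]])

def simulate(P, x0, n_steps, rng=None):
    """Return a path X_0, ..., X_{n_steps} of the chain started at x0."""
    rng = rng or np.random.default_rng(0)
    path = [x0]
    for _ in range(n_steps):
        path.append(int(rng.choice(len(P), p=P[path[-1]])))
    return path

print(simulate(P, x0=0, n_steps=10))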
Markov chains and mixing times
2013
For our purposes, a Markov chain is a (finite or countable) collection of states S and transition probabilities p_ij, where i, j ∈ S. We write P = [p_ij] for the matrix of transition probabilities.
V. Climenhaga
semanticscholar +1 more source
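As a rough numerical companion to the definition above, assuming an arbitrary 2-state matrix P = [p_ij] (not from the cited notes), one can watch the rows of P^t approach the stationary distribution, which is what the mixing-time literature quantifies:

import numpy as np

# Sketch: the matrix P = [p_ij] of transition probabilities, its stationary
# distribution pi (left eigenvector for eigenvalue 1), and the total-variation
# distance of the rows of P^t from pi as t grows (a rough "mixing" check).
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
pi = pi / pi.sum()

Pt = np.eye(len(P))
for t in range(1, 21):
    Pt = Pt @ P
    tv = 0.5 * np.abs(Pt - pi).sum(axis=1).max()   # worst-case starting state
    if t % 5 == 0:
        print(f"t={t:2d}  max TV distance to pi = {tv:.4f}")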
Markov Chains and Monte Carlo Markov Chains
2013
The theory of Markov chains is rooted in the work of Russian mathematician Andrey Markov, and has an extensive body of literature to establish its mathematical foundations. The availability of computing resources has recently made it possible to use Markov chains to analyze a variety of scientific data, and Monte Carlo Markov chains are now one of the ...
openaire +2 more sources
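A minimal, generic illustration of the Markov chain Monte Carlo idea mentioned above is a random-walk Metropolis sampler; the target density, step size, and run length below are illustrative assumptions, not the methods of the cited chapter:

import math
import random

# Generic random-walk Metropolis sketch (illustrative only): its stationary
# distribution is the target density, here an unnormalised standard normal.
# Step size and run length are arbitrary choices.
def log_target(x):
    return -0.5 * x * x

def metropolis(n_samples, step=1.0, x0=0.0, rng=random):
    x, samples = x0, []
    for _ in range(n_samples):
        proposal = x + rng.gauss(0.0, step)
        log_alpha = log_target(proposal) - log_target(x)
        if log_alpha >= 0 or rng.random() < math.exp(log_alpha):
            x = proposal             # accept the proposed move
        samples.append(x)            # otherwise keep the current state
    return samples

draws = metropolis(50_000)
print("sample mean:", sum(draws) / len(draws))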
Markov chains and Markov chain Monte Carlo methods
2022
Bachelor's thesis (Treball Final de Grau) in Mathematics, Facultat de Matemàtiques, Universitat de Barcelona, Year: 2022, Advisor: Carles Rovira ...
openaire +1 more source
On Markov chains and filtrations [PDF]
In this paper we rederive some well-known results for continuous time Markov processes that live on a finite state space. Martingale techniques are used throughout the paper. Special attention is paid to the construction of a continuous time Markov process when we start from a discrete time Markov chain.
openaire +2 more sources
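The cited paper works with martingale techniques; the sketch below instead shows one common textbook construction of a continuous time Markov process from a discrete time jump chain (attach independent exponential holding times), with illustrative rates and transition probabilities:

import numpy as np

# One common construction of a continuous time Markov process on a finite
# state space (not necessarily the construction used in the cited paper):
# run a discrete time jump chain and hold each visited state for an
# independent Exponential(rate[state]) amount of time.
jump_P = np.array([[0.0, 0.5, 0.5],     # embedded discrete time chain
                   [0.3, 0.0, 0.7],     # (no self-jumps)
                   [0.6, 0.4, 0.0]])
rates = np.array([1.0, 2.0, 0.5])       # holding rates, illustrative values

def sample_path(x0, t_end, rng=None):
    """Return the visited states and the times at which the jumps occur."""
    rng = rng or np.random.default_rng(1)
    t, x, states, times = 0.0, x0, [x0], [0.0]
    while t < t_end:
        t += rng.exponential(1.0 / rates[x])           # sojourn in state x
        x = int(rng.choice(len(jump_P), p=jump_P[x]))  # next state
        states.append(x)
        times.append(round(t, 3))
    return states, times

print(sample_path(0, t_end=5.0))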
2016
The goal of this project is to analyze and understand Markov chains in detail: stochastic processes characterized by the fact that the outcome of a given stage depends only on the outcome of the previous stage. We will study one particular type of Markov chain, the discrete chains.
openaire +1 more source
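One way to make the "depends only on the previous stage" property above tangible is to note that the one-step dynamics of a discrete chain are summarized by transition counts; the sketch below (with an arbitrary true matrix, not from the project) recovers the transition matrix from a simulated sequence:

import numpy as np

# Sketch: because the next state depends only on the current one, the one-step
# dynamics of a discrete chain can be estimated from the empirical transition
# counts of an observed sequence. The "true" matrix here is arbitrary.
true_P = np.array([[0.8, 0.2],
                   [0.4, 0.6]])
rng = np.random.default_rng(2)

seq = [0]
for _ in range(100_000):
    seq.append(int(rng.choice(2, p=true_P[seq[-1]])))

counts = np.zeros((2, 2))
for a, b in zip(seq, seq[1:]):
    counts[a, b] += 1
print("estimated transition matrix:\n", counts / counts.sum(axis=1, keepdims=True))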
Markov Chain Sampling Methods for Dirichlet Process Mixture Models
2000
Radford M. Neal
semanticscholar +1 more source
1990
Markovian processes (semi-Markov and Markov) belong to a wider class of processes that combine an explicit time dependence (the dynamic aspect) with the stochastic character of the evolution of the states (the probabilistic aspect). They are part of dynamic probabilistic systems.
openaire +2 more sources
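To illustrate the semi-Markov/Markov distinction drawn above: in a semi-Markov process the sojourn times need not be exponential, so the process is Markovian only at the jump epochs. The sketch below uses arbitrary Gamma holding times and an illustrative jump chain:

import numpy as np

# Sketch of a semi-Markov process: jumps follow a discrete time Markov chain,
# but the sojourn time in each state may follow any distribution (here a
# state-dependent Gamma), so the process is Markovian only at the jump epochs.
# All numbers are illustrative.
jump_P = np.array([[0.0, 1.0],
                   [1.0, 0.0]])
holding = [lambda rng: rng.gamma(shape=2.0, scale=0.5),   # sojourn law, state 0
           lambda rng: rng.gamma(shape=1.0, scale=2.0)]   # sojourn law, state 1

def sample(x0, n_jumps, rng=None):
    rng = rng or np.random.default_rng(3)
    t, x, path = 0.0, x0, [(0.0, x0)]
    for _ in range(n_jumps):
        t += holding[x](rng)                      # non-exponential sojourn time
        x = int(rng.choice(2, p=jump_P[x]))
        path.append((round(t, 3), x))
    return path

print(sample(0, n_jumps=5))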
Markov Chain Monte Carlo: Stochastic Simulation for Bayesian Inference
Technometrics, 2008
S. Ahmed
semanticscholar +1 more source