Results 321 to 330 of about 2,583,540
Some of the following articles may not be open access.

Fuzzy Markov chains

Proceedings Joint 9th IFSA World Congress and 20th NAFIPS International Conference (Cat. No. 01TH8569), 2002
We first review some of the basic results of finite Markov chains based on probability theory, then we present fuzzy finite Markov chains based on possibility theory, and compare the results of the two theories. Then we introduce finite horizon Markovian decision processes based on fuzzy Markov chains and study an example in detail showing our solution
Yoichi Hayashi et al.

Markov chains

2001
Abstract: Show that any sequence of independent random variables taking values in the countable set S is a Markov chain. Under what condition is this chain homogeneous? A die is rolled repeatedly. Which of the following are Markov chains? For those that are, supply the transition matrix.
Geoffrey R Grimmett, David R Stirzaker
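The first exercise above has a short computational illustration: for independent rolls, the chain is Markov with every row of the transition matrix equal to the common distribution of a single roll, and the chain is homogeneous exactly when the rolls are identically distributed. A minimal sketch for the fair-die case (the example and the 0–5 state coding are my choices, not from the book):

```python
from fractions import Fraction

# For i.i.d. rolls of a fair die, X_{n+1} is independent of X_n, so
# P(X_{n+1} = j | X_n = i) = 1/6 for every i and j: all rows are identical.
# Homogeneity holds because every roll shares the same distribution.
N_FACES = 6
P = [[Fraction(1, 6)] * N_FACES for _ in range(N_FACES)]

# Every row is a probability distribution, and all rows coincide.
assert all(sum(row) == 1 for row in P)
assert all(row == P[0] for row in P)
```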

Revenue Management Under the Markov Chain Choice Model

Operational Research, 2017
We consider revenue management problems when customers choose among the offered products according to the Markov chain choice model. In this choice model, a customer arrives into the system to purchase a particular product.
Jacob B. Feldman, Huseyin Topaloglu

Bayesian Annealed Sequential Importance Sampling: An Unbiased Version of Transitional Markov Chain Monte Carlo

2018
The transitional Markov chain Monte Carlo (TMCMC) is one of the efficient algorithms for performing Markov chain Monte Carlo (MCMC) in the context of Bayesian uncertainty quantification in parallel...
Stephen Wu et al.

Markov chains and embedded Markov chains in geology

Journal of the International Association for Mathematical Geology, 1969
Geological data are structured as first-order, discrete-state discrete-time Markov chains in two main ways. In one, observations are spaced equally in time or space to yield transition probability matrices with nonzero elements in the main diagonal; in the other, only state transitions are recorded, to yield matrices with diagonal elements exactly ...
W. C. Krumbein, Michael F. Dacey
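The two structurings described in this abstract are easy to reproduce from a state sequence: tallying all consecutive pairs gives the equally spaced matrix with a generally nonzero diagonal, while collapsing runs of repeated states first gives the embedded matrix with a zero diagonal. A hedged sketch (the lithology codes and sequence are invented for illustration):

```python
def transition_matrix(seq, n_states, embedded=False):
    """Row-normalized count matrix of one-step transitions in seq.
    With embedded=True, consecutive repeats are collapsed first, so
    self-transitions (the main diagonal) are exactly zero."""
    s = list(seq)
    if embedded:
        s = [s[0]] + [x for prev, x in zip(s, s[1:]) if x != prev]
    counts = [[0.0] * n_states for _ in range(n_states)]
    for a, b in zip(s, s[1:]):
        counts[a][b] += 1.0
    for row in counts:
        total = sum(row)
        if total:
            row[:] = [c / total for c in row]
    return counts

# Hypothetical equally spaced lithology codes (states 0, 1, 2 are invented).
strata = [0, 0, 1, 1, 1, 2, 0, 0, 2, 2, 1]
P_spaced   = transition_matrix(strata, 3)                 # diagonal generally nonzero
P_embedded = transition_matrix(strata, 3, embedded=True)  # diagonal exactly zero
```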

Markov Chains

2006
Introduction to Markov chains and their numerical ...

Markov Chains

1993
Publisher Summary: This chapter focuses on Markov chains in probability theory. It presents a stochastic process {Xn, n = 0, 1, 2, …} that takes on a finite or countable number of possible values; this set of possible values is taken to be the nonnegative integers {0, 1, 2, …}.
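The process described in this summary can be simulated directly once a transition matrix is fixed: the next state is always drawn from the row of the matrix indexed by the current state. A minimal sketch (the two-state matrix is an invented example, not from the chapter):

```python
import random

def simulate_chain(P, x0, n_steps, seed=0):
    """Generate X0, X1, ..., Xn for a finite-state Markov chain:
    the next state is sampled from row P[current] of the transition matrix."""
    rng = random.Random(seed)
    states = list(range(len(P)))
    path = [x0]
    for _ in range(n_steps):
        path.append(rng.choices(states, weights=P[path[-1]])[0])
    return path

# Invented two-state transition matrix for illustration.
P = [[0.9, 0.1],
     [0.5, 0.5]]
path = simulate_chain(P, x0=0, n_steps=20)
```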

Markov Chains

2005
Abstract Useful models of the real world have to satisfy two conflicting requirements: they must be sufficiently complicated to describe complex systems, but they must also be sufficiently simple for us to analyse them. This chapter introduces Markov chains, which have successfully modelled a huge range of scientific and social phenomena,

Bayesian computation via the Gibbs sampler and related Markov chain Monte Carlo methods (with discussion)

1993
The use of the Gibbs sampler for Bayesian computation is reviewed and illustrated in the context of some canonical examples. Other Markov chain Monte Carlo simulation methods are also briefly described, and comments are made on the advantages of sample ...
Adrian F. M. Smith, G. Roberts
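As a concrete instance of the kind of canonical example this abstract refers to, here is a minimal Gibbs sampler sketch for a standard bivariate normal with correlation rho, where each full conditional is N(rho · other, 1 − rho²). The target distribution and parameter values are my assumptions for illustration, not taken from the paper.

```python
import random

def gibbs_bivariate_normal(rho, n_samples, seed=0):
    """Gibbs sampling for a standard bivariate normal with correlation rho.
    Each coordinate is redrawn from its full conditional in turn:
    x | y ~ N(rho * y, 1 - rho**2) and symmetrically for y | x."""
    rng = random.Random(seed)
    sd = (1.0 - rho * rho) ** 0.5
    x, y = 0.0, 0.0
    out = []
    for _ in range(n_samples):
        x = rng.gauss(rho * y, sd)  # draw x given current y
        y = rng.gauss(rho * x, sd)  # draw y given updated x
        out.append((x, y))
    return out

samples = gibbs_bivariate_normal(0.8, 5000)
```

The sample means should be near 0 and the sample correlation near 0.8, up to Monte Carlo error.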

Markov models—Markov chains

Nature Methods, 2019
You can look back there to explain things, but the explanation disappears. You’ll never find it there. Things are not explained by the past. They’re explained by what happens now.
Naomi Altman et al.
