2009
A continuous-time Markov chain (CTMC) is a discrete-time Markov chain with the modification that, instead of spending one time unit in a state, it remains in a state for an exponentially distributed time whose rate depends on the state. The methodology of CTMCs is based on properties of renewal and Poisson processes as well as discrete-time chains ...
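The excerpt above describes a CTMC as a discrete-time jump chain in which each visit to a state lasts an exponentially distributed time with a state-dependent rate. A minimal simulation sketch of that construction, with illustrative states and rates not taken from any of the works listed:

```python
import random

# Hypothetical 3-state CTMC (states 0, 1, 2) for illustration only.
# rate[i] is the exponential holding rate in state i; jump[i] lists the
# embedded discrete-time chain's transition probabilities out of state i.
rate = {0: 1.0, 1: 2.0, 2: 0.5}
jump = {0: [(1, 0.7), (2, 0.3)],
        1: [(0, 0.4), (2, 0.6)],
        2: [(0, 1.0)]}

def simulate_ctmc(start, horizon, rng=random.Random(42)):
    """Simulate one path up to time `horizon`; return a list of (time, state)."""
    t, state, path = 0.0, start, [(0.0, start)]
    while True:
        t += rng.expovariate(rate[state])      # exponential holding time in `state`
        if t >= horizon:
            return path
        r, acc = rng.random(), 0.0
        for nxt, p in jump[state]:             # step of the embedded jump chain
            acc += p
            if r <= acc:
                state = nxt
                break
        path.append((t, state))
```

The two ingredients named in the excerpt appear separately here: the embedded discrete-time chain chooses where to jump, while the exponential clock decides when.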
2017
As in discrete time, continuous-time Markov chains are stochastic processes in which the future depends on the past only through the present; equivalently, given the present, the past and future are independent. Since there is no "next" time step when time is continuous, the process is characterized by transition rates instead of transition probabilities.
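The rates mentioned in the excerpt above are usually collected into a generator matrix Q, with nonnegative off-diagonal entries and rows summing to zero; the transition probabilities at any time t are then recovered as P(t) = exp(tQ). A small sketch with an illustrative generator (the numbers are assumptions, not from the source), using a truncated power series for the matrix exponential:

```python
# Illustrative 3-state generator: q_ij >= 0 for i != j, each row sums to 0.
Q = [[-2.0,  1.5,  0.5],
     [ 1.0, -1.0,  0.0],
     [ 0.5,  0.5, -1.0]]

def mat_mul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def transition_matrix(Q, t, terms=60):
    """Approximate P(t) = exp(tQ) by its truncated power series sum_k (tQ)^k / k!."""
    n = len(Q)
    P = [[float(i == j) for j in range(n)] for i in range(n)]   # identity = k=0 term
    term = [row[:] for row in P]
    for k in range(1, terms):
        term = mat_mul(term, [[t * q / k for q in row] for row in Q])
        P = [[P[i][j] + term[i][j] for j in range(n)] for i in range(n)]
    return P
```

Because each row of Q sums to zero, every row of P(t) sums to one, so the rate description and the probability description are two views of the same chain.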
1997
In this chapter, we consider the continuous-time analogs of discrete-time Markov chains. As in the discrete-time case, they are characterized by the Markov property that, given the present state, the future of the process is stochastically independent of the past.
2002
We turn now to the continuous-time version of the Markov property. Some of the simplicity of Chapter 2 is retained, because we assume the state space S is discrete. Usually we can suppose that S = {0, 1, … }. The succession of states visited still follows a discrete-parameter Markov chain, but now the flow of time is perturbed by exponentially ...
2019
The first part of this chapter presented the definition of a continuous-time Markov chain via two characterizing properties, and introduced the birth-and-death (B&D) process through special examples such as the homogeneous Poisson process, a pure birth process, and a population model as a B&D process.
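The B&D structure described in the excerpt above can be sketched directly: from state n the process jumps to n+1 at rate birth(n) and to n-1 at rate death(n), and taking a constant birth rate with zero death rate recovers the homogeneous Poisson process as a pure birth process. The rate functions below are illustrative assumptions, not the source's parameters:

```python
import random

def simulate_bd(n0, horizon, birth, death, rng=random.Random(7)):
    """Return a list of (time, population) for one B&D path up to `horizon`."""
    t, n, path = 0.0, n0, [(0.0, n0)]
    while True:
        total = birth(n) + death(n)
        if total == 0.0:                    # absorbing state (e.g. extinction at n = 0)
            return path
        t += rng.expovariate(total)         # exponential time until the next event
        if t >= horizon:
            return path
        n += 1 if rng.random() < birth(n) / total else -1
        path.append((t, n))

# Pure birth at constant rate 3.0: a homogeneous Poisson process of rate 3.0.
poisson_path = simulate_bd(0, 5.0, birth=lambda n: 3.0, death=lambda n: 0.0)

# A linear-rate population model (hypothetical rates): births at 0.8*n, deaths at 1.0*n.
pop_path = simulate_bd(5, 2.0, birth=lambda n: 0.8 * n, death=lambda n: 1.0 * n)
```

The same routine covers both special cases in the excerpt: the Poisson process only ever steps up, while the population model can also step down and is absorbed if it hits zero.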

