<Tr> <Th> Discrete-time </Th> <Td> (Discrete-time) Markov chain on a countable or finite state space </Td> <Td> Harris chain (Markov chain on a general state space) </Td> </Tr>
<Tr> <Th> Continuous-time </Th> <Td> Continuous-time Markov process or Markov jump process </Td> <Td> Any continuous stochastic process with the Markov property, e.g., the Wiener process </Td> </Tr>
<P> Note that there is no definitive agreement in the literature on the use of some of the terms that signify special cases of Markov processes. The term "Markov chain" is usually reserved for a process with a discrete set of times, i.e., a discrete-time Markov chain (DTMC), but a few authors use the term "Markov process" to refer to a continuous-time Markov chain (CTMC) without explicit mention. In addition, there are other extensions of Markov processes that are referred to as such but do not necessarily fall within any of these four categories (see Markov model). Moreover, the time index need not be real-valued; as with the state space, there are processes that move through index sets with other mathematical constructs. The general-state-space continuous-time Markov chain is so general that it has no designated term. </P>
<P> While the time parameter is usually discrete, the state space of a Markov chain does not have any generally agreed-on restrictions: the term may refer to a process on an arbitrary state space. However, many applications of Markov chains employ finite or countably infinite state spaces, which admit a more straightforward statistical analysis. Besides time-index and state-space parameters, there are many other variations, extensions and generalizations (see Variations). For simplicity, most of this article concentrates on the discrete-time, discrete state-space case, unless mentioned otherwise. </P>
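To make the discrete-time, discrete state-space case concrete, here is a minimal sketch (not from the article) of a DTMC on a hypothetical two-state space {"sunny", "rainy"}: each row of the transition table gives the probabilities of moving from the current state to each next state, and a sample path is generated by repeatedly drawing from the row of the current state.

```python
import random

# Hypothetical two-state DTMC: row s gives P(next state | current state s).
# Each row's probabilities sum to 1.
P = {
    "sunny": {"sunny": 0.9, "rainy": 0.1},
    "rainy": {"sunny": 0.5, "rainy": 0.5},
}

def step(state, rng):
    """Sample the next state from the transition row of the current state."""
    r = rng.random()
    cumulative = 0.0
    for nxt, p in P[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point round-off

def simulate(start, n, seed=0):
    """Return a length-n sample path of the chain started from `start`."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(n - 1):
        path.append(step(path[-1], rng))
    return path

print(simulate("sunny", 10))
```

The Markov property is visible in `step`: the distribution of the next state depends only on the current state, not on the earlier history of the path.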

Markov chains with finite and countable state space