Check Markov chain for ergodicity
Determine Whether Markov Chain Is Ergodic
Consider this three-state transition matrix.
Create the Markov chain that is characterized by the transition matrix P.
P = [0 1 0; 0 0 1; 1 0 0];
mc = dtmc(P);
Determine whether the Markov chain is ergodic.

isergodic(mc)

ans =
  logical
   0

The value 0 indicates that the Markov chain is not ergodic.
Visually confirm that the Markov chain is not ergodic by plotting its eigenvalues on the complex plane.

figure;
eigplot(mc);
All three eigenvalues have modulus one. This result indicates that the period of the Markov chain is three. Periodic Markov chains are not ergodic.
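The eigenvalue computation behind the plot can be reproduced outside the toolbox. The following Python/NumPy sketch (an illustration, not the toolbox code) verifies that all three eigenvalues of this cyclic transition matrix have modulus one:

```python
import numpy as np

# The three-state cyclic transition matrix from the example above.
P = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [1.0, 0.0, 0.0]])

# For a chain with period 3, the eigenvalues are the cube roots of
# unity, so every eigenvalue lies on the unit circle.
eigvals = np.linalg.eigvals(P)
print(np.allclose(np.abs(eigvals), 1.0))  # True
```

Because more than one eigenvalue has modulus one, the chain is periodic and therefore not ergodic.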
mc — Discrete-time Markov chain
Discrete-time Markov chain with NumStates states and transition matrix P, specified as a dtmc object. P must be fully specified (no NaN entries).
tf — Ergodicity flag
Ergodicity flag, returned as true if mc is an ergodic Markov chain and false otherwise.
A Markov chain is ergodic if it is both irreducible and aperiodic. This condition is equivalent to the transition matrix being a primitive nonnegative matrix.
By Wielandt's theorem, the Markov chain mc is ergodic if and only if all elements of P^m are positive for m = (n – 1)^2 + 1, where P is the transition matrix (mc.P) and n is the number of states (mc.NumStates). To determine ergodicity, isergodic computes P^m and checks that every element is positive.
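Wielandt's bound makes the ergodicity test a single matrix-power check. A minimal Python/NumPy sketch of the same test (an illustrative re-implementation, not the toolbox function):

```python
import numpy as np

def is_ergodic(P):
    """Wielandt's theorem: a stochastic matrix P is primitive (the chain
    is ergodic) iff every entry of P**m is positive for
    m = (n - 1)**2 + 1, where n is the number of states."""
    n = P.shape[0]
    m = (n - 1) ** 2 + 1
    return bool(np.all(np.linalg.matrix_power(P, m) > 0))

# The periodic 3-cycle chain from the example is not ergodic.
P_cycle = np.array([[0, 1, 0], [0, 0, 1], [1, 0, 0]], dtype=float)
print(is_ergodic(P_cycle))  # False

# A strictly positive transition matrix is ergodic.
P_pos = np.full((3, 3), 1.0 / 3.0)
print(is_ergodic(P_pos))  # True
```

For the 3-cycle chain, m = (3 – 1)^2 + 1 = 5, and P^5 = P^2 still contains zeros, so the test correctly reports that the chain is not ergodic.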
By the Perron–Frobenius theorem, ergodic Markov chains have unique limiting distributions. That is, they have unique stationary distributions to which every initial distribution converges. Ergodic unichains, which consist of a single ergodic class plus transient classes, also have unique limiting distributions (with zero probability mass in the transient classes).
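The convergence claim can be demonstrated numerically. The following Python/NumPy sketch uses a hypothetical strictly positive (hence ergodic) transition matrix, computes its stationary distribution as the left eigenvector for eigenvalue 1, and checks that an arbitrary initial distribution converges to it:

```python
import numpy as np

# Hypothetical ergodic chain: a strictly positive transition matrix.
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.1, 0.3, 0.6]])

# Stationary distribution pi solves pi P = pi: the left eigenvector of P
# for eigenvalue 1, normalized to sum to one.
w, V = np.linalg.eig(P.T)
pi = np.real(V[:, np.argmin(np.abs(w - 1.0))])
pi = pi / pi.sum()

# Iterating any initial distribution x -> x P converges to pi.
x = np.array([1.0, 0.0, 0.0])
for _ in range(200):
    x = x @ P
print(np.allclose(x, pi))  # True: unique limiting distribution
```

Starting from a different initial distribution, such as [0, 0, 1], yields the same limit, which is exactly the uniqueness guaranteed by the Perron–Frobenius theorem.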
[1] Gallager, R. G. Stochastic Processes: Theory for Applications. Cambridge, UK: Cambridge University Press, 2013.

[2] Horn, R. A., and C. R. Johnson. Matrix Analysis. Cambridge, UK: Cambridge University Press, 1985.

[3] Wielandt, H. "Unzerlegbare, nicht negative Matrizen." Mathematische Zeitschrift. Vol. 52, 1950, pp. 642–648.
Introduced in R2017b