Consider a stochastic process $X_1, X_2, X_3, \ldots$ in which the state space is discrete. This is a Markov chain if the probability that $X_{n+1}$ takes a particular value depends only on the value of $X_n$ and not on the values of $X_1, X_2, \ldots, X_{n-1}$. (This definition can be adapted so as to apply to a stochastic process with a continuous state space, or to a more general stochastic process $\{X(t),\, t \in T\}$, to give what is called a Markov process.) Examples include random walks and problems in queuing theory.
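As an illustration, the following is a minimal Python sketch of a simple random walk, one of the examples named above (the function name and parameters are illustrative, not from any particular library). The Markov property is visible in the loop: the next position is computed from the current position alone, with no reference to the earlier path.

```python
import random

def simple_random_walk(n_steps, p=0.5, start=0):
    """Simulate a simple random walk on the integers.

    At each step the walker moves +1 with probability p and -1 otherwise.
    The next position depends only on the current position (the Markov
    property), not on how that position was reached.
    """
    path = [start]
    for _ in range(n_steps):
        step = 1 if random.random() < p else -1
        path.append(path[-1] + step)  # X_{n+1} depends only on X_n
    return path

print(simple_random_walk(10))
```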
Most Markov chains studied in practice are homogeneous, meaning that the probability that $X_{n+1} = j$ given that $X_n = i$, denoted by $p_{ij}$, does not depend on $n$. In that case, if there are $N$ states, these values $p_{ij}$ are called the transition probabilities and form the transition matrix $[p_{ij}]$, an $N \times N$ row stochastic matrix (each row sums to 1, since from state $i$ the chain must move to some state $j$). See communicating class, recurrent, stationary distribution.
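A short Python sketch of a homogeneous chain follows; the three-state matrix is purely hypothetical, chosen only to illustrate the definitions. Row $i$ of the matrix gives the distribution of $X_{n+1}$ whenever $X_n = i$, independent of $n$, and the assertion checks the row stochastic property.

```python
import numpy as np

# Hypothetical transition matrix for a 3-state homogeneous chain:
# entry P[i, j] is p_ij, the probability of moving from state i to state j.
P = np.array([[0.7, 0.2, 0.1],
              [0.3, 0.4, 0.3],
              [0.2, 0.4, 0.4]])

# A transition matrix is row stochastic: every row sums to 1.
assert np.allclose(P.sum(axis=1), 1.0)

def simulate(P, start, n_steps, seed=0):
    """Simulate a homogeneous Markov chain: whenever X_n = i, the next
    state X_{n+1} is drawn from row i of P, regardless of n."""
    rng = np.random.default_rng(seed)
    state, states = start, [start]
    for _ in range(n_steps):
        state = rng.choice(len(P), p=P[state])
        states.append(state)
    return states

print(simulate(P, start=0, n_steps=10))
```

Because the chain is homogeneous, the same matrix governs every step, so the $n$-step transition probabilities are given by the matrix power $P^n$ (in the sketch above, `np.linalg.matrix_power(P, n)`).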