A sequence of random variables x1,…, xn,… converges in probability to a random variable x if for every positive number ε the probability of the (Euclidean) distance between xn and x exceeding ε converges to zero as n tends to infinity, so

lim_{n→∞} P[|xn − x| ≥ ε] = 0

for every ε > 0. This means that, if we consider the sequence of probabilities Pn = P[|xn − x| ≥ ε], then for any threshold δ > 0 there is an index n0 such that Pn < δ for all n ≥ n0; in other words, beyond n0 every probability in the sequence is smaller than any prescribed bound. In particular, the limit x can be a constant rather than a random variable. Convergence in probability implies convergence in distribution (the converse does not hold, in general).
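The definition can be illustrated numerically. A minimal sketch, assuming we take xn to be the sample mean of n Uniform(0, 1) draws and x = 0.5 (its expectation), so that convergence in probability is just the weak law of large numbers; the function name, ε = 0.1, and the Monte Carlo setup are illustrative choices, not part of the definition:

```python
import random

def prob_deviation(n, eps=0.1, trials=2000, seed=0):
    """Monte Carlo estimate of Pn = P[|xn - x| >= eps], where xn is the
    sample mean of n Uniform(0, 1) draws and x = 0.5 is the true mean."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        mean_n = sum(rng.random() for _ in range(n)) / n
        if abs(mean_n - 0.5) >= eps:
            hits += 1
    return hits / trials

# The estimated Pn should shrink toward 0 as n grows, as the
# definition of convergence in probability requires.
for n in (10, 100, 1000):
    print(n, prob_deviation(n))
```

Running the loop shows the estimated Pn decreasing with n, which is exactly the statement that Pn → 0 for this fixed ε.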