Markov chain
Markov chain In statistics, a sequence of observations in which the probability of any member occurring, conditional on all preceding members, equals the probability of that member occurring conditional only on the immediately preceding member.
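Stated symbolically (notation not in the original entry), this defining condition is the Markov property: for a sequence of random variables X1, X2, …,

\[
P(X_{n+1} = x \mid X_1 = x_1,\, X_2 = x_2,\, \ldots,\, X_n = x_n) \;=\; P(X_{n+1} = x \mid X_n = x_n).
\]

In words, the conditional distribution of the next observation depends on the past only through the present state.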