Statistical Glossary
Markov Chain:
A Markov chain is a sequence of random values x_1, x_2, … in which the probability distribution of a particular value x_i depends only on the immediately preceding value x_{i-1}. For this reason, a Markov chain is a special case of a "memoryless" random process.
The index i usually corresponds to discrete values of time or some other one-dimensional argument; the values themselves can be of any nature and dimension. For example, the random value could be a behavior category (physical interaction, no physical interaction), and the sequence would be a set of simulated behaviors in which the probability of each behavior depends only on the prior behavior, as in the sketch below.
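To make that example concrete, here is a minimal Python sketch of such a two-state chain. The transition probabilities and the helper name `simulate` are hypothetical choices for illustration, not values from the glossary; the key point is that each new value is drawn by looking only at the current one.

```python
import random

# Hypothetical transition probabilities (illustrative values only):
# each row gives P(next behavior | current behavior).
P = {
    "physical interaction":    {"physical interaction": 0.6, "no physical interaction": 0.4},
    "no physical interaction": {"physical interaction": 0.2, "no physical interaction": 0.8},
}

def simulate(start, n):
    """Simulate n steps of the chain; each step depends only on the current state."""
    seq = [start]
    for _ in range(n - 1):
        probs = P[seq[-1]]   # Markov property: consult only the prior value
        r = random.random()
        cum = 0.0
        for state, p in probs.items():
            cum += p
            if r < cum:
                seq.append(state)
                break
    return seq

print(simulate("no physical interaction", 10))
```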
This property of "memorylessness" has fostered the development of a rich probabilistic theory of Markov chains and of related statistical methods. If a Markov chain is an adequate model for your data, you can therefore draw on a wide variety of well-developed mathematical and statistical tools.
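As one illustration of those tools, the sketch below computes the stationary distribution of the same hypothetical two-state chain, i.e. the long-run fraction of time spent in each state. The distribution pi satisfies pi P = pi, so it can be read off as the left eigenvector of the transition matrix for eigenvalue 1; the matrix values are again assumed for illustration.

```python
import numpy as np

# Transition matrix for the hypothetical two-state chain above
# (row i gives the probabilities of moving out of state i).
P = np.array([[0.6, 0.4],
              [0.2, 0.8]])

# The stationary distribution pi satisfies pi P = pi: it is the left
# eigenvector of P for eigenvalue 1, normalized to sum to 1.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.isclose(eigvals, 1.0)][:, 0])
pi /= pi.sum()

print(pi)  # long-run fraction of time in each state; here [1/3, 2/3]
```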
A Markov process extends the concept of a Markov chain to the case of a continuous argument (usually time); Markov fields extend Markov processes further, to multidimensional space.