Markov chain

From WikiMD's Wellness Encyclopedia

A Markov chain is a stochastic process with the Markov property. The term "Markov chain" refers to the sequence of random variables such a process moves through, with the Markov property restricting serial dependence to adjacent periods only (hence "chain"). It is named after the Russian mathematician Andrey Markov.

Definition

A Markov chain is a sequence of random variables X1, X2, X3, ... with the Markov property, namely that the probability of moving to the next state depends only on the present state and not on the previous states. Formally, P(Xn+1 = x | X1 = x1, ..., Xn = xn) = P(Xn+1 = x | Xn = xn).
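The definition above can be sketched in a few lines of code. The following is a minimal illustration, not part of the original article: a hypothetical two-state "weather" chain whose transition probabilities are made up for demonstration. The next state is sampled using only the current state, which is exactly the Markov property.

```python
import random

# Hypothetical two-state chain; the probabilities below are
# illustrative assumptions, not taken from the article.
states = ["sunny", "rainy"]
transition = {
    "sunny": {"sunny": 0.9, "rainy": 0.1},
    "rainy": {"sunny": 0.5, "rainy": 0.5},
}

def step(current, rng):
    """Sample the next state given only the current state (Markov property)."""
    r = rng.random()
    cumulative = 0.0
    for nxt, p in transition[current].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding in the cumulative sum

def simulate(start, n, seed=0):
    """Generate a realization X0, X1, ..., Xn of the chain."""
    rng = random.Random(seed)
    chain = [start]
    for _ in range(n):
        chain.append(step(chain[-1], rng))
    return chain

print(simulate("sunny", 10))
```

Note that the simulation never inspects anything but the last element of the chain: the past enters only through the present state.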

Properties

The properties of Markov chains are studied in topics such as stochastic processes, random walks, ergodic theory, and statistical mechanics. They serve as mathematical models of systems and processes in many fields.

Applications

Markov chains are used in fields such as physics, chemistry, economics, the social sciences, and engineering. They are particularly useful for studying systems that evolve through a sequence of linked events, where each event depends only on the state reached in the previous one.


Contributors: Prab R. Tumpati, MD