(n) a simple stochastic process in which the distribution of future states depends only on the present state and not on the path by which the present state was reached. Syn. Markoff process
The Collaborative International Dictionary of English (GCIDE) v.0.53
n. [ after A. A. Markov, Russian mathematician, b. 1856, d. 1922. ] (Statistics) A random process (Markov process) in which the probabilities of discrete states in a series depend only on the properties of the immediately preceding state or the next preceding state, independent of the path by which the preceding state was reached. It differs from the more general Markov process in that the states of a Markov chain are discrete rather than continuous. Certain physical processes, such as diffusion of a molecule in a fluid, are modelled as a Markov chain. See also random walk. [ Also spelled Markoff chain. ] [ PJC ]
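The discrete-state behaviour described in this entry can be sketched in Python. This is an illustrative example only; the state names and transition probabilities are invented for the sketch, not drawn from the definition above.

```python
import random

# Illustrative two-state chain; states and probabilities are invented.
# Each row lists the transition probabilities out of one state. The next
# state depends only on the current state, never on how it was reached.
TRANSITIONS = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state, rng):
    """Draw the next state using only the current state's transition row."""
    r = rng.random()
    cumulative = 0.0
    for nxt, p in TRANSITIONS[state]:
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding in the cumulative sum

def simulate(start, n_steps, seed=0):
    """Simulate a path of n_steps transitions starting from `start`."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(n_steps):
        path.append(step(path[-1], rng))
    return path

print(simulate("sunny", 5, seed=1))
```

Because each transition row sums to 1 and `step` consults only the current state, the simulated path satisfies the memoryless property that defines a Markov chain.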
n. [ after A. A. Markov, Russian mathematician, b. 1856, d. 1922. ] (Statistics) a random process in which the probabilities of states in a series depend only on the properties of the immediately preceding state or the next preceding state, independent of the path by which the preceding state was reached. It is distinguished from a Markov chain in that the states of a Markov process may be continuous as well as discrete. [ Also spelled Markoff process. ] [ PJC ]
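The continuous-state case mentioned in this entry can be sketched as a discretized diffusion. This is a minimal illustrative sketch, assuming a simple Brownian-motion-like update with invented parameter values; the position is a real number, and each increment depends only on the current position, not on the history.

```python
import random

def diffuse(x0, n_steps, dt=0.01, sigma=1.0, seed=0):
    """Simulate a continuous-state Markov process: a discretized diffusion.

    Each update adds a Gaussian increment scaled by sqrt(dt), so the next
    position depends only on the current position (the Markov property),
    and the state space is continuous rather than discrete.
    """
    rng = random.Random(seed)
    x = x0
    path = [x]
    for _ in range(n_steps):
        x += sigma * (dt ** 0.5) * rng.gauss(0.0, 1.0)  # Markov update
        path.append(x)
    return path

path = diffuse(0.0, 100)
```

Unlike the discrete-state Markov chain of the previous entry, the state here ranges over the real line, which is what distinguishes a general Markov process from a Markov chain.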