Markov chain
English Thesaurus
1. a Markov process in which the parameter takes discrete time values (noun.process)
hypernym: Markoff process, Markov process
definition: a simple stochastic process in which the distribution of future states depends only on the present state and not on how it arrived in the present state (noun.process)
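The defining property above (the next state's distribution depends only on the present state) can be illustrated with a minimal sketch, not part of the dictionary entry itself: a two-state chain over hypothetical "sunny"/"rainy" states with assumed transition probabilities, stepped forward in discrete time.

```python
import random

# Hypothetical transition probabilities: each row gives the distribution
# of the next state conditioned ONLY on the current state (Markov property).
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def simulate(start, n_steps, seed=0):
    """Run the chain for n_steps discrete time steps from `start`.

    At each step the next state is sampled from TRANSITIONS[current];
    no earlier history is consulted.
    """
    rng = random.Random(seed)
    chain = [start]
    for _ in range(n_steps):
        row = TRANSITIONS[chain[-1]]
        next_state = rng.choices(list(row), weights=list(row.values()))[0]
        chain.append(next_state)
    return chain

if __name__ == "__main__":
    print(simulate("sunny", 10))
```

Because sampling reads only `chain[-1]`, how the process arrived at the present state has no effect on future states, matching the definition.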