Word Synonyms

markov chain

English Thesaurus

1. a Markov process for which the parameter is discrete time values (noun.process)
   Related terms: Markoff process, Markov process
   Definition (Markov process): a simple stochastic process in which the distribution of future states depends only on the present state and not on how it arrived in the present state (noun.process)
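To illustrate the definition above, here is a minimal sketch of a Markov chain in Python. The state names and transition probabilities are made-up assumptions, not from the source; the point is only that each next state is drawn from a distribution attached to the present state, at discrete time steps.

import random

# Hypothetical two-state chain; the probabilities are illustrative assumptions.
transitions = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state):
    # The next state depends only on the present state (the Markov property);
    # earlier states are never consulted.
    next_states = list(transitions[state].keys())
    probs = list(transitions[state].values())
    return random.choices(next_states, weights=probs)[0]

state = "sunny"
for t in range(5):  # discrete time steps, matching "discrete time values"
    state = step(state)
    print(t, state)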
