English
Detailed Synonyms for Markov chain in English
Markov chain:
Markov chain [the ~] Nomen
the Markov chain; the Markoff chain
– a Markov process for which the parameter is discrete time values ¹
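The definition above (a Markov process whose parameter runs over discrete time values 1, 2, 3, …) can be illustrated with a minimal sketch; the two-state chain and its transition probabilities below are hypothetical examples, not part of the dictionary entry:

```python
import random

# Hypothetical two-state Markov chain: the next state depends only on
# the current state, and time advances in discrete steps 1, 2, 3, ...
transitions = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state):
    """Sample the next state from the current state's transition row."""
    states, probs = zip(*transitions[state])
    return random.choices(states, weights=probs)[0]

def simulate(start, n_steps, seed=0):
    """Run the chain for n_steps discrete time steps from `start`."""
    random.seed(seed)
    path = [start]
    for _ in range(n_steps):
        path.append(step(path[-1]))
    return path

print(simulate("sunny", 5))
```

Because the parameter is discrete, the chain is fully described by its per-step transition probabilities; a Markov process with a continuous time parameter would instead need transition rates.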