Overview
English synonyms:
  1. Markov chain:



Detailed Synonyms for Markov chain in English

Markov chain:

Markov chain [the ~] noun

  1. the Markov chain; the Markoff chain
    – a Markov process for which the parameter is discrete time values

Related Definitions for "Markov chain":

  1. a Markov process for which the parameter is discrete time values
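The definition above can be made concrete with a short simulation: at each discrete time step (1, 2, 3, ...), the chain moves to a new state with probabilities that depend only on the current state. The following minimal sketch uses a made-up two-state "weather" chain as an illustration; the states and transition probabilities are assumptions for the example, not part of the dictionary entry.

```python
import random

# Hypothetical two-state chain: each row lists (next_state, probability)
# pairs for one current state; each row's probabilities sum to 1.
transitions = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state):
    """Sample the next state from the current state's transition row."""
    r = random.random()
    cumulative = 0.0
    for nxt, p in transitions[state]:
        cumulative += p
        if r < cumulative:
            return nxt
    return transitions[state][-1][0]  # guard against rounding error

def simulate(start, n_steps):
    """Run the chain for n_steps discrete time values and return the path."""
    path = [start]
    for _ in range(n_steps):
        path.append(step(path[-1]))
    return path

print(simulate("sunny", 5))
```

The key property on display is the "Markov" part of the definition: `step` looks only at the current state, never at the earlier history of the path.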

Related Synonyms for Markov chain