markov_chain
1. [noun] (simulation) a Markov process for which the parameter is discrete time values
Synonyms: markoff_chain
Related terms: markoff_process
Similar spelling: markoff_chain, Markovich, markovian, mark_of_Cain
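The definition above, a Markov process indexed by discrete time steps, can be sketched as a minimal simulation. The weather states and transition probabilities below are illustrative assumptions, not part of the entry:

```python
import random

def simulate_markov_chain(transitions, start, steps, rng=None):
    """Walk a discrete-time Markov chain.

    `transitions` maps each state to a list of (next_state, probability)
    pairs; the next state depends only on the current state, which is the
    defining Markov property.
    """
    rng = rng or random.Random(0)
    state = start
    path = [state]
    for _ in range(steps):
        r = rng.random()
        cumulative = 0.0
        for nxt, p in transitions[state]:
            cumulative += p
            if r < cumulative:
                state = nxt
                break
        path.append(state)
    return path

# Hypothetical two-state chain for illustration only.
weather = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}
print(simulate_markov_chain(weather, "sunny", 5))
```

Because the chain is discrete in time, each call to the loop body advances exactly one step, and the trajectory is fully determined by the current state plus one random draw.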