markoff_process
1. [noun] (simulation) a simple stochastic process in which the distribution of future states depends only on the present state and not on how it arrived in the present state
Synonyms: markov_process
Related terms: stochastic_process markoff_chain
Similar spelling: markov_process markoff_chain
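The definition above can be illustrated with a short simulation sketch: at each step the next state is sampled from a distribution that depends only on the current state, never on the earlier history. The weather states and transition probabilities below are illustrative assumptions, not part of the definition.

```python
import random

# Illustrative transition probabilities (assumed for this sketch):
# each row depends only on the present state, which is the Markov property.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state, rng):
    """Sample the next state using only the present state."""
    r = rng.random()
    cumulative = 0.0
    for nxt, p in TRANSITIONS[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding

def simulate(start, n, seed=0):
    """Run the chain for n steps and return the visited states."""
    rng = random.Random(seed)
    state = start
    path = [state]
    for _ in range(n):
        state = step(state, rng)
        path.append(state)
    return path

print(simulate("sunny", 5, seed=1))
```

Note that `simulate` keeps the full path only for display; the sampling in `step` never consults it, which is exactly the "depends only on the present state" condition.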