markov_process
1. | [ noun ] (simulation) a simple stochastic process in which the distribution of future states depends only on the present state and not on how the process arrived at the present state
Synonyms: | markoff_process
Related terms: | stochastic_process, markoff_chain
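The defining (Markov) property in the gloss above can be sketched with a minimal simulation. This is an illustrative example only; the two weather states and their transition probabilities are invented for the sketch, not part of the definition.

```python
import random

# Hypothetical two-state chain: the probability of the next state
# depends only on the current state (the Markov property), never
# on how the chain arrived there.
TRANSITIONS = {
    "sunny": [("sunny", 0.9), ("rainy", 0.1)],
    "rainy": [("sunny", 0.5), ("rainy", 0.5)],
}

def step(state):
    """Sample the next state given only the current state."""
    states, probs = zip(*TRANSITIONS[state])
    return random.choices(states, weights=probs)[0]

def simulate(start, n):
    """Generate a trajectory of n transitions from `start`.

    Note that the loop passes only path[-1] to step(): earlier
    history has no influence on the next draw.
    """
    path = [start]
    for _ in range(n):
        path.append(step(path[-1]))
    return path

print(simulate("sunny", 5))
```

A markoff_chain (listed under related terms) is the discrete-state case sketched here; the general markov_process also covers continuous state spaces.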