markoff_chain
1. [ noun ] (simulation) a Markov process whose parameter takes discrete time values (see the sketch after this entry)
Synonyms: markov_chain
Related terms: markoff_process
Similar spelling: markov_chain
  mark_of_Cain
  markoff_process
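
Illustratively, the discrete-time evolution named in the definition can be sketched in a few lines of Python. The weather states and transition probabilities below are invented for the example and are not part of the definition.

    import random

    # A Markov chain moves between states at discrete time steps; the
    # next state depends only on the current state. The states and
    # probabilities here are made-up illustrative values.
    TRANSITIONS = {
        "sunny": [("sunny", 0.8), ("rainy", 0.2)],
        "rainy": [("sunny", 0.4), ("rainy", 0.6)],
    }

    def step(state):
        # Sample the next state from the current state's transition row.
        states, weights = zip(*TRANSITIONS[state])
        return random.choices(states, weights=weights)[0]

    state = "sunny"
    for t in range(10):  # ten discrete time values (the "parameter")
        print(t, state)
        state = step(state)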