Markoff chain Meaning, Definition & Usage

  1. noun a Markov process in which the parameter takes only discrete time values
    Synonym: Markov chain.

WordNet
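
The defining feature in the gloss above is that time advances in discrete steps and the next state depends only on the current one. A minimal sketch in Python, assuming a hypothetical two-state weather model (the states, probabilities, and the `simulate` helper are illustrative only, not part of the WordNet entry):

```python
import random

# Hypothetical two-state chain: the parameter is discrete time
# (steps 0, 1, 2, ...) and the next state depends only on the current state.
transition_probs = {
    "Sunny": {"Sunny": 0.8, "Rainy": 0.2},
    "Rainy": {"Sunny": 0.4, "Rainy": 0.6},
}

def simulate(start, steps):
    """Walk the chain for a fixed number of discrete time steps."""
    state, path = start, [start]
    for _ in range(steps):
        nxt = random.choices(
            population=list(transition_probs[state]),
            weights=list(transition_probs[state].values()),
        )[0]
        path.append(nxt)
        state = nxt
    return path

print(simulate("Sunny", 5))  # e.g. ['Sunny', 'Sunny', 'Rainy', 'Rainy', 'Sunny', 'Sunny']
```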