Markov chain: Meaning, Definition & Usage

  1. noun a Markov process for which the parameter takes discrete time values.
    Synonym: Markoff chain.

WordNet
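
As an illustration of the definition, here is a minimal sketch of a discrete-time Markov chain in Python. The two "weather" states and their transition probabilities are made up for the example and are not part of the WordNet entry; the point is only that the next state depends solely on the current state, with time advancing in discrete steps.

```python
import random

# Illustrative two-state Markov chain: the next state depends only on the
# current state, and the time parameter takes discrete values (0, 1, 2, ...).
# The states and transition probabilities below are assumptions for the sketch.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state):
    """Draw the next state from the current state's transition probabilities."""
    r = random.random()
    cumulative = 0.0
    for next_state, p in TRANSITIONS[state].items():
        cumulative += p
        if r < cumulative:
            return next_state
    return next_state  # guard against floating-point rounding

def simulate(start, steps):
    """Return a sample path of the chain over `steps` discrete time steps."""
    path = [start]
    for _ in range(steps):
        path.append(step(path[-1]))
    return path

if __name__ == "__main__":
    print(simulate("sunny", 10))
```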