Canadian French Meaning of markoff chain

Chaîne de Markov

Other Canadian French words related to Chaîne de Markov

No synonyms or antonyms found

Definitions and Meaning of markoff chain in English

WordNet

markoff chain (n)

a Markov process for which the parameter is discrete time values
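As a brief illustration of this definition, the Python sketch below simulates a chain whose time parameter advances in discrete steps, with each next state depending only on the current one. The two weather states, their transition probabilities, and the helper names (step, simulate) are invented for the example and are not part of the dictionary entry.

import random

# Transition table for a hypothetical two-state chain: each row lists
# (next state, probability) pairs that sum to 1.
transitions = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state):
    # Draw the next state using the current state's transition row.
    states, probs = zip(*transitions[state])
    return random.choices(states, weights=probs, k=1)[0]

def simulate(start, n_steps):
    # Advance the chain through n_steps discrete time values.
    path = [start]
    for _ in range(n_steps):
        path.append(step(path[-1]))
    return path

print(simulate("sunny", 10))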

FAQs About the word markoff chain

What is the Canadian French meaning of markoff chain?

The Canadian French meaning of markoff chain is Chaîne de Markov.

What is the definition of markoff chain in English?

A markoff chain is a Markov process for which the parameter is discrete time values.

What are the synonyms and antonyms of markoff chain?

No synonyms or antonyms found.
