Markov chain
English
Noun
Markov chain (plural Markov chains)
- (probability theory) A discrete-time stochastic process with the Markov property.
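The Markov property means the distribution of the next state depends only on the current state, not on the path taken to reach it. A minimal sketch in Python (the two-state "weather" chain and its transition probabilities are a hypothetical example, not part of the entry):

```python
import random

# Hypothetical two-state chain: each state maps to (next_state, probability) pairs.
transitions = {
    "sunny": [("sunny", 0.9), ("rainy", 0.1)],
    "rainy": [("sunny", 0.5), ("rainy", 0.5)],
}

def step(state):
    """Sample the next state; only the current state matters (Markov property)."""
    states, probs = zip(*transitions[state])
    return random.choices(states, weights=probs)[0]

def walk(state, n):
    """Generate a realization of the chain: the start state plus n sampled steps."""
    path = [state]
    for _ in range(n):
        state = step(state)
        path.append(state)
    return path

print(walk("sunny", 5))
```

Because each step consults only the current state, the chain is a discrete-time stochastic process with the Markov property in the sense of the definition above.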
This article is issued from Wiktionary. The text is licensed under Creative Commons Attribution-ShareAlike. Additional terms may apply for the media files.