Word | Markov process |
Definition | Markov process [mär´kȯf prä·sǝs] MATHEMATICS A stochastic process in which, for a series of random events, the probability of each event depends only on the immediately preceding outcome. |
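The defining property above (the next state depends only on the current one, not on the earlier history) can be illustrated with a small simulation. The two-state "weather" chain and its transition probabilities below are hypothetical, chosen purely for illustration; they are not part of the dictionary entry.

```python
import random

# Hypothetical two-state Markov chain: each row lists the possible
# next states and their probabilities, conditioned ONLY on the
# current state -- no earlier history is consulted.
TRANSITIONS = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state, rng):
    """Sample the next state given only the current state."""
    states, weights = zip(*TRANSITIONS[state])
    return rng.choices(states, weights=weights, k=1)[0]

def simulate(start, n, seed=0):
    """Generate a path of n transitions starting from `start`."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(n):
        path.append(step(path[-1], rng))
    return path

path = simulate("sunny", 5)
print(path)  # a sequence of 6 states, e.g. starting with "sunny"
```

Because `step` receives only the current state, the simulation has the memoryless structure the definition describes: conditioning on more of the past would not change the distribution of the next state.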