Word: Markov chain
Definitions
Markov chain

Mathematics
  • Consider a stochastic process X₁, X₂, X₃, … in which the state space is discrete. This is a Markov chain if the probability that Xₙ₊₁ takes a particular value depends only on the value of Xₙ and not on the values of X₁, X₂, …, Xₙ₋₁. (This definition can be adapted so as to apply to a stochastic process with a continuous state space, or to a more general stochastic process {X(t), t ∈ T}, to give what is called a Markov process.) Examples include random walks and problems in queuing theory.

    Most Markov chains studied in practice are homogeneous, meaning that the probability that Xₙ₊₁ = j given that Xₙ = i, denoted by pᵢⱼ, does not depend on n. In that case, if there are N states, the values pᵢⱼ are called the transition probabilities and form the transition matrix [pᵢⱼ], an N × N row stochastic matrix. See communicating class, recurrent, stationary distribution.
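
    As a rough illustration of both definitions, the following is a minimal
    sketch in Python; the 3 × 3 transition matrix and its values are invented
    for demonstration and are not taken from this entry.

        import numpy as np

        # Invented row-stochastic transition matrix P = [pij] for N = 3 states.
        P = np.array([
            [0.5, 0.3, 0.2],   # transition probabilities out of state 0
            [0.1, 0.6, 0.3],   # out of state 1
            [0.2, 0.2, 0.6],   # out of state 2
        ])
        assert np.allclose(P.sum(axis=1), 1.0)  # each row sums to 1

        rng = np.random.default_rng(0)
        state, path = 0, [0]
        for _ in range(10):
            # The next state depends only on the current state (Markov property).
            state = rng.choice(3, p=P[state])
            path.append(state)
        print(path)  # a length-11 sample path through states 0, 1, 2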


Statistics
  • See Markov process.


Computing
  • A sequence of discrete random variables such that each member of the sequence is probabilistically dependent only on its predecessor. An ergodic Markov chain has the property that its value at any time has the same statistical properties as its value at any other time.
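
    As a hedged sketch of that time-invariance (Python, with an invented
    ergodic 3-state matrix): for an ergodic chain, the distribution of the
    chain's value converges to one stationary distribution regardless of the
    starting state.

        import numpy as np

        P = np.array([[0.5, 0.3, 0.2],   # invented ergodic transition matrix
                      [0.1, 0.6, 0.3],
                      [0.2, 0.2, 0.6]])

        for start in range(3):
            dist = np.zeros(3)
            dist[start] = 1.0        # start deterministically in one state
            for _ in range(100):
                dist = dist @ P      # propagate the distribution one step
            print(start, np.round(dist, 4))  # same limit for every start

    All three starting states print the same vector: the stationary
    distribution π, which satisfies πP = π.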


Geology and Earth Sciences
  • In statistics, a sequence of observations in which the probability that a given member occurs, conditional on all the preceding members, equals the probability that it occurs conditional only on the immediately preceding member.
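
    Written symbolically, in the notation of the Mathematics entry above, this
    condition is

        P(Xₙ₊₁ = x | X₁ = x₁, …, Xₙ = xₙ) = P(Xₙ₊₁ = x | Xₙ = xₙ)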


Economics
  • A stochastic process described by a finite number of states and known probabilities of moving from any given state to the other states. These probabilities depend only on the current state, not on the process's earlier history.
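
    As a sketch of how such a process is specified (Python, with a hypothetical
    two-state economy whose probabilities are invented for illustration): since
    the next state depends only on the current one, the k-step transition
    probabilities are the entries of the matrix power P^k.

        import numpy as np

        # Hypothetical two-state economy: 0 = expansion, 1 = recession.
        P = np.array([
            [0.9, 0.1],   # from expansion: stay 0.9, enter recession 0.1
            [0.5, 0.5],   # from recession: recover 0.5, stay 0.5
        ])

        # k-step transition probabilities via the k-th matrix power.
        P5 = np.linalg.matrix_power(P, 5)
        print(P5[0, 1])  # chance of recession five periods after an expansion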

