
Topic: Markov equilibrium
Replies: 4   Last Post: Apr 23, 2013 6:21 PM

Paul

Posts: 26
Registered: 1/3/11
Markov equilibrium
Posted: Apr 22, 2013 6:23 PM

Some, but not all, Markov processes have a steady-state distribution: a probability of being in each state that is left unchanged by a transition (a distribution pi with pi = pi*P, where P is the transition matrix). If the probability distribution for the current state equals the steady-state distribution, then so does the distribution after the next transition, and after every transition that follows. In some cases you reach steady state after a finite number of transitions. (Contrary to the current zombie craze, once you're dead, your probability of staying dead hovers around 1.0.) Sometimes you only approach steady state asymptotically. And sometimes there is no steady state at all.
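Here is a minimal sketch of the idea in Python (the 3-state transition matrix P and the use of numpy are my own illustration, not anything from the original post): start from some distribution, apply the transition matrix repeatedly, and stop once the distribution no longer changes.

import numpy as np

# Hypothetical 3-state transition matrix; each row sums to 1.
P = np.array([[0.9, 0.1, 0.0],
              [0.2, 0.7, 0.1],
              [0.0, 0.3, 0.7]])

# Start with all probability on state 0, then apply P repeatedly.
pi = np.array([1.0, 0.0, 0.0])
for _ in range(10_000):
    nxt = pi @ P                # distribution after one more transition
    if np.allclose(nxt, pi):    # pi is (numerically) unchanged: steady state
        break
    pi = nxt

print(pi)  # the steady-state distribution, if the chain has one

Power iteration like this only finds a steady state when one exists and the chain actually converges to it; otherwise the loop simply exhausts its iteration budget.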

Paul



