> As I understand it, a Markov process is a probabilistic
> sequence of state transitions. It's completely specified
> by the state transition matrix.
>
> However, I sometimes see references to the
> equilibrium condition, whether it's been reached,
> etc. I don't get this, can anyone elaborate?
> Thanks
In the absence of responses that might use the terminology better -
You start with some frequencies over the states, a "starting distribution".
You apply the transition probabilities and obtain new frequencies. Do it again. Do it again. ...
If the frequencies stabilize, the distribution they settle into is called an equilibrium (or stationary) condition.
It is possible that the equilibrium will depend on the starting distribution.
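The iteration described above is easy to try directly. Here is a minimal sketch in plain Python, using a made-up 2-state transition matrix (the matrix and tolerance are illustrative assumptions, not from the original post):

```python
# Hypothetical 2-state transition matrix: row i holds the
# probabilities of moving from state i to each state.
P = [[0.9, 0.1],
     [0.5, 0.5]]

# Starting frequencies: everything in state 0.
dist = [1.0, 0.0]

def step(dist, P):
    # One application of the transition probabilities:
    # new_j = sum_i dist_i * P[i][j]
    n = len(dist)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

# "Do it again. Do it again." until the frequencies stop changing.
for _ in range(1000):
    new = step(dist, P)
    if max(abs(a - b) for a, b in zip(new, dist)) < 1e-12:
        break
    dist = new

print(dist)  # the equilibrium frequencies, if they stabilized
```

For this particular matrix the frequencies do stabilize, at roughly [0.833, 0.167]; applying the transition probabilities once more leaves them unchanged, which is exactly the equilibrium condition the question asks about.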