Some, but not all, Markov processes have a steady-state distribution (a probability of being in each state). If the probability distribution over the current state equals the steady-state distribution, so will the distribution after the next transition (and all future transitions). In some cases you reach steady state after a finite number of transitions. (Contrary to the current zombie craze, once you're dead, your probability of staying dead hovers around 1.0.) Sometimes you approach steady state asymptotically. Sometimes there is no steady state.
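As a rough sketch of the asymptotic case, repeatedly applying the transition matrix to a starting distribution (power iteration) converges toward the steady-state distribution when one exists. The two-state chain below is made up for illustration:

```python
import numpy as np

# Hypothetical 2-state transition matrix; each row sums to 1.
# Row i, column j gives the probability of moving from state i to state j.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

d = np.array([1.0, 0.0])   # start entirely in state 0
for _ in range(100):       # one transition: d_next = d @ P
    d = d @ P

# After many transitions, d approaches the steady-state distribution,
# which by definition is unchanged by one more transition: d == d @ P.
print(d)                   # ~[0.8333, 0.1667]

# The "zombie" case: state 1 (dead) is absorbing, so the chain hits
# its steady state exactly once everything has been absorbed.
A = np.array([[0.99, 0.01],
              [0.00, 1.00]])
```

Not every chain behaves this way: a chain that deterministically flips between two states, for instance, oscillates forever and the distribution never settles.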