Markov chain matrix

A Markov chain can be described compactly and efficiently using matrices and vectors.

For a vector of probabilities $\mathbf{x}_t$ over the possible states (which can be a prior distribution), the distribution at the next step is

$$\mathbf{x}_{t+1} = P\,\mathbf{x}_t$$

where $P$ is the transition matrix, with entries

$$P_{ij} = \Pr(X_{t+1} = i \mid X_t = j).$$

Note that each column of the transition matrix must sum to 1: given a current state, the probabilities of all possible next states must sum to 1 by the probability axioms.
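As a minimal sketch of the idea above (the 3-state matrix here is a made-up example, not taken from the note), each column of $P$ sums to 1, and one step of the chain is a single matrix-vector product:

```python
import numpy as np

# Hypothetical 3-state transition matrix, column-stochastic convention:
# P[i, j] = probability of moving from state j to state i,
# so each COLUMN sums to 1.
P = np.array([
    [0.90, 0.20, 0.10],
    [0.05, 0.70, 0.30],
    [0.05, 0.10, 0.60],
])
assert np.allclose(P.sum(axis=0), 1.0)  # columns sum to 1

# Prior distribution over the states (start in state 0 with certainty).
x = np.array([1.0, 0.0, 0.0])

# One step of the chain: x_{t+1} = P x_t
x_next = P @ x
print(x_next)        # → [0.9  0.05 0.05] (the first column of P)
print(x_next.sum())  # → 1.0 (still a valid probability distribution)
```

Because the columns sum to 1, multiplying any probability vector by $P$ yields another probability vector.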

