Here's a small problem I've been asked to do, but I don't quite get it.
The definition of a Markov chain transition probability is:
$P(x_{t+1}=j \mid x_t=i) = P_{ij}$, where there are $n$ possible states.
I am asked to prove that $\sum_{j=1}^n P_{ij} = 1$ for all $i$.
I just don't see how to prove this, as it seems like a trivial property: the probabilities of all possible events should sum to $1$. Any help? Thank you.
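My attempt so far (assuming the events $\{x_{t+1}=j\}$ for $j=1,\dots,n$ are disjoint and exhaust the state space, which I think follows from the definition of the chain):

$$
\sum_{j=1}^n P_{ij}
= \sum_{j=1}^n P(x_{t+1}=j \mid x_t=i)
= P\!\left(\bigcup_{j=1}^n \{x_{t+1}=j\} \,\middle|\, x_t=i\right)
= P(x_{t+1} \in \{1,\dots,n\} \mid x_t=i)
= 1,
$$

where the second equality uses finite additivity of the conditional probability measure $P(\,\cdot \mid x_t=i)$ over disjoint events. Is this a valid proof, or am I just restating the thing I'm supposed to show?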