
A small problem I was asked to do, but that I don't get.

The definition of a Markov chain transition probability is:

$P(x_{t+1}=j \mid x_t=i) = P_{ij}$, where there are $n$ possible states.
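For example (the numbers below are just an illustration), with $n = 2$ states the transition probabilities form a matrix whose $i$-th row lists $P_{i1}, P_{i2}$:

$$P = \begin{pmatrix} 0.3 & 0.7 \\ 0.5 & 0.5 \end{pmatrix},$$

and each row sums to $1$.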

I am asked to prove that $\sum_{j=1}^n P_{ij} = 1$ for all $i$.

I just don't see how to prove this: it seems like the trivial property that the probabilities of all possible events sum to 1. Any help? Thank you.

1 Answer


Say I have a state space $E$. Then I can say that

$$\sum_{j \in E} P_{ij} = \sum_{j \in E} P(X_{n+1}=j \mid X_n=i) = P\Big(\bigcup_{j \in E}\{X_{n+1}=j\} \,\Big|\, X_n=i\Big) = P(X_{n+1} \in E \mid X_n=i) = 1.$$

The second equality holds because the events $\{X_{n+1}=j\}$, $j \in E$, are pairwise disjoint, so the (countable) additivity of the conditional probability $P(\,\cdot \mid X_n=i)$ applies; the last equality holds because $X_{n+1}$ must take some value in $E$.
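As a quick sanity check with the illustrative two-state matrix from the question, take $i = 1$:

$$\sum_{j=1}^{2} P_{1j} = P_{11} + P_{12} = 0.3 + 0.7 = 1,$$

and likewise $P_{21} + P_{22} = 0.5 + 0.5 = 1$.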