To find the stationary distribution of a Markov chain, I believe I must solve for $\vec{s} = \langle s_0, s_1 \rangle$ in $\vec{s} = \vec{s}Q$, where $Q$ is the transition matrix.
$Q$, in my case, is
$ \left( \begin{array}{cc} p & 1-p \\ 1-q & q \end{array} \right) $
where $Q_{ij}$ is the probability of moving from state $i$ to state $j$ (row $i$, column $j$). When I write out $\vec{s} = \vec{s}Q$ componentwise, however, I get
$$\begin{aligned} s_0 &= s_0 p + s_1 (1-q) \\ s_1 &= s_0 (1-p) + s_1 q \end{aligned}$$
Rearranging each equation, I get
$$\begin{aligned} s_0 (1-p) &= s_1 (1-q) \\ s_1 (1-q) &= s_0 (1-p) \end{aligned}$$
These two equations are identical, just with the sides swapped. Does that mean there are infinitely many stationary distributions for this Markov chain?
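In case it helps to show exactly where I'm stuck, here is a minimal symbolic sketch of the same system (just for illustration; I'm using sympy, and the names `p`, `q`, `s0`, `s1` mirror the notation above):

```python
import sympy as sp

p, q, s0, s1 = sp.symbols('p q s0 s1')

# Transition matrix: Q[i, j] = probability of moving from state i to state j
Q = sp.Matrix([[p,     1 - p],
               [1 - q, q    ]])

s = sp.Matrix([[s0, s1]])  # candidate stationary row vector (s0, s1)

# The two components of s*Q - s; setting them to zero gives the same
# pair of equations as above
eqs = list(s * Q - s)

# As far as I can tell, this returns a one-parameter family, something
# like {s0: s1*(1 - q)/(1 - p)}, rather than a single point (assuming
# p and q are generic, i.e. not equal to 1)
print(sp.solve(eqs, [s0, s1], dict=True))
```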
Thanks for helping a Markov Chain newb :)