I have the following scenario:
A fair coin is tossed, and we record the result ("heads" or "tails") as the random variable $Z$. If a head is observed, a random sample $X_1,\dots,X_n \sim \text{Bernoulli}(\theta)$ is collected, with fixed sample size $n$. If a tail is observed, observations $X_1,X_2,\ldots \sim \text{Bernoulli}(\theta)$ are collected until $k$ successes are obtained, for some fixed $k$. Let $N$ denote the total number of trials and $M$ the number of successes.

I must show that the distribution of $Z$, given $N$ and $M$, does not depend on $\theta$. Say $Z=0$ represents "heads" and $Z=1$ represents "tails". We can then write down the conditional distributions of $N$ and $M$. Here is what I have found so far:
$$\text{P}[N=j\mid Z=0]=\begin{cases}0 & \text{if } j\neq n \\ 1 & \text{if } j=n\end{cases} \qquad \text{P}[N=j\mid Z=1]=\binom{j-1}{k-1}\theta^k(1-\theta)^{j-k},\quad j=k,k+1,\dots$$
$$\text{P}[M=j\mid Z=0]=\binom{n}{j}\theta^j(1-\theta)^{n-j},\quad j=0,1,\dots,n \qquad \text{P}[M=j\mid Z=1]=\begin{cases}0 & \text{if } j\neq k \\ 1 & \text{if } j=k\end{cases}$$
I also know the following result, by the definition of conditional probability and the law of total probability.
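As a quick sanity check on the two nondegenerate pmfs above (using hypothetical values of $n$, $k$, and $\theta$ chosen just for the check), each should sum to one over its support:

```python
from math import comb

# Hypothetical parameter values, chosen only for this check
n, k, theta = 10, 3, 0.4

# P[M = j | Z = 0]: Binomial(n, theta) pmf over j = 0, ..., n
binom_pmf = [comb(n, j) * theta**j * (1 - theta)**(n - j) for j in range(n + 1)]

# P[N = j | Z = 1]: negative binomial pmf on the number of trials,
# j = k, k+1, ...; the infinite support is truncated at a large cutoff
negbin_pmf = [comb(j - 1, k - 1) * theta**k * (1 - theta)**(j - k)
              for j in range(k, 500)]

print(sum(binom_pmf))   # should be 1 (up to floating-point error)
print(sum(negbin_pmf))  # should be ~1 (tiny truncated tail)
```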
\begin{eqnarray}
\text{P}[Z=z\mid N=j,M=m]&=&\frac{\text{P}[N=j,M=m\mid Z=z]\,\text{P}[Z=z]}{\text{P}[N=j,M=m]} \\
&=&\frac{\text{P}[N=j,M=m\mid Z=z]\,\text{P}[Z=z]}{\sum_{z'}\text{P}[N=j,M=m\mid Z=z']\,\text{P}[Z=z']}
\end{eqnarray} Here is where I am stuck: I cannot figure out how to use the above information to finish this computation. In particular, one of the distributions above involves $k$ while the other does not; will I ever be able to eliminate the dependence on $\theta$? Any hints or steps on how to proceed would be extremely helpful.
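To experiment, I also wrote a small numerical check (with hypothetical values $n=10$, $k=3$). If I am reading the conditionals correctly, the only pair $(N,M)$ with positive probability under both values of $Z$ is $(n,k)$, and at that point the posterior appears not to depend on $\theta$ at all:

```python
from math import comb

n, k = 10, 3  # hypothetical values with k <= n

def posterior_tails(theta):
    """P[Z=1 | N=n, M=k] via Bayes' rule; the fair-coin prior 1/2 cancels."""
    # Joint probability of (N, M) = (n, k) under each branch:
    p_given_heads = comb(n, k) * theta**k * (1 - theta)**(n - k)          # binomial
    p_given_tails = comb(n - 1, k - 1) * theta**k * (1 - theta)**(n - k)  # neg. binomial
    return p_given_tails / (p_given_heads + p_given_tails)

# The common factor theta^k * (1-theta)^(n-k) should cancel,
# giving the same value for every theta:
for theta in (0.1, 0.5, 0.9):
    print(theta, posterior_tails(theta))
```

The printed posterior is identical across all three values of $\theta$, which suggests the $\theta$-dependence cancels, though I still cannot see how to show this in general.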