Imagine I have a biased coin such that the probability of heads is $P(H)=p$ and $P(T)=1-p$. Given $p$, the outcomes of consecutive flips are independent. If $W$ is the event of $p$ taking on a given value, and $F_1$ and $F_2$ are the results of the first and second coin flips, respectively, then the following holds by Bayes' rule (per this Cross Validated post):
$$P(W|F_1,F_2)=\frac{P(F_1,F_2|W)P(W)}{P(F_1,F_2)}=\frac{P(F_2|W,F_1)P(F_1|W)P(W)}{P(F_2|F_1)P(F_1)}.$$
Clearly, however, I first need to know the value of $P(F_2|W,F_1)$. I understand that $P(W)$ is simply our prior, and that $P(F_1|W)$ is the probability of the event $F_1$ given the value of $p$ (e.g., if $F_1$ is tails and $p = 0.2$, then $P(F_1|W)=1-p=0.8$). Intuitively, it seems as though $P(F_2|W,F_1)$ should always equal $P(F_2|W)$, since $F_1$ and $F_2$ are independent given $p$. Is this the case? How can I derive the value of $P(F_2|W,F_1)$?
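To make the update concrete, here is a small Python sketch (the three candidate values of $p$ and the uniform prior are my own illustrative assumptions, not part of the question) that computes $P(W|F_1,F_2)$ over a discrete prior, using the factorization $P(F_1,F_2|W)=P(F_1|W)\,P(F_2|W)$:

```python
import numpy as np

# Illustrative assumption: a discrete prior over three candidate biases p.
p_values = np.array([0.2, 0.5, 0.8])
prior = np.array([1/3, 1/3, 1/3])

def posterior(flips, p_values, prior):
    """Posterior P(W | flips) over the candidate biases.

    The likelihood factorizes because the flips are independent given p:
    P(F_1, F_2 | W) = P(F_1 | W) * P(F_2 | W).
    """
    likelihood = np.ones_like(prior)
    for f in flips:  # each flip is 'H' or 'T'
        likelihood = likelihood * (p_values if f == 'H' else 1 - p_values)
    unnormalized = likelihood * prior
    return unnormalized / unnormalized.sum()

# After observing two tails, most of the posterior mass should sit
# on the tails-heavy coin p = 0.2.
post = posterior(['T', 'T'], p_values, prior)
```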
Update: I believe I've found a proof that $F_1$ and $F_2$ being conditionally independent given $W$ implies $P(F_2|W, F_1)=P(F_2|W)$:
Proof: Assuming $P(W) \neq 0$ and $P(F_1|W) \neq 0$ (so the conditional probabilities below are well-defined), $$P(F_2|F_1,W)=\frac{P(F_2,F_1,W)}{P(F_1,W)}=\frac{P(F_2,F_1|W)P(W)}{P(F_1|W)P(W)}=\frac{P(F_2,F_1|W)}{P(F_1|W)}=\frac{P(F_2|W)P(F_1|W)}{P(F_1|W)},$$
which equals $P(F_2|W)$. The last equality inside the display uses the conditional-independence assumption $P(F_2,F_1|W)=P(F_2|W)P(F_1|W)$.
Is this proof correct?
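For what it's worth, here's a quick numerical sanity check of the identity (the two-point prior on $p$ is an illustrative assumption): it enumerates the joint distribution of $(W, F_1, F_2)$ and confirms that $P(F_2|W,F_1)=P(F_2|W)$, while marginally $P(F_2|F_1)\neq P(F_2)$, i.e. the flips are only *conditionally* independent when $p$ is uncertain.

```python
from itertools import product

# Illustrative assumption: a two-point prior over the bias p.
prior = {0.2: 0.5, 0.8: 0.5}

def flip_prob(p, f):
    """P(flip = f | bias p) for f in {'H', 'T'}."""
    return p if f == 'H' else 1 - p

# Joint distribution P(W = p, F1 = f1, F2 = f2), built from the
# conditional independence of the flips given p.
joint = {(p, f1, f2): w * flip_prob(p, f1) * flip_prob(p, f2)
         for p, w in prior.items()
         for f1, f2 in product('HT', repeat=2)}

def prob(pred):
    """Probability of the event defined by pred(p, f1, f2)."""
    return sum(v for k, v in joint.items() if pred(*k))

# Conditioning on W: P(F2 = H | W = 0.2, F1 = H) equals P(F2 = H | W = 0.2).
cond_given_f1 = (prob(lambda p, f1, f2: p == 0.2 and f1 == 'H' and f2 == 'H')
                 / prob(lambda p, f1, f2: p == 0.2 and f1 == 'H'))
cond = (prob(lambda p, f1, f2: p == 0.2 and f2 == 'H')
        / prob(lambda p, f1, f2: p == 0.2))

# Marginally, however, the flips are NOT independent: seeing heads first
# shifts belief toward the heads-heavy coin.
marg_given_f1 = (prob(lambda p, f1, f2: f1 == 'H' and f2 == 'H')
                 / prob(lambda p, f1, f2: f1 == 'H'))
marg = prob(lambda p, f1, f2: f2 == 'H')
```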