
I have a prior distribution,

$$p(\boldsymbol\theta|\pi)=\prod\limits_{i=1}^K p(\theta_i|\pi).$$

Each $\theta_i$ can equal $0$ or $1$, so I am using a Bernoulli distribution, which gives

$$p(\boldsymbol\theta|\pi)=\prod\limits_{i=1}^K \pi^{\theta_i}(1-\pi)^{1-\theta_i}.$$
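To make this concrete, here is a minimal numerical sketch of that prior (the values of $K$, $\pi$, and $\boldsymbol\theta$ below are just for illustration, not from my actual model):

```python
import numpy as np

# Illustrative values only (not from my real model)
K = 4
pi = 0.3
theta = np.array([1, 0, 0, 1])  # one possible configuration of the theta_i

# Prior as the product of independent Bernoulli pmfs
prior = np.prod(pi**theta * (1 - pi)**(1 - theta))
print(prior)  # 0.3 * 0.7 * 0.7 * 0.3 = 0.0441
```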

I then want to combine this prior with my marginal likelihood to form my posterior. Should I simplify the prior to $$p(\boldsymbol\theta|\pi)=\pi^{K\theta_i}(1-\pi)^{K(1-\theta_i)} \, \, ?$$

But then, is the product of Bernoulli distributions the binomial distribution?

Then should my answer be

$$p(\boldsymbol\theta|\pi)=\binom{K}{t}\pi^{t}(1-\pi)^{K-t},$$

where $K$ is the maximum number of $\theta_i$'s allowed and $t \in \{0, 1\}$ (i.e. $t = 0$ or $1$)?
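If that binomial form is right, I imagine evaluating it something like this (again just a sketch with illustrative numbers, using scipy's binom for the pmf):

```python
from scipy.stats import binom

# Illustrative values only
K = 4
pi = 0.3
t = 1  # one of the values t = 0 or 1 mentioned above

# C(K, t) * pi^t * (1 - pi)^(K - t)
print(binom.pmf(t, K, pi))  # 4 * 0.3 * 0.7**3 = 0.4116
```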

And in what form do I combine this prior with my likelihood?
