
I have a prior distribution,

$p(\boldsymbol\theta|\pi)=\prod\limits_{i=1}^K p(\theta_i|\pi).$

Each $\theta_i$ can equal $0$ or $1$, so I am using a Bernoulli distribution, giving

$p(\boldsymbol\theta|\pi)=\prod\limits_{i=1}^K \pi^{\theta_i}(1-\pi)^{1-\theta_i}.$
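For concreteness (a small worked case added for illustration): with $K=3$ and $\boldsymbol\theta=(1,0,1)$, this prior evaluates to $\pi\cdot(1-\pi)\cdot\pi=\pi^{2}(1-\pi)$.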

I then want to combine this prior with my marginal likelihood to form my posterior. Should I simplify it as $p(\boldsymbol\theta|\pi)=\pi^{K\theta_i}(1-\pi)^{K(1-\theta_i)}$?

But then, is the product of Bernoulli distributions the binomial distribution?

Then should my answer be

$p(\boldsymbol\theta|\pi)=\binom{K}{t}\pi^{t}(1-\pi)^{K-t},$

where $K$ is the maximum number of $\theta_i$'s allowed and $t \in \{0, 1\}$ (i.e. $t = 0$ or $t = 1$)?

In what form should I combine this prior with my likelihood?

1 Answer


The equation you have can be represented as follows: $p(\boldsymbol x|\theta)=\prod\limits_{i=1}^K \theta^{x_i}(1-\theta)^{1-x_i}=\theta^{\sum_i x_i}(1-\theta)^{K-\sum_i x_i}$
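A quick numeric check makes the distinction with the binomial clear (an illustrative Python sketch; the values are arbitrary and not part of the original answer). The collapsed product gives the probability of one particular sequence, whereas the binomial pmf for the sum $t=\sum_i x_i$ carries an extra factor $\binom{K}{t}$ counting the sequences with that sum:

```python
# Sanity check: product of Bernoulli pmfs vs. the collapsed form vs. the binomial pmf.
from math import comb, prod

theta = 0.3
x = [1, 0, 1, 1, 0]          # arbitrary sequence of K = 5 Bernoulli outcomes
K, t = len(x), sum(x)

product_form = prod(theta**xi * (1 - theta)**(1 - xi) for xi in x)
collapsed_form = theta**t * (1 - theta)**(K - t)
binomial_pmf = comb(K, t) * theta**t * (1 - theta)**(K - t)

print(product_form, collapsed_form)  # equal: probability of this exact sequence
print(binomial_pmf)                  # larger by comb(K, t): probability that the sum equals t
```

So the product of Bernoullis is not itself a binomial: the binomial describes the distribution of the sum, not of the particular sequence.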

We have Bayes' rule,

$p(\theta|x)=\frac{p(x|\theta)p(\theta)}{p(x)}$

and since the prior $p(\theta)$ is known, we have the joint density $p(x,\theta)=p(x|\theta)\,p(\theta)$, which specifies all the information we need: the posterior is this joint density divided by the normalizing constant $p(x)$.
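To make the mechanics concrete, here is a minimal numerical sketch (assuming, purely for illustration, a uniform prior on $\theta$ and a grid approximation; neither assumption comes from the answer above). The joint density is the likelihood times the prior, and normalizing by the evidence $p(x)$ gives the posterior:

```python
import numpy as np

x = np.array([1, 0, 1, 1, 0])      # observed Bernoulli data
K, t = len(x), x.sum()

theta_grid = np.linspace(0.001, 0.999, 999)
prior = np.ones_like(theta_grid)   # assumed uniform p(theta), for illustration only
prior /= prior.sum()
likelihood = theta_grid**t * (1 - theta_grid)**(K - t)  # p(x | theta)

joint = likelihood * prior         # p(x, theta) = p(x | theta) p(theta)
evidence = joint.sum()             # p(x): the normalizing constant
posterior = joint / evidence       # p(theta | x)

print(theta_grid[np.argmax(posterior)])  # posterior mode, near t / K = 0.6
```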

  • @Ellie In your representation $\theta$, and in my representation $x$, are the unknowns. They are samples from the set $\{0,1\}$, but since in general $\theta_i \neq \theta$ for all $i$, you cannot write the equation before the last one in your question. – 2012-08-23