
This may be a silly question, but I cannot figure out a convincing (to myself) answer to it. Suppose that you want to buy a new car. Let $v$ be the value you attach to the car. Before visiting the dealer, you cannot tell for sure how much you value the car, i.e., you're uncertain about the true valuation of the car. However, you observe the realization of a random variable $S\sim U[0,1]$ that tells you something about $v$: with probability $p$ the signal tells you the truth ($s=v$), and with probability $(1-p)$ the signal is noise, $s=\epsilon$, where $\epsilon$ is independent of $v$ and $\epsilon \sim U[0,1]$. This means that the expected value of $v$ given $s$ and $p$ is $\omega=ps+(1-p)\frac{1}{2}$.

Now, so long as $p>0$, the posterior of $\omega$ given $p$ and $s$ is: $$ \mathbb{P}[\omega \leq x]=\mathbb{P}\left[s \leq \frac{x - (1-p)1/2}{p} \right]=\frac{x - (1-p)1/2}{p} $$ because $s\sim U[0,1]$. My question is, how can I obtain the posterior when $p=0$? Any help understanding this is really appreciated!
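As a quick sanity check on that CDF (my own throwaway sketch, not part of the question): since $\omega=ps+(1-p)\frac12$ with $s\sim U[0,1]$, a Monte Carlo estimate of $\mathbb{P}[\omega\leq x]$ should match the formula whenever $\frac{x-(1-p)/2}{p}$ lies in $[0,1]$.

```python
import random

random.seed(0)
p = 0.6
n = 200_000
x = 0.55

# Draw s ~ U[0,1]; then omega = p*s + (1-p)/2 is uniform on [(1-p)/2, (1+p)/2].
count = sum(1 for _ in range(n)
            if p * random.random() + (1 - p) / 2 <= x)
empirical = count / n
formula = (x - (1 - p) / 2) / p   # valid only while this value lies in [0, 1]
print(empirical, formula)          # the two should be close
```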

  • What do you mean by "because $s\sim U[0,1]$"? Is this new information being introduced, or something you're trying to infer from the information given in the first paragraph? How can you specify a distribution for $s$ when in one case $s$ is equal to $v$, which presumably is not random? Also, why is $\omega=ps+(1-p)\frac{1}{2}$ the expected value of $v$? Shouldn't it be that $\omega=pv+(1-p)\frac{1}{2}$ is the expected value of $s$? (2012-09-20)
  • @joriki Sorry, I forgot to mention that $s$ is uniform on $[0,1]$. I added an edit to the question. $v$ is random because you don't know it. What you do know is the realization of a random variable $S$ (the signal), which tells you the truth with probability $p$. Hence, when you compute the expected value of your valuation, you do it conditional on what you have observed, i.e., your signal $s$ and the probability $p$. (2012-09-20)
  • I find this presentation of the problem somewhat confusing. Let me check whether I understand it correctly by rephrasing it. The value $v$ of the car is random, and it's uniformly distributed between $0$ and $1$. There is also another value $\epsilon$, which is also uniformly distributed between $0$ and $1$. The signal $s$ has the value $v$ with probability $p$ and the value $\epsilon$ with probability $1-p$. You know $p$, and you observe $s$, and $\omega$ is the expected value of $v$ given that information. Is that equivalent to what you're saying? (2012-09-20)
  • @joriki Yup. What you say is equivalent to the problem I presented once you add independence between $\epsilon$ and $s$. (2012-09-21)
  • You mean between $\epsilon$ and $v$? (2012-09-21)

1 Answer


Your result for $p\gt0$ is only correct if $\displaystyle\frac{x - (1-p)/2}{p}\in[0,1]$, which need not be the case.
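Spelling this out: since $\omega=ps+(1-p)\frac12$ with $s\sim U[0,1]$, $\omega$ is uniform on $\left[\frac{1-p}{2},\frac{1+p}{2}\right]$, so the full CDF is the clamped version of your expression:
$$ \mathbb{P}[\omega \leq x]=\begin{cases}0, & x<\dfrac{1-p}{2},\\[1ex] \dfrac{x-(1-p)/2}{p}, & \dfrac{1-p}{2}\leq x\leq \dfrac{1+p}{2},\\[1ex] 1, & x>\dfrac{1+p}{2}.\end{cases} $$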

If $p=0$, then $\omega=\frac12$, so $\mathbb{P}[\omega \leq x]$ is $0$ for $x\lt\frac12$ and $1$ for $x\ge\frac12$.
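For intuition, here is a small numerical sketch (plain Python, my own illustration, not part of the original problem) showing that the clamped CDF above converges to exactly this step function as $p\to 0^+$: the support $\left[\frac{1-p}{2},\frac{1+p}{2}\right]$ shrinks to the single point $\frac12$.

```python
def cdf(x, p):
    """Clamped CDF of omega = p*s + (1-p)/2 with s ~ U[0,1], for p > 0."""
    return min(1.0, max(0.0, (x - (1 - p) / 2) / p))

# As p shrinks, the CDF at any x < 1/2 tends to 0 and at any x > 1/2 tends to 1.
for p in (0.5, 0.1, 0.01):
    print(p, cdf(0.4, p), cdf(0.6, p))
```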