
Let

$ S \sim N(\mu, \sigma^2) $

be a normally distributed random variable with known $\mu$ and $\sigma^2$. Suppose, we observe

$ X = \begin{cases} T & \text{if $S \ge 0$}, \\ -T & \text{if $S<0$},\end{cases} $ where $T \in \mathbb{R}$. The probability distribution of $X$ is given by: $ p(x) = Q\left(\frac{-\mu}{\sigma}\right)\delta(x-T)+Q\left(\frac{\mu}{\sigma}\right)\delta(x+T) $

I want to optimize the value of $T$ such that $X$ conveys as much information about $S$ as possible.

My Attempt:

a. I tried minimizing the Kullback–Leibler divergence between the distributions of $X$ and $S$, but, as mentioned here, this is not possible.

b. I tried calculating the mutual information between the two distributions, but it turned out to be independent of $T$.

Is there any other way of formulating this problem? I feel quite confident that there must be some $T$ for which $X$ explains $S$ better; e.g., if $\mu = 10000$, then a value of $T$ near $10000$ should explain $S$ better than, say, $T = 2$. One method I had in mind was to match the moments of the two distributions, but I am not sure whether that is optimal in the sense of maximizing information.
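Attempt (b) can be checked numerically. Since $X = T\,\mathrm{sign}(S)$ is a deterministic function of $S$, we have $I(S;X) = H(X) - H(X \mid S) = H(X)$, the binary entropy of $P(S \ge 0) = Q(-\mu/\sigma)$, in which $T$ never appears. A minimal Python sketch (function names are mine):

```python
import math

def Q(x):
    # Gaussian tail probability Q(x) = P(Z > x) for Z ~ N(0, 1).
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def mutual_information(mu, sigma, T):
    # X = T*sign(S) is a deterministic function of S, so
    # I(S; X) = H(X), the binary entropy of P(S >= 0) = Q(-mu/sigma).
    # Note that T appears nowhere in this expression.
    p = Q(-mu / sigma)  # P(S >= 0) = P(X = T)
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Identical for every T, confirming attempt (b):
vals = [mutual_information(1.0, 2.0, T) for T in (0.5, 2.0, 10000.0)]
```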

  • One can write things like $\Pr(S=s)$ or $\Pr(S\ge s)$, but presumably your piecewise definition should be of another random variable. (2012-07-01)

1 Answer


$P(S \mid X)$ is the same for any value of $T$; hence $X$ conveys the same information about $S$ no matter the value of $T$. Whatever $T$ you choose, $X$ only tells you whether $S \geq 0$ or not.
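To see this concretely: the posterior of $S$ given $X = T$ is the normal distribution truncated to $S \ge 0$, in which $T$ does not appear. A small sketch computing the posterior mean, with the closed form checked against Monte Carlo that conditions only on the sign (function names are mine):

```python
import math
import random

def posterior_mean(mu, sigma):
    # E[S | X = T] = E[S | S >= 0]: mean of N(mu, sigma^2) truncated at 0.
    # Closed form: mu + sigma * phi(a) / Q(a) with a = -mu/sigma; T never enters.
    a = -mu / sigma
    phi = math.exp(-0.5 * a * a) / math.sqrt(2.0 * math.pi)
    tail = 0.5 * math.erfc(a / math.sqrt(2.0))  # Q(a) = P(S >= 0)
    return mu + sigma * phi / tail

# Monte Carlo check: condition on the sign of S, ignoring T entirely.
random.seed(0)
samples = [random.gauss(1.0, 2.0) for _ in range(200_000)]
positive = [s for s in samples if s >= 0]
mc = sum(positive) / len(positive)
```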

A way for $X$ to be more informative about $S$ is to define it as:

$ X = \begin{cases} 1 & \text{if $S \geq \mu$}, \\ -1 & \text{if $S < \mu$}.\end{cases} $
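The gain from thresholding at $\mu$ instead of $0$ is easy to quantify: $I(S;X)$ equals the binary entropy of $P(S \ge t)$ for threshold $t$, which peaks at exactly 1 bit when $t = \mu$. A quick sketch (function names are mine):

```python
import math

def binary_entropy(p):
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def info_with_threshold(mu, sigma, t):
    # I(S; X) for X = sign(S - t): the binary entropy of P(S >= t).
    p = 0.5 * math.erfc((t - mu) / (sigma * math.sqrt(2.0)))
    return binary_entropy(p)

# Thresholding at t = mu balances the two outcomes (p = 1/2), giving the
# maximal 1 bit; thresholding far from mu gives a nearly deterministic X.
at_mean = info_with_threshold(3.0, 1.0, 3.0)
at_zero = info_with_threshold(3.0, 1.0, 0.0)
```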

From your question, you might be interested in rate–distortion theory.
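One concrete way to make the choice of $T$ well-posed along these lines is to fix a distortion measure. Under squared error, $E[(S - T\,\mathrm{sign}(S))^2] = E[S^2] - 2T\,E[|S|] + T^2$ is a parabola in $T$, minimized at $T^* = E[|S|]$, the folded-normal mean; this matches the intuition in the question that $T$ should sit near $\mu = 10000$. A sketch under that assumption (MSE is my choice of distortion, not part of the answer):

```python
import math

def optimal_T_mse(mu, sigma):
    # argmin_T E[(S - T*sign(S))^2] = E[|S|], the folded-normal mean:
    # sigma*sqrt(2/pi)*exp(-mu^2/(2 sigma^2)) + mu*erf(mu/(sigma*sqrt(2))).
    return (sigma * math.sqrt(2.0 / math.pi)
            * math.exp(-mu * mu / (2.0 * sigma * sigma))
            + mu * math.erf(mu / (sigma * math.sqrt(2.0))))

# For mu = 10000, sigma = 1, the optimum lands essentially at mu itself.
T_star = optimal_T_mse(10000.0, 1.0)
```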

  • Oh, I missed the change in the 'if' conditions. Thanks! (2012-06-30)