
Suppose we send a signal modeled by a random variable $S$, and the received signal is $Y = S + W$, where $W$ is random noise. Assume $S$ and $W$ are independent and have the same pdf.

Q1) Determine the best minimum mean square estimate of $S$ based on $Y$.

I think the best estimate would be the conditional mean of $S$ given $Y$, but I am not sure how to calculate such an expression without concrete numbers.

Q2) Determine the best minimum mean square estimate of $S$ based on $Y$ if $S$ and $W$ are not independent.

I have no idea how independence even affects the estimate.

1 Answer


Here is how to verify your guess.

For any estimator $f(Y)$, \begin{align} \mathbb{E}(f(Y) - S)^2 &= \mathbb{E}(f(Y) - \mathbb{E}[S \mid Y] + \mathbb{E}[S \mid Y] - S)^2\\ &= \mathbb{E}(f(Y) - \mathbb{E}[S \mid Y])^2 + 2 \mathbb{E}[(f(Y) - \mathbb{E}[S \mid Y])(\mathbb{E}[S \mid Y] - S)] + \mathbb{E}(\mathbb{E}[S \mid Y]-S)^2\\ &= \mathbb{E}(f(Y) - \mathbb{E}[S \mid Y])^2 + 0 + \mathbb{E}(\mathbb{E}[S \mid Y]-S)^2\\ &\ge \mathbb{E}(\mathbb{E}[S \mid Y]-S)^2. \end{align} (See below for why the cross term is zero.)

Choosing $f(Y) = \mathbb{E}[S \mid Y]$ makes the last inequality tight so it minimizes $\mathbb{E}(f(Y) - S)^2$ over all estimators $f(Y)$. Note that we can rewrite this estimator as $\mathbb{E}[S \mid Y] = Y - \mathbb{E}[W \mid Y]$.
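As a quick sanity check (my addition, not part of the original problem), here is a Monte Carlo sketch under the illustrative assumption that $S$ and $W$ are i.i.d. standard normal, in which case $\mathbb{E}[S \mid Y] = Y/2$. The conditional-mean estimator should beat any other choice of $f$ empirically:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Assumption for illustration: S and W are i.i.d. standard normal,
# so E[S | Y] = Y / 2 and the minimum MSE is Var(S | Y) = 1/2.
S = rng.standard_normal(n)
W = rng.standard_normal(n)
Y = S + W

def mse(estimate):
    """Empirical mean squared error E[(f(Y) - S)^2]."""
    return np.mean((estimate - S) ** 2)

mse_cond_mean = mse(Y / 2)    # the conditional-mean estimator
mse_identity  = mse(Y)        # naive f(Y) = Y, MSE = E[W^2] = 1
mse_shrunk    = mse(0.3 * Y)  # some other linear estimator

print(mse_cond_mean, mse_identity, mse_shrunk)
```

With a million samples, the first number lands near the theoretical minimum of $1/2$, and both competitors come out strictly larger.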

This answer holds for both Q1 and Q2. However, I have not leveraged independence nor the assumption that $S$ and $W$ have the same pdf... perhaps they want you to explicitly write out what $\mathbb{E}[S \mid Y]$ looks like under those conditions.
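For what it's worth, under the Q1 assumptions there is a clean closed form (a symmetry argument, not something the problem statement gives us): since $S$ and $W$ are independent with the same pdf, the pair $(S, W)$ is exchangeable, so
\begin{align}
\mathbb{E}[S \mid Y] &= \mathbb{E}[W \mid Y],\\
\mathbb{E}[S \mid Y] + \mathbb{E}[W \mid Y] &= \mathbb{E}[S + W \mid Y] = Y,
\end{align}
and hence $\mathbb{E}[S \mid Y] = Y/2$, whatever the common pdf is. This argument uses both independence and the identical marginals, and it generally fails in the setting of Q2.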


To see that the cross term vanishes, note the tower property yields \begin{align} \mathbb{E}[(f(Y) - \mathbb{E}[S \mid Y])(\mathbb{E}[S \mid Y] - S)] &= \mathbb{E}\big[\mathbb{E}\big[(f(Y) - \mathbb{E}[S \mid Y])(\mathbb{E}[S \mid Y] - S) \mid Y\big]\big]\\ &= \mathbb{E}\big[(f(Y) - \mathbb{E}[S \mid Y]) (\mathbb{E}[S \mid Y] - \mathbb{E}[S \mid Y])\big]\\ &= 0. \end{align}
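A numerical sketch of this orthogonality, again under the illustrative assumption that $S$ and $W$ are i.i.d. standard normal (so $\mathbb{E}[S \mid Y] = Y/2$), with an arbitrary competing estimator $f(Y) = 0.3\,Y$:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

# Assumption for illustration: S, W i.i.d. standard normal => E[S | Y] = Y / 2.
S = rng.standard_normal(n)
W = rng.standard_normal(n)
Y = S + W

cond_mean = Y / 2   # E[S | Y] under the Gaussian assumption
f = 0.3 * Y         # an arbitrary competing estimator

# Empirical cross term E[(f(Y) - E[S|Y]) (E[S|Y] - S)]; should be near zero.
cross = np.mean((f - cond_mean) * (cond_mean - S))
print(cross)
```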