Let $X$ be a random variable with a normal distribution, and assume $Y=X$. What is the mutual information $I(X;Y)$?
My reasoning: $h(Y|X)=0$, since once $X$ is known, $Y$ is completely determined, so $I(X;Y)=h(Y)-h(Y|X)=h(Y)=\frac{1}{2}\log(2\pi e\sigma^2)$ nats.
But I was told this is wrong, and a numerical computation also shows that $I(X;Y) \neq \frac{1}{2}\log(2\pi e\sigma^2)$. Where is my mistake? Please help me out of this problem, thanks a lot! (Please note that $X$ and $Y$ are both continuous.)
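
For concreteness, here is a minimal sketch of the kind of numerical check I mean, using a plug-in histogram estimator of mutual information (the helper `binned_mi` and the sample/bin sizes are just illustrative choices, not from any particular library). The estimate keeps growing as the grid gets finer rather than settling at $\frac{1}{2}\log(2\pi e\sigma^2) \approx 1.419$ nats for $\sigma = 1$:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=100_000)  # X ~ N(0, 1), so sigma = 1
y = x                                   # Y = X exactly

def binned_mi(x, y, bins):
    """Plug-in mutual information estimate (in nats) from a 2-D histogram."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()                    # joint probabilities per bin
    px = pxy.sum(axis=1)                # marginal of X
    py = pxy.sum(axis=0)                # marginal of Y
    nz = pxy > 0                        # only sum over occupied bins
    return np.sum(pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz]))

# The estimate grows without bound as the bins get finer,
# instead of converging to 0.5 * log(2 * pi * e) ~= 1.419 nats.
for bins in (10, 100, 1000):
    print(bins, binned_mi(x, y, bins))
```

Since all the mass sits on the diagonal ($Y=X$), the binned estimate equals the entropy of the binned $X$, which diverges like $\log(1/\Delta)$ as the bin width $\Delta \to 0$.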