1

I ran into some problems while doing an exercise. The problem goes as follows:

Suppose we have two independent random variables $X$ and $Y$, both normally distributed with parameters $(0, \sigma^2)$, i.e. $\mathbb{P}(dx)=\frac{1}{\sigma\sqrt{2\pi}} \exp\left(-\frac{x^2}{2\sigma^2}\right)dx$. For $\gamma \in \mathbb{R}$, we set $U = X \cos\gamma - Y\sin\gamma$ and $V = X \sin\gamma + Y\cos\gamma$. Show that $U$ and $V$ are independent and calculate their distribution functions.

What I've tried:

I know that to check independence I need to show $$\mathbb{E}\big(\varphi (U)\, \psi (V)\big)= \mathbb{E}\big(\varphi(U)\big) \cdot \mathbb{E}\big(\psi(V)\big)$$ for all bounded measurable $\varphi$ and $\psi$. For that I need to calculate $\mathbb{P}_U$, $\mathbb{P}_V$ and $\mathbb{P}_{(U,V)}$. There are two ways to do that: either via the pushforward measure or via the density function. I'm stuck at calculating $\mathbb{P}_U$, since for the pushforward measure I can't express $X$ and $Y$ using only $U$ or $V$, and for the density function I have a problem with the trigonometric functions, since they change sign according to the quadrant and so does the inequality $\mathbb{P}(X \cos\gamma - Y\sin\gamma\leq t)$.

Thanks in advance

  • 0
    I think you need to add the condition that $X$ and $Y$ are independent. (2012-09-12)
  • 1
    The problem is simple IF you know that 1) a linear transformation of jointly normal variables is jointly normal, and 2) jointly normal variables are independent iff $E(XY)=E(X)\,E(Y)$. (2012-09-12)
  • 0
    Your idea is a good one, except that you should directly compute $P_{(U,V)}$. Either this measure is a product, and then $U$ and $V$ are independent, or it is not, and then they are not. (The distributions $P_U$ and $P_V$ will come as bonuses.) If something stops you in the computation of $P_{(U,V)}$, just say so. (2012-09-12)
  • 0
    Yes, the condition of independence is correct; it is essential in this exercise, and I simply forgot to mention it. (2012-09-12)

2 Answers

4

It is straightforward to compute the joint density of $(U,V)$ from that of $(X,Y)$. Jacobians and the like are involved in the standard undergraduate treatment of this topic (which is often not understood very well by said undergraduates). In this instance, the Jacobian approach is easier than usual since the transformation is linear. Better still, for this particular problem the answer can be written down with nary a mention of Jacobians, expectations, and the like. The transformation in question is a rotation of axes, and since the joint density $f_{X,Y}(x,y)$ has circular symmetry about the origin, rotating the axes does not change the function: the joint density $f_{U,V}$ is the same function as $f_{X,Y}$, that is, $$f_{U,V}(u,v) = \frac{1}{2\pi\sigma^2}\exp\left(-\frac{u^2+v^2}{2\sigma^2}\right), \qquad -\infty < u, v < \infty,$$ and the independence of $U$ and $V$ follows immediately: $$f_{U,V}(u,v) = \frac{1}{\sigma\sqrt{2\pi}}\exp\left(-\frac{u^2}{2\sigma^2}\right) \cdot \frac{1}{\sigma\sqrt{2\pi}}\exp\left(-\frac{v^2}{2\sigma^2}\right) = f_X(u)\,f_Y(v).$$
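A quick way to see this numerically is a Monte Carlo sketch: rotate independent $N(0,\sigma^2)$ samples and inspect the resulting pair. The choices below (numpy, the seed, $\sigma=2$, $\gamma=0.7$, and the sample size) are arbitrary illustrative assumptions, not part of the problem.

```python
import numpy as np

# Monte Carlo sanity check of the answer above (illustrative sketch).
rng = np.random.default_rng(0)
sigma, gamma, n = 2.0, 0.7, 1_000_000   # arbitrary illustrative values

x = rng.normal(0.0, sigma, n)
y = rng.normal(0.0, sigma, n)

u = x * np.cos(gamma) - y * np.sin(gamma)
v = x * np.sin(gamma) + y * np.cos(gamma)

# Both marginals should again look like N(0, sigma^2) ...
print(u.std(), v.std())            # both close to sigma = 2.0
# ... and the sample correlation should be near 0, consistent with
# (though of course not a proof of) independence.
print(np.corrcoef(u, v)[0, 1])     # close to 0
```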

  • 0
    But how do we know that $f_X(u)=\frac{1}{\sigma\sqrt{2\pi}}\exp\left(-\frac{u^2}{2\sigma^2}\right)$ is the pushforward measure of $X$ and $Y$ in $U$? I mean, shouldn't we find $f_{(X,Y)}(u)$ and $f_{(X,Y)}(v)$ separately and then compare with what we got in the joint distribution? (2012-09-12)
  • 2
    I don't understand the question. You said that $X$ and $Y$ are independent zero-mean normal random variables with common variance $\sigma^2$. So the joint density $f_{X,Y}$ is just $f_X\cdot f_Y$, both of which are also known to you. I gave a simple argument showing $f_{U,V} = f_{X,Y}$. Now, if you want to _prove_ that $U$ and $V$ are $N(0,\sigma^2)$ random variables, integrate $f_{U,V}$ with respect to $v$ or $u$ and verify this. The _joint density_ determines the marginal densities; from the marginal densities one cannot, in general, determine the joint density (or even assert joint continuity). (2012-09-12)
  • 0
    Thank you for the answer. Another question: I have some problems computing the Jacobian. I know that I should get it equal to $1$ in this situation. But considering that for $U$ we have $x=\frac{u-y\sin\gamma}{\cos\gamma}$ and $y=\frac{v-x\sin\gamma}{\cos\gamma}$, we take the derivatives $dx/du$ and $dy/du$ for the first column, and in the end the determinant equals $\frac{1}{\cos^2\gamma}+\frac{1}{\sin^2\gamma}$. Maybe you've spotted some errors? (2012-09-12)
  • 2
    If $u = g(x,y)$ and $v = h(x,y)$, the entries in the Jacobian matrix are the _partial derivatives_ of $g(x,y)$ and $h(x,y)$ with respect to $x$ and $y$. For a nonsingular _linear_ transformation $\mathbf{Y} = \mathbf{X}G$ from _vector_ $\mathbf X$ to _vector_ $\mathbf Y$ with matrix $G$, the Jacobian is the _determinant_ of $G$ and $f_{\mathbf Y}(\mathbf y) = f_{\mathbf X}(\mathbf yG^{-1})/|\det(G)|$. See, for example, [these notes](http://courses.engr.illinois.edu/ece313/fa2000/ppt/Lecture37.pdf) of mine. (2012-09-12)
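For concreteness, here is how the recipe in the last comment plays out for the rotation in this problem (a sketch, using the row-vector convention $\mathbf Y = \mathbf X G$ from that comment). Writing $(U,V)=(X,Y)\,G$ gives $$G=\begin{pmatrix}\cos\gamma & \sin\gamma\\ -\sin\gamma & \cos\gamma\end{pmatrix},\qquad \det G=\cos^2\gamma+\sin^2\gamma=1,\qquad G^{-1}=G^{\mathsf T},$$ so the inverse map expresses $x$ and $y$ in terms of $u$ and $v$ only: $x=u\cos\gamma+v\sin\gamma$, $y=-u\sin\gamma+v\cos\gamma$, and $x^2+y^2=u^2+v^2$. Hence $$f_{U,V}(u,v)=\frac{f_{X,Y}(u\cos\gamma+v\sin\gamma,\,-u\sin\gamma+v\cos\gamma)}{|\det G|}=\frac{1}{2\pi\sigma^2}\exp\!\left(-\frac{u^2+v^2}{2\sigma^2}\right),$$ the same density obtained by the symmetry argument in the answer. (The value $\frac{1}{\cos^2\gamma}+\frac{1}{\sin^2\gamma}$ in the comment above comes from differentiating expressions that still mix $x$, $y$ with $u$, $v$; the inverse transformation must be written in terms of $u$ and $v$ alone before taking partial derivatives.)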
-3

To check independence of $U$ and $V$ it suffices here to check that $E(UV)=E(U)\,E(V)$: since $U$ and $V$ are both linear combinations of the independent normal variables $X$ and $Y$, the pair $(U,V)$ is jointly (bivariate) normal, and jointly normal variables are independent iff their covariance is zero. Even knowing that $X$ and $Y$ are independent does not make the independence of $U$ and $V$ obvious.

Now $E(UV)= E\big[(X \cos\gamma - Y\sin\gamma)(X \sin\gamma + Y\cos\gamma)\big]$; expanding this we get

$$E\big(X^2 \sin\gamma\cos\gamma - Y^2 \sin\gamma\cos\gamma + XY \cos^2\gamma - XY \sin^2\gamma\big)= E\big[(X^2-Y^2)\sin\gamma\cos\gamma + XY(\cos^2\gamma - \sin^2\gamma)\big].$$

Now we recognize the double angle identities from trigonometry,

$$\cos 2\gamma = \cos^2\gamma - \sin^2\gamma \qquad\text{and}\qquad \sin 2\gamma = 2\sin\gamma\cos\gamma.$$

Substituting in, we get

$$E\Big[(X^2-Y^2)\,\tfrac{\sin 2\gamma}{2} + XY\cos 2\gamma\Big]= \tfrac{1}{2}\,E\big[(X^2-Y^2)\sin 2\gamma + 2XY\cos 2\gamma\big]= \tfrac{\sin 2\gamma}{2}\, E\big[X^2-Y^2\big] + \cos 2\gamma\, E[XY]= \tfrac{\sin 2\gamma}{2}\,\big(E[X^2]-E[Y^2]\big)+ \cos 2\gamma\, E[XY].$$

Assuming $X$ and $Y$ are independent, so that $E[XY]=E[X]\,E[Y]$, this becomes

$$\frac{\sin 2\gamma}{2}\,\big(E[X^2]-E[Y^2]\big)+ \cos 2\gamma\, E[X]\, E[Y]. \tag{1}$$

Now consider

$$E[U]\, E[V] = E[X \cos\gamma - Y \sin\gamma]\; E[X \sin\gamma + Y \cos\gamma] = \big(\cos\gamma\, E[X] - \sin\gamma\, E[Y]\big)\big(\sin\gamma\, E[X] + \cos\gamma\, E[Y]\big)$$

$$= \sin\gamma\cos\gamma\, E^2[X] + (\cos^2\gamma - \sin^2\gamma)\, E[X]\, E[Y] - \sin\gamma\cos\gamma\, E^2[Y].$$

Applying the double angle formulae here, we get

$$\sin\gamma\cos\gamma\, E^2[X] + \cos 2\gamma\, E[X]\, E[Y] - \sin\gamma\cos\gamma\, E^2[Y] = \cos 2\gamma\, E[X]\, E[Y] + \frac{\sin 2\gamma}{2}\,\big(E^2[X] - E^2[Y]\big). \tag{2}$$

Under the hypotheses $E[X]=E[Y]=0$ and $E[X^2]=E[Y^2]=\sigma^2$, both (1) and (2) reduce to $0$, so they are equal. Hence $E[UV]=E[U]\, E[V]$, i.e. $\text{Cov}(U,V)=0$, and since $(U,V)$ is bivariate normal, $U$ and $V$ are independent.
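For what it's worth, the double-angle algebra above can be checked symbolically. A minimal sketch using sympy; the symbols `EX2`, `EY2`, `EXY` are placeholders standing for $E[X^2]$, $E[Y^2]$, $E[XY]$, not part of any library:

```python
import sympy as sp

# Symbolic check of the double-angle algebra in the answer above (a sketch).
g = sp.symbols('gamma', real=True)
EX2, EY2, EXY = sp.symbols('EX2 EY2 EXY', real=True)

# E(UV) after expanding and using linearity of expectation:
E_UV = (EX2 - EY2) * sp.sin(g) * sp.cos(g) + EXY * (sp.cos(g)**2 - sp.sin(g)**2)

# The double-angle rewrite claimed above:
double_angle_form = sp.sin(2*g)/2 * (EX2 - EY2) + sp.cos(2*g) * EXY
print(sp.simplify(E_UV - double_angle_form))   # -> 0

# Under the hypotheses E[X^2] = E[Y^2] = sigma^2 and E[XY] = E[X]E[Y] = 0,
# the whole expression collapses to 0, matching E[U]E[V] = 0.
sigma2 = sp.Symbol('sigma^2', positive=True)
print(sp.simplify(E_UV.subs({EX2: sigma2, EY2: sigma2, EXY: 0})))  # -> 0
```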

  • 5
    This answer is dangerously close to the (false) statement that, for all random variables R and S, if E(RS)=E(R)E(S) then R and S are independent. (2012-09-12)
  • 1
    I am also confused about why a sufficient condition for independence would be that the expectation of the product factors. It's not hard to find a counterexample showing that this is false in general. (2012-09-12)
  • 0
    @did As leonboy pointed out, we are dealing with variables $U$ and $V$ that are jointly normal because $X$ and $Y$ are i.i.d. normal. Of course, independence of $X$ and $Y$ in general means that for any measurable sets $A$ and $B$, $P(X \in A \text{ and } Y \in B)= P(X \in A)\, P(Y \in B)$. (2012-09-12)
  • 0
    @user974514 By "product of random variables being separable" do you mean that the expectation of the product equals the product of the expectations? If so, look at leonboy's comment. (2012-09-12)
  • 1
    Since the dangerous proximity I mentioned is confirmed, you might want to modify your post to make it more explicit on this point (which is well known to be one over which students quite frequently stumble, for generations...). (2012-09-13)
  • 1
    I second Didier's suggestion, and also wish to point out that a simpler approach is to say that the covariance must be zero, giving $$\begin{align}\text{cov}(U,V)&=\text{cov}(X\cos\gamma-Y\sin\gamma,X\sin\gamma+Y\cos\gamma)\\&=(\sigma_X^2-\sigma_Y^2)\sin\gamma\cos\gamma+\text{cov}(X,Y)(\cos^2\gamma-\sin^2\gamma)\\&=0\end{align}$$ since $\sigma_X^2=\sigma_Y^2=\sigma^2$ and $\text{cov}(X,Y)=0$ from the equal-variance and independence hypotheses. (2012-09-13)
  • 0
    @DilipSarwate Your solution is nice and more elegant. However, I think that the choice of sines and cosines for the coefficients indicates that the writer of the question may have wanted to see some use of trigonometry through the double angle formulae. My proof is longer but exploits the double angle formulae, which I think was part of the intent; otherwise the problem could have been posed with other coefficients not involving trigonometric functions. You avoided the trigonometric identities and exploited $\text{Cov}(X,Y)=0$ and $\sigma_X^2=\sigma_Y^2$. (2012-09-13)
  • 0
    -1 for the absence of any mention of the specificity of Gaussian families with respect to independence/zero-covariance. (2012-09-14)
  • 0
    @did That downvote is very unfair. It is very well known that two Gaussian random variables are independent iff their covariance is $0$, which is equivalent to $E(XY)=E(X)E(Y)$. Leonboy pointed this out before my final edit of my answer, so I felt it was unnecessary for me to mention it also; readers will see leonboy's comment before they get to my answer. (2012-09-14)
  • 2
    *It is very well known that two Gaussian random variables are independent iff their covariance is 0*... Bingo! This is wrong as stated: [X gaussian, Y gaussian, Cov(X,Y)=0] **does not imply** [X and Y independent]. (2012-09-15)
  • 0
    Bivariate normal. Why do you want to nitpick? You know my point: since $X$ and $Y$ are independent normal, two linear combinations of $X$ and $Y$ are bivariate normal. You obviously know that, don't you? (2012-09-15)
  • 2
    Then just SAY SO in your answer. As I said from the very beginning, this is a widely misunderstood subject (by students and practitioners alike), hence the utmost precision is required. If someone deliberately neglects to be careful about it, despite clear prodding by others, leniency is not in order. (As an aside, referring to other answers to justify not correcting defects in one's own strikes me as rather odd.) (2012-09-15)
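To make the point in the last few comments concrete: each variable being normal with zero covariance does not imply independence unless the pair is *jointly* normal (which it is in this exercise, since $(U,V)$ is a linear image of the independent normal pair $(X,Y)$). Below is a small simulation sketch of the classic counterexample $Y = SX$ with an independent random sign $S$; numpy, the seed, and the sample size are arbitrary illustrative choices.

```python
import numpy as np

# Each of x and y is N(0,1) and their covariance is 0, yet they are
# clearly dependent: |y| = |x| exactly. The pair (x, y) is NOT jointly normal.
rng = np.random.default_rng(0)
n = 1_000_000

x = rng.normal(0.0, 1.0, n)
s = rng.choice([-1.0, 1.0], n)     # random sign, independent of x
y = s * x                          # y is again N(0,1) marginally

print(np.corrcoef(x, y)[0, 1])                  # ~ 0: uncorrelated
print(np.corrcoef(np.abs(x), np.abs(y))[0, 1])  # = 1: strongly dependent
```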