I am trying to find the probability density function of $Z = X - Y$, given that $f_X(x)$ and $f_Y(y)$ are known and both variables are chi-square distributed, with $X \ge 0$ and $Y \ge 0$. How can I obtain $f_Z(z)$?
Distribution of Difference of Chi-squared Variables
-
0Are they $\chi^2$-distributed with the same parameters, or different? – 2011-11-24
-
3Regardless of the distribution, you need to know the _joint_ distribution of $X$ and $Y$ in order to answer the question. For the special case when $X$ and $Y$ are _independent_, the density of $X-Y$ is the _convolution_ of the densities of $X$ and $-Y$, in particular, $$f_{X-Y}(z) = \int_{-\infty}^{\infty} f_X(x)f_Y(x-z)\mathrm dx.$$ As Sasha hints, this integral is easier to compute when the densities of $X$ and $Y$ have the same scale parameter. – 2011-11-24
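For concreteness, the convolution formula above can be evaluated numerically. A minimal sketch using SciPy, assuming two independent $\chi^2_\nu$ variables with $\nu = 4$ (an illustrative choice, not from the thread):

```python
import numpy as np
from scipy import stats, integrate

nu = 4  # degrees of freedom, chosen for illustration
f = stats.chi2(df=nu).pdf  # common density of X and Y

def f_diff(z, upper=200.0):
    """Density of Z = X - Y via f_Z(z) = integral of f_X(x) f_Y(x - z) dx.

    The integrand vanishes unless x >= 0 and x - z >= 0,
    so the integral effectively runs from max(0, z) upward.
    """
    val, _ = integrate.quad(lambda x: f(x) * f(x - z), max(0.0, z), upper)
    return val

# Sanity checks: the result is a density (integrates to 1) and is
# symmetric in z, since X and Y are identically distributed.
zs = np.linspace(-60.0, 60.0, 1201)
total = integrate.trapezoid([f_diff(z) for z in zs], zs)
```

The upper integration cutoff of 200 is safe here because the $\chi^2_4$ density is negligible far into the tail.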
-
0@DilipSarwate : I took Sasha's comment to be about degrees of freedom. The number of degrees of freedom is not a scale parameter. – 2011-11-24
-
0@MichaelHardy : Sasha wrote _parameters_ and so could have meant both scale and degrees of freedom. As you know, $\chi^2$ random variables are also Gamma random variables, and the sum of independent Gamma random variables with the same scale parameter is a Gamma random variable with the same scale parameter and order parameter equal to the sum of the orders, but the density is a lot messier when the scale parameters are different. I suspect a similar situation for the density of the difference of two $\chi^2$ random variables; easier to find when the scale parameters are the same. – 2011-11-24
-
0@Sasha: they have the same distribution parameters. – 2011-11-25
-
0A more general question: http://stats.stackexchange.com/questions/48378/difference-of-gamma-random-variables – 2016-06-18
1 Answer
Let's assume that $X$ and $Y$ are independent, and both follow $\chi^2$-distribution with $\nu$ degrees of freedom.
Then $Z = X - Y$ follows a variance-gamma distribution, symmetric about the origin, with parameters $\lambda = \frac{\nu}{2}$, $\alpha = \frac{1}{2}$, $\beta = 0$, and $\mu = 0$.
The best way to see this is through the moment-generating function: $$ \mathcal{M}_X(t) = \mathcal{M}_Y(t) = \left(1-2 \, t\right)^{-\nu/2} $$ Then $$ \mathcal{M}_Z(t) = \mathcal{M}_X(t) \mathcal{M}_Y(-t) = \left( 1-4 t^2 \right)^{-\nu/2} = \left( \frac{1/4}{1/4-t^2}\right)^{\nu/2} $$ We now check that this matches the moment-generating function of the variance-gamma distribution: $$ \mathcal{M}_{\rm{V.G.}(\lambda,\alpha,\beta,\mu)}(t) = \mathrm{e}^{\mu t} \left( \frac{\alpha^2 -\beta^2}{\alpha^2 - (\beta+t)^2 } \right)^\lambda $$ For the stated parameters, $\mu=\beta=0$, $\alpha=\frac{1}{2}$, and $\lambda=\frac{\nu}{2}$, the density has the following form: $$ f_Z(z) = \frac{1}{2^{\nu}\sqrt{\pi}\,\Gamma\left(\frac{\nu}{2}\right)} \vert z \vert^{\tfrac{\nu-1}{2}} K_{\tfrac{\nu-1}{2}}\!\left(\frac{\vert z \vert}{2} \right) $$ The function $f_Z(z)$ is continuous at $z=0$ for $\nu > 1$, with $$ \lim_{z \to 0} f_Z(z) =\frac{1}{4 \sqrt{\pi }} \frac{\Gamma \left(\frac{\nu }{2}-\frac{1}{2}\right)}{ \Gamma \left(\frac{\nu }{2}\right)} $$
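As a numerical sanity check (not part of the original answer), the closed-form Bessel-function density can be compared against the direct convolution of two independent $\chi^2_\nu$ densities. A sketch with SciPy, taking $\nu = 4$ as an illustrative choice:

```python
import numpy as np
from scipy import stats, integrate, special

nu = 4  # degrees of freedom, chosen for illustration

def f_Z(z):
    """Variance-gamma density with lambda = nu/2, alpha = 1/2, beta = mu = 0."""
    az = abs(z)
    return (az ** ((nu - 1) / 2) * special.kv((nu - 1) / 2, az / 2)
            / (2 ** nu * np.sqrt(np.pi) * special.gamma(nu / 2)))

# Convolution of the two chi-square densities, for comparison.
f = stats.chi2(df=nu).pdf
def f_conv(z):
    val, _ = integrate.quad(lambda x: f(x) * f(x - z), max(0.0, z), 200.0)
    return val

# f_Z should also approach Gamma((nu-1)/2) / (4 sqrt(pi) Gamma(nu/2))
# as z -> 0, matching the stated limit.
limit_at_0 = special.gamma((nu - 1) / 2) / (4 * np.sqrt(np.pi) * special.gamma(nu / 2))
```

Evaluating `f_Z` and `f_conv` at a handful of points (positive and negative) shows agreement to quadrature precision.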
-
0Thanks, now I understand it better. I had never thought of using the MGF with the variable inverted in sign, i.e., $\mathcal{M}(-t)$. – 2011-11-25
-
0@Remmy This simply follows from $\mathcal{M}_Z(t) = \mathbb{E}(\exp(t Z)) = \mathbb{E}(\exp(t X) \exp(-t Y)) = \mathcal{M}_X(t) \mathcal{M}_Y(-t)$, where the last equality uses independence. – 2011-11-25
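The identity in the comment above can also be checked by simulation. A minimal sketch with NumPy, using $\nu = 3$ and $t = 0.1$ (both illustrative choices; the MGF exists only for $|t| < \tfrac12$):

```python
import numpy as np

nu, t = 3, 0.1  # illustrative choices; need |t| < 1/2 so the MGF exists
rng = np.random.default_rng(0)
x = rng.chisquare(nu, size=2_000_000)
y = rng.chisquare(nu, size=2_000_000)

# Monte Carlo estimate of M_Z(t) = E[exp(t (X - Y))]
mc = np.mean(np.exp(t * (x - y)))

# Closed form: M_X(t) M_Y(-t) = (1 - 2t)^{-nu/2} (1 + 2t)^{-nu/2}
#            = (1 - 4 t^2)^{-nu/2}
closed = (1 - 4 * t ** 2) ** (-nu / 2)
```

With two million samples, the Monte Carlo estimate agrees with the closed form to well under a percent.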
-
0See also: [Chi-squared distribution#Linear combination](https://en.wikipedia.org/wiki/Chi-squared_distribution#Linear_combination). – 2016-06-18