
Suppose that a discrete random variable (with finite support) $Y$ is given by $Y = X_1 - X_2$, where $X_1$ and $X_2$ are both discrete random variables with finite support and with the same probability mass function. Is it possible to determine the pmf of $X_1$ from the pmf of $Y$?

3 Answers

3

I will consider the case when $X_1$ and $X_2$ are independent identically distributed and integer-valued.

The following properties hold:

  1. Since $Y$ is discrete with finite support, its generating function is a polynomial in $z$ and $z^{-1}$ with non-negative coefficients: $$ Z_Y(z) = \sum_{k=m(Y)}^{n(Y)} \mathbb{P}(Y=k) z^k $$
  2. Let $X$ be the purported solution with $Z_X(z) = \sum_{k=m(X)}^{n(X)} \mathbb{P}(X=k) z^k$.

  3. Since $Y = X_1 - X_2$ with $X_1$, $X_2$ i.i.d., $Z_Y(z) = Z_{X}(z) Z_{X}(z^{-1})$.
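Property 3 can be checked numerically: the coefficient of $z^k$ in $Z_X(z) Z_X(z^{-1})$ is $\sum_j \mathbb{P}(X = j+k)\,\mathbb{P}(X = j)$, the autocorrelation of the pmf. A minimal sketch in Python with exact rationals (the example pmf below is made up for illustration):

```python
from fractions import Fraction as F

def pmf_of_difference(pmf):
    # pmf of Y = X1 - X2 for i.i.d. X1, X2 with the given pmf (dict: value -> prob);
    # the coefficient of z^k in Z_X(z) * Z_X(1/z) is sum over j of P(X=j+k) * P(X=j)
    y = {}
    for a, pa in pmf.items():
        for b, pb in pmf.items():
            y[a - b] = y.get(a - b, F(0)) + pa * pb
    return y

# illustrative pmf: P(X=1)=1/2, P(X=2)=1/4, P(X=3)=1/4
p = {1: F(1, 2), 2: F(1, 4), 3: F(1, 4)}
y = pmf_of_difference(p)
assert sum(y.values()) == 1
assert y[0] == F(3, 8)          # P(Y=0) = (1/2)^2 + (1/4)^2 + (1/4)^2
assert y[1] == y[-1] == F(3, 16)  # symmetric, as Z_Y(z) = Z_Y(1/z) requires
```

Note that the support of $X$ drops out: shifting every value of $X$ by the same $k$ leaves $Y$ unchanged, which is exactly the non-uniqueness discussed below.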

The problem you are posing is whether, given $Z_Y(z)$, one can determine $Z_X(z)$ such that $Z_Y(z) = Z_X(z) Z_X(z^{-1})$. Observe that the solution, if it exists, is not unique, since $Z_{X^\prime}(z) = z^k Z_{X}(z)$ would also be a solution for arbitrary $k \in \mathbb{Z}$ (shifting $X$ by $k$ leaves $Y$ unchanged).

A necessary condition for the existence of a solution is that $Z_Y(z)$ satisfy $Z_{Y}(z) = Z_Y(z^{-1})$. Assuming that the given $Z_Y(z)$ satisfies this property, finding $Z_X(z)$ reduces to polynomial factorization.
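The condition $Z_Y(z) = Z_Y(z^{-1})$ amounts to $\mathbb{P}(Y = k) = \mathbb{P}(Y = -k)$ for every $k$, so it is easy to test before attempting any factorization. A quick sketch (assuming the pmf is given as a dict of value $\to$ probability):

```python
from fractions import Fraction as F

def is_symmetric(pmf_y):
    # Z_Y(z) = Z_Y(1/z) holds iff P(Y = k) = P(Y = -k) for all k
    return all(pmf_y.get(-k, F(0)) == pk for k, pk in pmf_y.items())

# the symmetric example Z_Y(z) = 5/9 + 2/(9 z) + 2 z / 9 passes:
assert is_symmetric({-1: F(2, 9), 0: F(5, 9), 1: F(2, 9)})
# an asymmetric pmf cannot arise as X1 - X2 with X1, X2 i.i.d.:
assert not is_symmetric({0: F(1, 2), 1: F(1, 2)})
```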

  • In addition to the assumption of independence, is it also assumed that $X_1$ and $X_2$ are integer-valued? Because in general, the generating function of the pmf need not be a _polynomial_. (2012-01-18)
  • @DilipSarwate Yes, indeed. I will add this explicitly. (2012-01-18)
  • Hi Sasha, in my case they are i.i.d. and take strictly positive integer values. Could you give me some examples of functions that have the property you mentioned for the uniqueness of the solution? (2012-01-18)
  • In particular, suppose the solution is unique. Could you tell me explicitly how to solve for it? (2012-01-18)
  • I would use _Mathematica_ (or another computer algebra system) to factor $Z_Y(z)$. Let $Z_Y(z) = 5/9 + 2/(9 z) + (2 z)/9$. `Factor` gives `((2 + z) (1 + 2 z))/(9 z)`, from which $Z_X(z)$ can be $z^{k} (\frac{2}{3} + \frac{1}{3} z)$ or $z^{k} ( \frac{1}{3} + \frac{2}{3} z)$. This gives you another ambiguity: if $Z_X(z)$ is a solution, so is $Z_X(z^{-1})$. (2012-01-18)
  • **Now** OP Bob confesses that $X_1$ and $X_2$ are independent and take strictly positive integer values in addition to being identically distributed. It would have saved everyone a lot of time if this had been stated up front. And what part of "Observe that the solution, if it exists, is not unique" does he not understand? (2012-01-18)
  • The reason I don't want to use _Mathematica_ is that I need an algorithm that a processor can implement to find the pmf of $X_1$ (or $X_2$)... (2012-01-18)
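The ambiguity discussed in the comments is easy to verify without a computer algebra system. The sketch below (Python, exact rationals) confirms that both candidate factors of $Z_Y(z) = 5/9 + 2/(9z) + 2z/9$, taking $k = 0$, reproduce the same $Z_Y$:

```python
from fractions import Fraction as F

def pmf_of_difference(pmf):
    # pmf of Y = X1 - X2 for i.i.d. X1, X2 with the given pmf (dict: value -> prob)
    y = {}
    for a, pa in pmf.items():
        for b, pb in pmf.items():
            y[a - b] = y.get(a - b, F(0)) + pa * pb
    return y

# Z_Y(z) = 5/9 + 2/(9 z) + 2 z / 9 as a pmf of Y
target = {-1: F(2, 9), 0: F(5, 9), 1: F(2, 9)}
# the two factorizations (with the shift k = 0): 2/3 + z/3 and 1/3 + 2z/3
cand1 = {0: F(2, 3), 1: F(1, 3)}
cand2 = {0: F(1, 3), 1: F(2, 3)}
assert pmf_of_difference(cand1) == target
assert pmf_of_difference(cand2) == target  # both pmfs are consistent with Y
```

So even after fixing the shift $k$, the data $Z_Y$ cannot distinguish $Z_X(z)$ from $Z_X(z^{-1})$.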
1

No. If $X_1$ and $X_2$ are identically equal to the same constant $c$, then $Y$ is identically zero no matter which $c$ was chosen, so the pmf of $Y$ carries no information about $c$.

  • This is a corner case. In my case, they are not identically equal to a constant. (2012-01-18)
  • @Bob Are $X_1$ and $X_2$ independent and identically distributed? Byron is saying that without additional information on the joint law of $(X_1, X_2)$ it is impossible to determine the pmf of $X_1$, and gives an example of why. (2012-01-18)
1

The answer below is no longer relevant since the OP now says that $X_1$ and $X_2$ take on strictly positive integer values and are also independent.


If you don't like Byron Schmuland's answer with degenerate random variables, here is an example with Bernoulli random variables showing that the information you have is insufficient to determine the identical marginal distributions of $X_1$ and $X_2$. More information about the joint distribution is needed.

Let $Y$ have pmf $p_Y(-1) = p_Y(+1) = 0.2$, $p_Y(0) = 0.6$. $X_1$ and $X_2$ are thus Bernoulli random variables. The two joint distributions shown below are consistent with $Y$. In each case $X_1$ and $X_2$ have the same marginal distribution, but that common marginal differs between the two cases:

$$p(0,1) = p(1,0) = 0.2; p(0,0) = p(1,1) = 0.3; \Rightarrow X_1, X_2 \sim \text{Bernoulli}(0.5)$$

$$p(0,1) = p(1,0) = 0.2; p(0,0) = 2p(1,1) = 0.4; \Rightarrow X_1, X_2 \sim \text{Bernoulli}(0.4)$$

In fact, for the given pmf of $Y$, we can choose to make $X_1, X_2 \sim \text{Bernoulli}(p)$ for any $p \in [0.2,0.8]$.
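This last claim can be verified directly. Fixing the off-diagonal masses $p(0,1) = p(1,0) = 0.2$ and setting $p(1,1) = p - 0.2$, $p(0,0) = 0.8 - p$ gives a valid joint distribution for every $p \in [0.2, 0.8]$, with both marginals Bernoulli$(p)$ and the stated pmf for $Y$. A sketch in Python (the parameterization is mine, chosen to match the two examples above):

```python
from fractions import Fraction as F

def diff_pmf_from_joint(joint):
    # joint: dict (x1, x2) -> prob; returns the pmf of Y = X1 - X2
    y = {}
    for (a, b), q in joint.items():
        y[a - b] = y.get(a - b, F(0)) + q
    return y

# target pmf of Y: P(Y=-1) = P(Y=1) = 0.2, P(Y=0) = 0.6
target = {-1: F(1, 5), 0: F(3, 5), 1: F(1, 5)}

# family of joints consistent with Y, parameterized by p = P(X_i = 1)
for p in [F(1, 5), F(2, 5), F(1, 2), F(4, 5)]:
    joint = {(0, 1): F(1, 5), (1, 0): F(1, 5),
             (1, 1): p - F(1, 5), (0, 0): F(4, 5) - p}
    assert all(q >= 0 for q in joint.values())      # valid joint for p in [0.2, 0.8]
    assert diff_pmf_from_joint(joint) == target     # consistent with Y
    assert joint[(1, 0)] + joint[(1, 1)] == p       # X1 ~ Bernoulli(p)
    assert joint[(0, 1)] + joint[(1, 1)] == p       # X2 ~ Bernoulli(p)
```

The cases $p = 0.5$ and $p = 0.4$ recover the two joints displayed above.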