
Suppose $X$ is a continuous, nonnegative random variable with distribution function $F$ and probability density function $f$. If for $a>0,\ E(X|X>a)=a+E(X)$, find the distribution $F$ of $X$.


3 Answers

-1

Yes, the exponential distribution is the only continuous distribution satisfying this property: the differential equation $F(a)+\mu F'(a)=1$ derived in the other answers, together with the boundary condition $F(0)=0$, has a unique solution. For a proof of this fact you can also have a look at the page on the exponential distribution at Statlect (the rate parameter and its interpretation - proof).
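As a quick sanity check (an editorial illustration, not part of the original answer), one can hand the differential equation $F(a)+\mu F'(a)=1$ with $F(0)=0$, which the other answers derive, to a computer algebra system; SymPy returns the unique solution $F(a)=1-e^{-a/\mu}$:

```python
import sympy as sp

a, mu = sp.symbols('a mu', positive=True)
F = sp.Function('F')

# ODE derived in the other answers: F(a) + mu*F'(a) = 1, with F(0) = 0
ode = sp.Eq(F(a) + mu * F(a).diff(a), 1)
sol = sp.dsolve(ode, F(a), ics={F(0): 0})

print(sol)  # the exponential CDF: F(a) = 1 - exp(-a/mu)
```

The initial condition pins down the single integration constant, which is the uniqueness claim above.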

5

About the necessary hypotheses (and in relation to a discussion somewhat buried in the comments to @bgins's answer), here is a solution which does not assume that the distribution of $X$ has a density, but only that $X$ is integrable and unbounded (otherwise, the identity in the post makes no sense).

A useful tool here is the complementary CDF (survival function) $G$ of $X$, defined by $G(a)=\mathrm P(X\gt a)$ for every $a\geqslant0$. Let $m=\mathrm E(X)$. The identity in the post is equivalent to $\mathrm E(X-a\mid X\gt a)=m$, which is itself equivalent to $\mathrm E((X-a)^+)=m\mathrm P(X\gt a)=mG(a)$. Note that $m\gt0$ by hypothesis. Now, for every $x$ and $a$, $$ (x-a)^+=\int_a^{+\infty}[x\gt z]\,\mathrm dz. $$ Integrating this with respect to the distribution of $X$ yields $$ \mathrm E((X-a)^+)=\int_a^{+\infty}\mathrm P(X\gt z)\,\mathrm dz, $$ hence, for every $a\gt0$, $$ mG(a)=\int_a^{+\infty}G(z)\,\mathrm dz. $$ This proves ${}^{(\ast)}$ that $G$ is infinitely differentiable on $(0,+\infty)$ and that $mG'(a)=-G(a)$ for every $a\gt0$. Since the derivative of the function $a\mapsto G(a)\mathrm e^{a/m}$ is zero on $a\gt0$ and $G$ is continuous from the right on $[0,+\infty)$, one gets $G(a)=G(0)\mathrm e^{-a/m}$ for every $a\geqslant0$.

Two cases would arise: either $G(0)=1$, and the distribution of $X$ is exponential with parameter $1/m$; or $G(0)\lt1$, and the distribution of $X$ is a barycenter of a Dirac mass at $0$ and an exponential distribution. Note, however, that $m=\mathrm E(X)=\int_0^{+\infty}G(z)\,\mathrm dz=mG(0)$, so under the stated hypothesis, where the constant in the identity is $\mathrm E(X)$ itself, one gets $G(0)=1$ and the exponential case always occurs; the mixture case can only arise when $\mathrm E(X-a\mid X\gt a)$ equals some constant other than $\mathrm E(X)$. In particular, if the distribution of $X$ is continuous, it is exponential.

${}^{(\ast)}$ By the usual seesaw technique: the RHS converges hence the RHS is a continuous function of $a$, hence the LHS is also a continuous function of $a$, hence the RHS integrates a continuous function of $a$, hence the RHS is a $C^1$ function of $a$, hence the LHS is also a $C^1$ function of $a$... and so on.
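To see the key identity $m\,G(a)=\int_a^{+\infty}G(z)\,\mathrm dz$ in action, here is a small numerical check (an editorial illustration, not part of the answer) for the exponential survival function $G(a)=\mathrm e^{-a/m}$, comparing the left-hand side with a midpoint-rule approximation of the tail integral:

```python
import math

m = 2.0                            # mean of the exponential distribution (arbitrary choice)
G = lambda z: math.exp(-z / m)     # survival function G(z) = P(X > z)

def tail_integral(a, upper=200.0, n=200_000):
    # composite midpoint rule for the integral of G over (a, upper);
    # the tail beyond `upper` is negligible (of order e^{-upper/m})
    h = (upper - a) / n
    return h * sum(G(a + (i + 0.5) * h) for i in range(n))

for a in (0.5, 1.0, 3.0):
    lhs = m * G(a)
    rhs = tail_integral(a)
    print(a, lhs, rhs)             # the two columns agree to ~1e-6
```

The agreement illustrates why $G(a)=\mathrm e^{-a/m}$ solves $mG'(a)=-G(a)$: the exponential tail integral reproduces the survival function up to the factor $m$.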

  • 0
    (+1) Nice. My hunch is that the restriction to *continuous* random variables was thrown in there to avoid the geometric distribution, though the memorylessness of the latter only strictly holds for $a \in \mathbb N$. (2012-04-29)
  • 0
    @cardinal (Thanks.) Hmmm... Yes, you are probably right about the motivation for the restriction. (2012-04-29)
3

Hopefully there is a more elegant solution, but let us write $\mu=\mathbb{E}[X]$ and start from the definition of conditional expectation, using the conditional density $f(x)/(1-F(a))$ of $X$ given $X>a$ (supported on $x>a$): $$ \eqalign{ \mathbb{E}[X\mid X>a] &= \int_a^{\infty}x\,\frac{f(x)}{1-F(a)}\,dx \\ \mu + a &= \int_{a}^{\infty}x\,\frac{f(x)}{1-F(a)}\,dx \\ \left(\mu+a\right)\left(1-F(a)\right) &= \int_{a}^{\infty}x\,f(x)\,dx \,. } $$ Differentiating with respect to $a$, we find $$ \eqalign{ 1-F(a)-\left(\mu+a\right)f(a) &= -a\,f(a) \\\\ 1-F(a)-\mu f(a) &= 0 \\\\ F(a) + \mu F\,'(a) &= 1 } $$ which is an ordinary differential equation, solvable by standard methods, e.g., by multiplying by an integrating factor:

$$ \eqalign{ F(x) + \mu F\,'(x) &= 1 \qquad\text{for}\qquad x\ge0 \\\\ F\,e^{x/\mu} + \mu F\,'\,e^{x/\mu} &= e^{x/\mu} \\\\ \left( \mu\,F\,e^{x/\mu} \right)' &= e^{x/\mu} \\\\ \mu\,F(x)\,e^{x/\mu} &= \int e^{x/\mu}\,dx = \mu \, e^{x/\mu} + c \\\\ \mu\,F(x) &= \mu + c \, e^{-x/\mu} } $$ At $x=0$, since $X$ is continuous and nonnegative, it must be the case that $F(0)=0$, from which it follows that $c=\mu F(0)-\mu=-\mu$, giving us the CDF $$ F(x) = 1 - e^{-x/\mu} = 1 - e^{-\lambda x} $$ and the exponential density $$ f(x) = \frac1\mu\,e^{-x/\mu} = \lambda \, e^{-\lambda x} $$ where the mean $\mu$ and the (decay) rate parameter $\lambda$ are reciprocally related, i.e., $\lambda\mu=1$.

EDIT: There is indeed now a more elegant solution, thanks to Didier.
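For readers who want a quick empirical confirmation (an editorial addition, using simulated draws rather than the analytic argument above): sampling from an exponential distribution with mean $\mu$ and conditioning on $X>a$ should reproduce $\mathbb E[X\mid X>a]\approx a+\mu$, the memorylessness property the problem encodes.

```python
import random

random.seed(42)
mu = 1.5                                           # chosen mean; rate lambda = 1/mu
sample = [random.expovariate(1 / mu) for _ in range(1_000_000)]

for a in (0.5, 1.0, 2.0):
    tail = [x for x in sample if x > a]            # condition on the event {X > a}
    cond_mean = sum(tail) / len(tail)
    # memorylessness: E[X | X > a] should be close to a + mu
    print(a, cond_mean, a + mu)
```

With a million draws, the conditional means agree with $a+\mu$ to within a few thousandths for each tested $a$.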

  • 1
    One thing that I can't figure out: the question does not require $f$ to be continuous or anything in particular, so why is the derivative of $\int_a^\infty x f(x)\,dx$ equal to $-a\,f(a)$? (2012-04-26)
  • 0
    @Ivan: I think it still follows from the [FTOC](http://en.wikipedia.org/wiki/Fundamental_theorem_of_calculus#First_part). [Formally](http://en.wikipedia.org/wiki/Probability_density_function#Formal_definition), a [continuous random variable](http://en.wikipedia.org/wiki/Continuous_random_variable#Continuous_probability_distribution) $X$ must have absolutely continuous **CDF** $F$, of which its density $f$ is the [Radon–Nikodym derivative](http://en.wikipedia.org/wiki/Radon%E2%80%93Nikodym_derivative#Radon.E2.80.93Nikodym_derivative) with respect to Lebesgue measure $\lambda$. (2012-04-26)
  • 0
    Thank you for replying; I'd love a clarification for my own sake. Correct me if I'm wrong, but I thought that the FTOC can only guarantee that the derivative exists almost everywhere. This implies that the first-order differential equation is only satisfied by $F(x)$ almost everywhere, and therefore $F$ is not uniquely given by the exponential solution any more. (2012-04-26)
  • 0
    @Ivan, yours is a [good question](http://en.wikipedia.org/wiki/Probability_density_function#Further_details). The assumptions given, that $X$ is a continuous RV with distribution $F$, density $f$ and finite expectation, are probably sufficiently strong to warrant the result. In particular, $f$ is determined up to a set of Lebesgue measure zero. Therefore, I think it's safe to say that $F$ is completely determined, and this is what we are asked for. I invoked the FTOC to justify the manipulation with the lower endpoint. There may well be a more rigorous and elegant way to show this. (2012-04-26)
  • 0
    I don't think $f$ is uniquely determined. It cannot be derived from the differential equation $F(a)+\mu F'(a)=1$, since this equation may fail on a set of measure zero, and I don't know how to solve such equations. I agree that the exponential distribution is a solution to the above problem; I have a feeling that it might not be unique, and I'd like an argument for or against that feeling. (2012-04-27)
  • 0
    @Ivan: We are given that $f=F'$ exists (so $F$ is [nonsingular](http://math.stackexchange.com/questions/49544/singular-distribution)) in the problem, which, together with the other hypotheses, is strong enough for the solution presented. There may very well be a more general statement of the problem (under weaker assumptions), but I don't know it. I improvised the above without consulting the literature. Try Feller volume 2, or a more recent text such as Shiryayev or Durrett, for a modern, general treatment. Sorry I'm not more help, and +1 for a really good question. (2012-04-27)
  • 0
    @Ivan: See [my answer](http://math.stackexchange.com/a/138370/6179) to your (quite relevant) question. (2012-04-29)