
I want to construct independent non-negative random variables $X_1,X_2,X_3$ and $X_4$ such that $\mathbb{E}(X_n)=n$ and then maximize the probability: $$\mathbb{P}\left(X_1+X_2+X_3+X_4 \geq 11\right)$$

By Markov's inequality this probability is at most $\frac{10}{11}$, but I do not know how to actually maximize it.

  • 1
Hmm, the principle of maximum entropy (or possibly minimum Fisher information) comes to mind, since you want to find pdfs under expectation constraints, which can be incorporated using Lagrange multipliers, but I don't have enough experience with this method to apply it here. (2017-01-27)
  • 0
Are your $X_n$ discrete or continuous? (2017-01-27)
  • 0
If we take $X_n$ as being exponential with mean $n$, extensive simulations give $\mathbb{P}\left(X_1+X_2+X_3+X_4 \geq 11\right) \approx 0.352$. What kind of distribution(s) can do better? (2017-01-27)
  • 0
I wonder what the motivation of your question is, in particular why 11, because 10 (which is the mean) would be more natural... (2017-01-27)
  • 0
@JeanMarie The mean is more natural, and this is why I find this more interesting: I find it hard to maximize this while respecting the expectation constraint. I would also be interested in a more general approach for any number larger than 10. (2017-01-27)
  • 0
Looking for one unknown pdf with a certain constraint is already difficult, but looking for four of them is a tremendous task... (2017-01-27)
  • 0
After some testing I found that, for $X_n$ Poisson distributed, the probability I was after is about 0.42. (2017-01-27)
  • 0
I simulated it and I got around 0.82. For weaker results, you can pick specific distributions that give around 0.64 with a simple proof. (2017-01-27)
  • 0
@JeanMarie Assuming the $X_i$ independent ties your hands behind your back (positively correlated RVs will generally have bigger tail probabilities for the sum). See my answer below. (2017-01-28)
  • 1
It's somewhat relevant to "Feige's Conjecture". (2017-01-28)
  • 0
So to clarify: these variables are independent and can be discrete, continuous, or a mixture, of course. (2017-01-28)
  • 0
@Jan For the future, I think the best procedure when a question has been answered and you realize you wanted to ask a different one is to ask a new question (and you can link to it from here). Mine was getting downvoted, so I deleted my response. (2017-01-28)
  • 1
@Axolotl Yes! Assuming Feige's conjecture, the best you can do for this problem is $1-1/e \approx 0.632$ (so I don't know how you're getting 0.82). (2017-01-28)
  • 0
@spaceisdarkgreen My apologies then, I will do so next time! Thank you for your answer though :) I think that conjecture might then be wrong. If I take $$X_n=\left\{ \begin{array}{ll} \frac{11}{10}n & \text{with probability } \frac{10}{11} \\ 0 & \text{otherwise} \end{array} \right.$$ we get $$\mathbb{P}\left(X_1+X_2+X_3+X_4 \geq 11\right) = \left(\frac{10}{11}\right)^4 \approx 0.683 > 0.632.$$ (2017-01-28)
  • 0
Ahh, my mistake, Jan, you're right. I misinterpreted Feige's conjecture. (I thought we could make mean-$1$ RVs by taking $X_2-1$, $X_3-2$, etc., and we can, but they won't be nonnegative as Feige requires them to be.) So, as @Axolotl originally said, they're only related, not the same thing. (2017-01-28)
  • 0
@spaceisdarkgreen Via MATLAB simulation. (2017-01-29)
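The simulation experiments mentioned in the comments can be sketched as follows (a minimal Monte Carlo check in Python, not any commenter's actual code; the two-point construction is the one from Jan's comment above, whose exact tail is $(10/11)^4 \approx 0.683$):

```python
import random

def tail_prob(sample_sum, threshold=11.0, trials=200_000):
    """Monte Carlo estimate of P(X1 + X2 + X3 + X4 >= threshold)."""
    hits = sum(sample_sum() >= threshold for _ in range(trials))
    return hits / trials

# Exponential X_n with mean n (expovariate takes the rate 1/n).
def exp_sum():
    return sum(random.expovariate(1.0 / n) for n in range(1, 5))

# Two-point X_n: 11n/10 with probability 10/11, else 0, so E[X_n] = n.
def two_point_sum():
    return sum(n * 11 / 10 if random.random() < 10 / 11 else 0.0
               for n in range(1, 5))

print(tail_prob(exp_sum))        # roughly 0.35
print(tail_prob(two_point_sum))  # roughly 0.68
```

This reproduces the $\approx 0.352$ figure for exponentials and the $\approx 0.683$ figure for the two-point construction to within Monte Carlo error.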

1 Answer

Thanks to Axolotl's comment, we see this is related to Feige's Conjecture. In Feige's paper he references a much older conjecture (Conjecture 2), due to Samuels, that would answer your problem. It says:

Let $X_1,\ldots,X_n$ be independent non-negative random variables with means $\mu_1\le\mu_2\le\ldots\le\mu_n.$ Then for every $\lambda>\sum_k\mu_k$ there is some $i$ with $1\le i\le n$ such that $P(\sum_k X_k\ge \lambda)$ is maximized when the $X_j$ are distributed as follows: 1) for $j<i$, $X_j=\mu_j$ with probability one; 2) for $j\ge i$, $X_j$ takes the value $\lambda-\sum_{k<i}\mu_k$ with probability $\mu_j\big/\left(\lambda-\sum_{k<i}\mu_k\right)$ and the value $0$ otherwise.

Fortunately, he also notes that Samuels proved this for $n\le 4$ in these two papers. So it appears you can get the answer by finding the best value of $i$ above, though the proof might take a bit to work through.
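Taking the conjectured extremal form at face value, the candidate tail probabilities for this problem are easy to evaluate exactly; here is a minimal sketch (the function name and the convention that `frozen` counts the variables held constant at their means are mine):

```python
from math import prod

def samuels_candidate(mus, lam, frozen):
    """Tail P(sum >= lam) when the `frozen` smallest-mean variables are
    constant at their means and each remaining X_j is two-point on
    {0, t} with t = lam - (sum of frozen means), P(X_j = t) = mu_j / t.
    The sum reaches lam exactly when at least one two-point variable is
    nonzero, so the tail is 1 - prod_j (1 - mu_j / t)."""
    t = lam - sum(mus[:frozen])
    return 1 - prod(1 - mu / t for mu in mus[frozen:])

mus, lam = [1, 2, 3, 4], 11
for frozen in range(4):
    print(frozen, samuels_candidate(mus, lam, frozen))
```

Freezing $X_1,X_2,X_3$ at their means and letting $X_4$ be $5$ with probability $4/5$ wins, giving the tail probability $4/5$, in agreement with the comments below.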

  • 0
Thank you for your answer; this means that for my problem we will have a maximum probability of $\frac{4}{5}$. (2017-01-28)
  • 0
@Jan, yep! That's what I get too. (2017-01-28)
  • 0
@Jan Thanks to both of you; $\frac{4}{5}$ is exactly what I obtained via simulations... (2017-01-29)
  • 0
@spaceisdarkgreen Can you find an example in which $i < n$ is optimal? (2017-01-30)
  • 0
@Axolotl Yes, just take $\lambda$ very large; then it starts to make more sense to give each variable a shot. For instance, if we look at $P(X_1+X_2+X_3+X_4\ge 30)$, then letting the first three be constant and $X_4 = 24$ w.p. $4/24$ (and $0$ otherwise) gives $1/6$. However, if we let each of $X_1,X_2,X_3,X_4 = 30$ with probabilities $1/30, 2/30, 3/30, 4/30$ (and zero otherwise), the probability the sum is $\ge 30$ is $1-(29/30)(28/30)(27/30)(26/30)\approx 0.3$. (2017-01-30)
  • 0
Yeah, thanks. But could it be something less extreme? Conjecture 1 of Feige's paper seems to claim that something like that is impossible. (2017-01-30)
  • 0
@Axolotl Yeah, I agree that's what the conjecture seems to be saying, but notice that there the expectations all need to be less than one. For this example, it's actually not true! For a range of about $\lambda = 12.76$–$12.79$, $i=2$ is the best: https://www.wolframalpha.com/input/?i=plot+%7B1-(x-10)%2F(x-6),1-(x-7)(x-6)%2F(x-3)%5E2,1-(x-5)(x-4)(x-3)%2F(x-1)%5E3,1-(x-4)(x-3)(x-2)(x-1)%2Fx%5E4%7D+x+%3D+10+to+30 and https://www.wolframalpha.com/input/?i=plot+%7B1-(x-10)%2F(x-6),1-(x-7)(x-6)%2F(x-3)%5E2,1-(x-5)(x-4)(x-3)%2F(x-1)%5E3,1-(x-4)(x-3)(x-2)(x-1)%2Fx%5E4%7D+x+%3D+12.75+to+12.81 (2017-01-30)
  • 0
@spaceisdarkgreen Wow. Thanks a lot... (2017-01-31)
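The crossover discussed in the last few comments can be reproduced by transcribing the four closed-form candidates from the linked plots (a sketch; each curve corresponds to freezing a different initial set of variables at their means):

```python
# The four candidate tail probabilities, as functions of the threshold x,
# transcribed from the Wolfram Alpha links above.
candidates = [
    lambda x: 1 - (x - 10) / (x - 6),                              # freeze X1, X2, X3
    lambda x: 1 - (x - 7) * (x - 6) / (x - 3) ** 2,                # freeze X1, X2
    lambda x: 1 - (x - 5) * (x - 4) * (x - 3) / (x - 1) ** 3,      # freeze X1
    lambda x: 1 - (x - 4) * (x - 3) * (x - 2) * (x - 1) / x ** 4,  # freeze none
]

for x in (11.0, 12.78, 20.0):
    vals = [f(x) for f in candidates]
    print(x, [round(v, 4) for v in vals], "best:", vals.index(max(vals)))
```

At $x=11$ the first curve wins (the $4/5$ above), for large $x$ the all-two-point curve wins, and in a narrow band near $x\approx 12.78$ an intermediate curve is briefly best.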