Let $x_1, x_2, \ldots ,x_n$ be positive real numbers, and let $$ S = x_1 + x_2 + \cdots + x_n.$$
Prove that $$ (1+x_1)(1+x_2)\cdots(1+x_n) \le 1 + S + \frac{S^2}{2!} + \cdots + \frac{S^n}{n!}$$
Here's my attempt:
Case 1 (all terms equal): take $n = 2$ and let $x_1 = x_2 = x$.
$LHS = (1+x)(1+x) = 1+2x+x^2$
$RHS = 1+\frac{x+x}{1!}+\frac{(x+x)^2}{2!}=1+2x+\color{blue}{2}x^2$
Hence when all terms are equal, $LHS \lt RHS$.

Case 2 (unequal terms): $x_1 \ne x_2$.

$LHS = (1+x_1)(1+x_2) = 1 + (x_1+x_2) + x_1 x_2$
$RHS = 1 + (x_1+x_2) + \frac{(x_1+x_2)^2}{2!} = 1 + (x_1+x_2) + x_1 x_2 + \color{blue}{\left(\frac{x_1^2+x_2^2}{2}\right)}$

Hence when even one term differs, $LHS \lt RHS$ as well. From the two cases it's clear that in neither situation is $LHS \gt RHS$, which gives a (partial) proof.

Is there any way I can improve my proof? Note that I've only considered two terms; my intuition tells me the argument extends to all $n$, but how do I state that mathematically? Or can you think of a better or more rigorous proof than this? Thanks again!
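As a quick numerical sanity check (not a proof, and not part of my argument above), here is a small sketch that tests the inequality on random positive reals for several values of $n$; the helper names `lhs` and `rhs` are my own:

```python
import math
import random

def lhs(xs):
    """Product (1+x_1)(1+x_2)...(1+x_n)."""
    prod = 1.0
    for x in xs:
        prod *= 1 + x
    return prod

def rhs(xs):
    """Truncated exponential series 1 + S + S^2/2! + ... + S^n/n! with S = sum(xs)."""
    S = sum(xs)
    return sum(S**k / math.factorial(k) for k in range(len(xs) + 1))

random.seed(0)
for _ in range(1000):
    n = random.randint(1, 6)
    xs = [random.uniform(0.01, 10.0) for _ in range(n)]
    assert lhs(xs) <= rhs(xs), xs
```

Note that for $n = 1$ the two sides coincide ($1 + x_1 = 1 + S$), so the check uses `<=` rather than strict inequality.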