
$ \int_0^{\infty } \frac{\log (x)}{e^x+1} \, dx = -\frac{1}{2} \log ^2(2) $

Does anyone have an idea of how to prove this?
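As a sanity check before looking for a proof, the identity can be verified numerically. Here is a small sketch (my own addition, not part of the original question) using Python's mpmath library; the precision setting and quadrature call are arbitrary choices:

```python
from mpmath import mp, quad, log, exp, inf

mp.dps = 30  # work with 30 significant digits

# Left-hand side: numerical value of the integral.
# tanh-sinh quadrature copes with the log singularity at 0 and the decay at infinity.
lhs = quad(lambda x: log(x) / (exp(x) + 1), [0, inf])

# Right-hand side: the claimed closed form -log(2)^2 / 2.
rhs = -log(2)**2 / 2

print(lhs)             # -0.2402265069...
print(rhs)             # -0.2402265069...
print(abs(lhs - rhs))  # should be tiny
```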

  • FYI, I picked up the problem on another forum; it is part of the solution of another problem: http://www.mymathforum.com/viewtopic.php?f=22&t=26277 (2012-01-19)

2 Answers


Start with $J(s)$ given by
$ J(s) = \int_0^\infty \frac{x^s}{1+e^x}\,dx. $
Expand the denominator using the geometric series $\frac{1}{1+e^x}=\sum_{k\geq0}(-1)^k e^{-(1+k)x}$:
$ J(s) = \sum_{k\geq0}(-1)^k\int_0^\infty x^s e^{-(1+k)x}\,dx = \sum_{k\geq1} \frac{(-1)^{k+1}}{k^{s+1}} \int_0^\infty x^s e^{-x}\,dx, $
where the second equality follows from the substitution $(1+k)x\mapsto x$ and the shift $k+1\to k$. Now, the sum is the Dirichlet eta function, related to the Riemann zeta function by
$ \sum_{k\geq1}\frac{(-1)^{k+1}}{k^{s+1}} = (1-2^{-s})\zeta(s+1), $
and the integral is $\Gamma(1+s)$. Thus
$ J(s) = (1-2^{-s})\zeta(1+s)\,\Gamma(1+s). $
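As a quick numerical check of this closed form (a sketch with mpmath, not part of the original answer; the helper names `J_integral` and `J_closed` are my own), one can compare the integral and the product $(1-2^{-s})\zeta(1+s)\Gamma(1+s)$ for a few sample values of $s$:

```python
from mpmath import mp, quad, exp, zeta, gamma, inf, mpf

mp.dps = 25

def J_integral(s):
    """J(s) evaluated directly as an integral."""
    return quad(lambda x: x**s / (1 + exp(x)), [0, inf])

def J_closed(s):
    """The closed form (1 - 2^(-s)) * zeta(1+s) * Gamma(1+s)."""
    return (1 - mpf(2)**(-s)) * zeta(1 + s) * gamma(1 + s)

for s in (mpf(1)/2, mpf(1), mpf(2)):
    # The two columns should agree to working precision.
    print(s, J_integral(s), J_closed(s))
```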

To find the derivative at $s=0$ we need the Laurent series of each factor at $s=0$ ($\zeta(1+s)$ is singular at $s=0$, but $1-2^{-s}$ has a simple zero there, so $J$ is regular). They are
$ (1-2^{-s})\zeta(1+s) = \log 2 + \left(\gamma \log 2 - \frac{(\log 2)^2}{2}\right)s + O(s^2), $
$ \Gamma(1+s) = 1 - \gamma s + O(s^2), $
where $\gamma$ is Euler's constant. Multiplying the two series and taking the coefficient of $s$, we get
$ \frac{dJ}{ds}(0) = -\frac12 (\log 2)^2, $
which is the integral you were looking for, since differentiating under the integral sign gives $\frac{dJ}{ds}(0)=\int_0^\infty \frac{\log x}{1+e^x}\,dx$.
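This derivative can also be cross-checked numerically (again a sketch with mpmath, my own addition; the step size $h$ is an arbitrary choice). Since $\zeta(1+s)$ is singular at $s=0$, the closed form is only evaluated at $s=\pm h$ and a central difference is taken:

```python
from mpmath import mp, zeta, gamma, log, mpf

mp.dps = 40

def J_closed(s):
    # (1 - 2^(-s)) * zeta(1+s) * Gamma(1+s); regular at s = 0,
    # but the factors individually are not, so keep s != 0.
    return (1 - mpf(2)**(-s)) * zeta(1 + s) * gamma(1 + s)

h = mpf('1e-8')
derivative = (J_closed(h) - J_closed(-h)) / (2*h)  # central difference, error ~ h^2

print(derivative)      # approx -0.2402265069...
print(-log(2)**2 / 2)  # -0.2402265069...
```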

  • Thanks. I had to look up the $\eta$, Gamma, and Riemann zeta functions and their expansions on MathWorld. The series expansion $\zeta(1+s)=1/s+\gamma+O(s)$ is standard, and $1-2^{-s} = (\log 2)s - \frac12(\log2)^2s^2 + O(s^3)$ is just a Taylor series. (2012-01-19)

By the recursive relation $\Gamma(x+1)=x\Gamma(x)$, we get
$ \log(\Gamma(x))=\log(\Gamma(n+x))-\log(x)-\log(x+1)-\log(x+2)-\dots-\log(x+n-1)\tag{1} $
Differentiating $(1)$ with respect to $x$, evaluating at $x=1$, and letting $n\to\infty$ yields
$ \begin{align} \frac{\Gamma'(1)}{\Gamma(1)}&=\log(n)+O\left(\frac1n\right)-\frac11-\frac12-\frac13-\dots-\frac1n\\ &\to-\gamma\tag{2} \end{align} $

Next, apply $(2)$ to the following:
$ \begin{align} \int_0^\infty\log(t)\;e^{-t}\;\mathrm{d}t &=\left.\frac{\mathrm{d}}{\mathrm{d}x}\int_0^\infty t^x\;e^{-t}\;\mathrm{d}t\right]_{x=0}\\ &=\Gamma'(1)\\ &=-\gamma\tag{3} \end{align} $
Then, a simple change of variables ($t\mapsto t/n$) yields
$ \int_0^\infty\log(t)\;e^{-nt}\;\mathrm{d}t=-\frac{\gamma+\log(n)}{n}\tag{4} $

Since $\dfrac{1}{e^t+1}=e^{-t}-e^{-2t}+e^{-3t}-e^{-4t}+\dots$, applying $(4)$ termwise gives
$ \begin{align} \int_0^\infty\frac{\log(t)}{e^t+1}\mathrm{d}t &=\int_0^\infty\sum_{n=1}^\infty(-1)^{n-1}\log(t)\;e^{-nt}\;\mathrm{d}t\\ &=\sum_{n=1}^\infty(-1)^n\frac{\gamma+\log(n)}{n}\\ &=-\frac12\log(2)^2\tag{5} \end{align} $
The last equality in $(5)$ follows from $\sum\limits_{n=1}^\infty\frac{(-1)^{n-1}}{n}=\log(2)$ and $\sum\limits_{n=1}^\infty(-1)^n\frac{\log(n)}{n}=\eta'(1)=\gamma\log(2)-\frac12\log(2)^2$, where $\eta$ is the Dirichlet eta function.
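As a small numerical illustration of $(4)$ (my own addition, assuming mpmath is available; not part of the original argument), one can compare the quadrature with the closed form for a few values of $n$:

```python
from mpmath import mp, quad, log, exp, euler, inf

mp.dps = 25

# Check (4): integral of log(t) * e^(-n t) over (0, inf) equals -(gamma + log n)/n.
for n in (1, 2, 3):
    numeric = quad(lambda t, n=n: log(t) * exp(-n*t), [0, inf])
    closed  = -(euler + log(n)) / n
    print(n, numeric, closed)  # the two values should agree for each n
```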


More about $\mathbf{(2)}$:

The fact that $\frac{\mathrm{d}}{\mathrm{d}x}\log(\Gamma(x))=\log(x)+O\left(\frac1x\right)$ relies on the log-convexity of $\Gamma(x)$; that is, $\frac{\mathrm{d}}{\mathrm{d}x}\log(\Gamma(x))$ is monotonically increasing. By the recursive relation for $\Gamma(x)$, we have that
$ \log(\Gamma(x))-\log(\Gamma(x-1))=\log(x-1)\tag{6} $
and that
$ \log(\Gamma(x+1))-\log(\Gamma(x))=\log(x)\tag{7} $
The Mean Value Theorem and $(6)$ imply that $\left.\frac{\mathrm{d}}{\mathrm{d}x}\log(\Gamma(x))\right|_{x=\xi_1}=\log(x-1)$ for some $\xi_1\in(x-1,x)$.

The Mean Value Theorem and $(7)$ imply that $\left.\frac{\mathrm{d}}{\mathrm{d}x}\log(\Gamma(x))\right|_{x=\xi_2}=\log(x)$ for some $\xi_2\in(x,x+1)$.

By the monotonicity of $\frac{\mathrm{d}}{\mathrm{d}x}\log(\Gamma(x))$, we get that
$ \log(x-1)\le\frac{\mathrm{d}}{\mathrm{d}x}\log(\Gamma(x))\le\log(x)\tag{8} $
Since $\log(x)-\log(x-1)=O\left(\frac1x\right)$, $(8)$ implies that
$ \frac{\mathrm{d}}{\mathrm{d}x}\log(\Gamma(x))=\log(x)+O\left(\frac1x\right)\tag{9} $
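A quick numerical spot-check of the bounds in $(8)$ (a sketch using mpmath's `digamma`, which is exactly $\frac{\mathrm{d}}{\mathrm{d}x}\log(\Gamma(x))$; my own addition):

```python
from mpmath import mp, digamma, log

mp.dps = 20

# Spot-check (8): log(x-1) <= digamma(x) <= log(x) for several x > 1.
for x in (2, 5, 10, 100):
    print(x, log(x - 1) <= digamma(x) <= log(x))  # True for each x
```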


Log-Convexity of $\mathbf{\Gamma(x)}$:

If $\frac{\mathrm{d}^2}{\mathrm{d}x^2}f(x)\ge0$, then $f$ is convex at $x$. Thus, if $\dfrac{f(x)f''(x)-f'(x)^2}{f(x)^2}=\frac{\mathrm{d}^2}{\mathrm{d}x^2}\log(f(x))\ge0$, then $f$ is log-convex. So we need to show that $\Gamma(x)\Gamma''(x)\ge\Gamma'(x)^2$; that is,
$ \int_0^\infty t^{x-1}\;e^{-t}\;\mathrm{d}t \int_0^\infty\log(t)^2\;t^{x-1}\;e^{-t}\;\mathrm{d}t \ge \left(\int_0^\infty\log(t)\;t^{x-1}\;e^{-t}\;\mathrm{d}t\right)^2\tag{10} $
Dividing both sides of $(10)$ by $\left(\int_0^\infty t^{x-1}\;e^{-t}\;\mathrm{d}t\right)^2$, $(10)$ becomes
$ \int\log(t)^2\;\mathrm{d}\mu \ge \left(\int\log(t)\;\mathrm{d}\mu\right)^2\tag{11} $
where $\mathrm{d}\mu=\dfrac{t^{x-1}\;e^{-t}\;\mathrm{d}t}{\int_0^\infty t^{x-1}\;e^{-t}\;\mathrm{d}t}$ is a unit measure on $[0,\infty)$. Thus, $(11)$ is simply Jensen's inequality.

Strictly speaking:

Note that
$ \log(t)^2 + a^2 \ge 2a\log(t)\tag{12} $
with equality if and only if $\log(t)=a$. Integrating $(12)$ with respect to the unit measure $\mathrm{d}\mu$ yields
$ \int\log(t)^2\;\mathrm{d}\mu + a^2 \ge 2a\int\log(t)\;\mathrm{d}\mu\tag{13} $
with equality in $(13)$ if and only if $\log(t)=a$ a.e. $\mathrm{d}\mu$. Let $a=\int\log(t)\;\mathrm{d}\mu$; then $(13)$ becomes
$ \int\log(t)^2\;\mathrm{d}\mu \ge \left(\int\log(t)\;\mathrm{d}\mu\right)^2\tag{14} $
with equality if and only if $\log(t)$ is constant a.e. $\mathrm{d}\mu$. Since the $\mathrm{d}\mu$ in $(11)$ is absolutely continuous and $\log(t)$ is strictly increasing on $(0,\infty)$, the inequality in $(11)$ is strict. Therefore, $\Gamma$ is strictly log-convex.
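As a numerical spot-check of the strict inequality $\Gamma(x)\Gamma''(x)>\Gamma'(x)^2$ (a sketch using mpmath's numerical differentiation; my own addition, not part of the original argument):

```python
from mpmath import mp, gamma, diff

mp.dps = 20

# Spot-check (10) with strict inequality: Gamma(x) * Gamma''(x) > Gamma'(x)^2.
for x in (0.5, 1, 2, 5):
    g  = gamma(x)
    g1 = diff(gamma, x)      # Gamma'(x)
    g2 = diff(gamma, x, 2)   # Gamma''(x)
    print(x, g * g2 > g1**2)  # True for each x
```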

  • @Zev: I think that Peter had intended his comment to be for [Kirill's answer](http://math.stackexchange.com/a/100588/). However, it is interesting to have a comment to my answer posted before I answered :-) (2012-01-25)