2

Let $\{\alpha_n\}_{n=1}^\infty$ be a sequence of positive real numbers such that $\sum_{n=1}^\infty \alpha_n<\infty$. In particular, $\lim_n \alpha_n=0$.

Let $\{\varepsilon_n\}_{n=1}^{\infty}\subset (0,1)$ be a decreasing sequence with $\lim_n \varepsilon_n = 0$. For $x\geq 0$, define the function

$$f(x) := \sum_{n=1}^{\infty} \alpha_n x^{\varepsilon_n}.$$

The series above converges uniformly on compact subsets of $[0,\infty)$.

Consider the integral on an interval $[a,b]\subset [0,\infty)$:

$$\int_a^b f(x)^\beta dx = \int_a^b \left( \sum_{n=1}^{\infty} \alpha_n x^{\varepsilon_n} \right)^\beta dx.$$

I claim that the above integral converges for all $\beta\in \mathbb{R}$.

1) If $\beta\geq 0$, then $f^\beta$ is continuous and hence integrable on the compact interval $[a,b]$.

2) If $\beta:= -\gamma$ with $\gamma>0$, then, since the sum of positive terms dominates each single term, we may use the estimate $$\left( \sum_{n=1}^{\infty} \alpha_n x^{\varepsilon_n}\right)^{-\gamma} \leq (\alpha_{n_0})^{-\gamma} x^{-\gamma \varepsilon_{n_0}}$$ for any $n_0\geq 1$. Hence, $$\int_a^b \left( \sum_{n=1}^{\infty} \alpha_n x^{\varepsilon_n} \right)^{-\gamma} dx\leq \alpha_{n_0}^{-\gamma} \int_a^b x^{-\gamma \varepsilon_{n_0}} dx = \alpha_{n_0}^{-\gamma} \frac{b^{1-\gamma \varepsilon_{n_0}}- a^{1-\gamma \varepsilon_{n_0}}}{1-\gamma \varepsilon_{n_0}},$$ where we choose $n_0\geq 1$ so that $1-\gamma \varepsilon_{n_0}>0$.

Finally, taking the limit of the quotient as $n_0\to\infty$ (so that $\varepsilon_{n_0}\to 0$), we have

$$\int_a^b \left( \sum_{n=1}^{\infty} \alpha_n x^{\varepsilon_n} \right)^{-\gamma} dx\leq \alpha_{n_0}^{-\gamma} (b- a).$$
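As a sanity check, here is a minimal numerical sketch of the estimate above. The concrete sequences $\alpha_n = 2^{-n}$ and $\varepsilon_n = 1/n$ are hypothetical choices for illustration only (any summable positive $\alpha_n$ and decreasing $\varepsilon_n \to 0$ would do); they are not fixed by the question.

```python
# Sanity check of the displayed bound, with hypothetical choices
# alpha_n = 2**-n (summable) and eps_n = 1/n (decreasing to 0);
# these particular sequences are NOT fixed by the question.
alpha = lambda n: 2.0 ** -n
eps = lambda n: 1.0 / n
N = 50  # truncation level for the series defining f

def f(x):
    return sum(alpha(n) * x ** eps(n) for n in range(1, N + 1))

gamma = 2.0
n0 = 3          # 1 - gamma * eps(n0) = 1 - 2/3 > 0, as required
a, b = 0.0, 1.0

# Midpoint-rule approximation of the left-hand side ...
M = 10_000
h = (b - a) / M
lhs = sum(f(a + (k + 0.5) * h) ** (-gamma) * h for k in range(M))

# ... and the exact right-hand side of the estimate.
p = 1.0 - gamma * eps(n0)
rhs = alpha(n0) ** (-gamma) * (b ** p - a ** p) / p

print(lhs, rhs)  # lhs should not exceed rhs
assert lhs <= rhs
```

Note that the integrand blows up at $0$, yet the midpoint sum stays below the closed-form bound, matching the estimate.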

Are all these arguments correct, and is it true that this integral always converges? Of course, as $n_0$ grows, $\alpha_{n_0}^{-\gamma}$ gets larger, but $n_0$ is fixed.

Thanks a lot for your help! :)

  • 0
There is trouble with $x^{\epsilon_n}$ if $x<0.$ (2017-02-19)
  • 0
Sorry, $x$ should be positive; actually $a>0$ in the integral. Thanks for the observation, I will edit that. Apart from that, does the argument seem correct? (2017-02-20)
  • 1
I would think $a=0$ would be the case of interest here. After all, $f$ is positive and continuous on any $[a,b]$ with $a>0,$ hence so is $f^\beta$ for every $\beta \in \mathbb R.$ The convergence of $\int_a^bf^\beta$ is then trivial. (2017-02-20)

1 Answer

0

I think you've done a good job with the main ideas; everything seems essentially correct. I would change a few things and add a few others.

Since $x^{\epsilon_n}$ is problematic if $x<0,$ I would restrict the domain to $[0,\infty).$ Then we can say that for $x\in [0,b]$ and $n\in \mathbb N,$

$$|\alpha_nx^{\epsilon_n}| = \alpha_nx^{\epsilon_n}\le \alpha_n+ \alpha_nx^{\epsilon_1} \le \alpha_n+ \alpha_nb^{\epsilon_1}=\alpha_n(1+b^{\epsilon_1}),$$ since $x^{\epsilon_n}\le 1$ when $x\le 1,$ while $x^{\epsilon_n}\le x^{\epsilon_1}$ when $x\ge 1.$

Because $\sum_n \alpha_n <\infty,$ the series defining $f$ converges uniformly on $[0,b]$ by the Weierstrass M test. Each summand $\alpha_nx^{\epsilon_n}$ is continuous on $[0,b],$ and therefore so is $f.$ Because $b$ is arbitrary, $f$ is continuous on $[0,\infty).$ Note that $f(0)=0$ and $f>0$ on $(0,\infty).$
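If it helps intuition, here is a small numerical sketch of this M-test bound, again with the hypothetical sequences $\alpha_n = 2^{-n}$ and $\epsilon_n = 1/n$ (not fixed by the discussion): the uniform distance between $f$ and a partial sum is controlled by the tail of $\sum_n M_n$.

```python
# Weierstrass M-test sketch: on [0, b] each summand alpha_n * x**eps_n
# is bounded by M_n = alpha_n * (1 + b**eps_1), and sum M_n < infinity.
# Hypothetical choices (not from the answer): alpha_n = 2**-n, eps_n = 1/n.
alpha = lambda n: 2.0 ** -n
eps = lambda n: 1.0 / n
b = 5.0
N = 30     # compare f to its N-th partial sum
TOP = 200  # proxy for "infinity" in the tail

# Tail of sum M_n bounds the uniform distance between f and the partial sum.
tail_bound = sum(alpha(n) * (1 + b ** eps(1)) for n in range(N + 1, TOP))

# Check the uniform bound on a grid of [0, b].
xs = [k * b / 1000 for k in range(1001)]
sup_diff = max(
    sum(alpha(n) * x ** eps(n) for n in range(N + 1, TOP)) for x in xs
)
print(sup_diff, tail_bound)
assert sup_diff <= tail_bound
```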

As for convergence of the integrals $\int_a^bf^\beta,$ note there is nothing to prove if $a>0,$ since $f^\beta$ is continuous on such an interval for any $\beta \in \mathbb R.$ The crux of the matter is the integral $\int_0^1f^\beta.$ If $\beta \ge 0,$ then $f^\beta$ is continuous on $[0,1]$ and there's no problem. So assume $\beta < 0.$ We'll save work by recalling that $\int_0^1 x^p\,dx$ converges iff $p>-1.$ Now, as you observed, we can choose $n_0$ such that $-|\beta|\epsilon_{n_0}>-1.$ Because $f(x) > \alpha_{n_0}x^{\epsilon_{n_0}}$ for $x>0,$ we have

$$f^\beta(x) = f^{-|\beta|}(x) < (\alpha_{n_0}x^{\epsilon_{n_0}})^{-|\beta|} = \alpha_{n_0}^{-|\beta|}x^{-|\beta|\epsilon_{n_0}}.$$

Because $-|\beta|\epsilon_{n_0}>-1,$ $\int_0^1 x^{-|\beta|\epsilon_{n_0}}\,dx$ converges and we're done.
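To make the comparison concrete, here is a grid check of the displayed pointwise inequality, once more with the hypothetical sequences $\alpha_n = 2^{-n}$ and $\epsilon_n = 1/n$ (illustrative choices, not fixed by the answer).

```python
# Grid check of f**beta <= alpha_{n0}**beta * x**(beta * eps_{n0}) on (0, 1],
# with hypothetical sequences alpha_n = 2**-n and eps_n = 1/n.
# Since beta < 0, bounding f from below bounds f**beta from above.
alpha = lambda n: 2.0 ** -n
eps = lambda n: 1.0 / n
N = 50  # truncation level for f

def f(x):
    return sum(alpha(n) * x ** eps(n) for n in range(1, N + 1))

beta = -3.0
n0 = 4  # |beta| * eps(n0) = 3/4 < 1, so x**(-3/4) is integrable on (0, 1]

for k in range(1, 1001):
    x = k / 1000
    bound = alpha(n0) ** beta * x ** (beta * eps(n0))
    assert f(x) ** beta <= bound
print("pointwise comparison holds on the grid")
```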

  • 0
I agree with absolutely everything, and surely the case of interest was $a=0$ (and $\beta<0$) :) That's the problem when we generalize questions so much that we sometimes miss the point. It is a really funny phenomenon in the sense that the same argument shows $\int_0^1 \frac{1}{x+\sqrt{x}}dx \leq \int_0^1 \frac{1}{\sqrt{x}} dx<\infty,$ which, at least to me, seems counterintuitive :) (although at infinity it is no longer counterintuitive: $\int_1^\infty \frac{1}{x^2+x}dx<\infty$). Anyway, thanks a lot!! (2017-02-21)
  • 0
You're welcome. I see you've constructed a continuous function on $[0,\infty)$ that is positive on $(0,\infty)$ and $0$ at $0,$ while $\int_0^1f^\beta < \infty$ for every real $\beta.$ Was that the main goal behind the scenes here? (2017-02-22)
  • 0
Exactly, and the fact that the function satisfies $\int_0^1 f^\beta<\infty$ for every $\beta<0$ is so crucial that I wanted to be 100% sure, and I really think the arguments are correct, although it is a quite exotic function, hehe, because it really is integrable at $0$! The goal behind this is to study a new type of stochastic noise that, when added to ODEs, makes the associated flow $C^\infty$ even if the vector field is non-Lipschitz ^^ (2017-02-22)
  • 0
@Martingalo Define a function $f(x):= \frac{1}{1+|\log x|}$ and $f(0):=0$. Then $f$ is continuous and positive on $[0,\infty)$, and $\int_0^1 f^{\beta}<\infty$ for all $\beta$. So I don't really get the point of this construction. (2017-02-25)