2

Help, I've been stuck on this for hours. So far I've tried expanding the $\alpha$-integral using the definitions of the upper and lower integrals $U$ and $L$, but that doesn't seem to be a good approach.

Let $\alpha,\ f,\ g:[a,b]\to\mathbb{R}$ be continuous, with $\alpha$ non-decreasing and $f(x) \ge 0$.

Let $\beta(x) = \int_a^x f\, d\alpha$.

Show that $\int_a^b g\, d\beta = \int_a^b gf\, d\alpha$.

  • 0
    There is something strange. The right hand side of the definition of $\beta$ does not make sense since it is constant. Do you want to calculate the integral with respect to $\mathrm d \beta$?2017-02-10
  • 0
    I think he intended $\beta(x)=\int_a^x f\,d\alpha$.2017-02-10
  • 0
    Nick was right, thank you2017-02-10
  • 0
    Prove this for step functions, then extend via Arzelà's theorem (the bounded convergence theorem)2017-02-10
  • 0
    In fact, you don't need Arzelà's theorem. Just see my proof below (which has been edited a few times).2017-02-10
  • 0
    Good theoretical question. +12017-02-11

2 Answers

2

Proof:

Let $I = \int_a^b gf \, d\alpha$, let $a = x_0 < x_1 < \cdots < x_n = b$ be a partition with tags $\xi_j \in [x_{j-1},x_j]$, and apply the mean value theorem for integrals (valid since $f$ is continuous and $\alpha$ is non-decreasing) to each subinterval. There exists $\eta_j \in [x_{j-1},x_j]$ for each $j$ such that

$$\beta(x_j) - \beta(x_{j-1}) = \int_{x_{j-1}}^{x_j} f \, d \alpha = f(\eta_j)(\alpha(x_j) - \alpha(x_{j-1})) ,$$

and

$$\left|\sum_{j=1}^n g(\xi_j)(\beta(x_j) - \beta(x_{j-1})) - I\right| \\ = \left|\sum_{j=1}^n g(\xi_j)f(\eta_j)(\alpha(x_j) - \alpha(x_{j-1}))- I \right|\\ \leqslant \left|\sum_{j=1}^n g(\xi_j)f(\xi_j)(\alpha(x_j) - \alpha(x_{j-1}))- I \right| + \left|\sum_{j=1}^n g(\xi_j)(f(\eta_j)-f(\xi_j))(\alpha(x_j) - \alpha(x_{j-1}))\right|. $$

For all sufficiently fine partitions the first term on the right-hand side is smaller than $\epsilon/2$ since $I = \int_a^b gf \, d\alpha$ exists.

The second term on the right-hand side is also smaller than $\epsilon/2$ for sufficiently fine partitions, since $f$ is uniformly continuous on $[a,b]$, $g$ is bounded, and $\alpha$ is non-decreasing. (If $\alpha(b) = \alpha(a)$, both integrals vanish and there is nothing to prove, so assume $\alpha(b) > \alpha(a)$.) Choose $M$ with $|g(x)| \leqslant M$ for $x \in [a,b]$, and $\delta > 0$ such that $|x-y| < \delta$ implies $|f(x) - f(y)| < \epsilon/(2M (\alpha(b) - \alpha(a)))$ for all $x,y \in [a,b]$. Hence, if the partition norm is less than $\delta$, then

$$\left|\sum_{j=1}^n g(\xi_j)(f(\eta_j)-f(\xi_j))(\alpha(x_j) - \alpha(x_{j-1}))\right| \leqslant \sum_{j=1}^n |g(\xi_j)||f(\eta_j)-f(\xi_j)||\alpha(x_j) - \alpha(x_{j-1})| \\ \leqslant M(\alpha(b) - \alpha(a))\frac{\epsilon}{2M(\alpha(b)-\alpha(a))} \\ = \frac{\epsilon}{2}.$$

Thus,

$$\int_a^b g \, d \beta = I = \int_a^b gf \, d \alpha.$$
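The argument above can also be checked numerically: both Riemann-Stieltjes sums converge to the same value as the partition is refined. Here is a minimal sketch with hypothetical example choices $f(x)=x^2 \ge 0$, $\alpha(x)=x^3$ (non-decreasing), $g(x)=\cos x$ on $[0,1]$, for which $\beta(x)=\int_0^x t^2\,d(t^3)=\int_0^x 3t^4\,dt = 3x^5/5$ in closed form.

```python
import math

# Hypothetical example choices (not from the original post):
# f(x) = x^2 >= 0, alpha(x) = x^3 non-decreasing, g(x) = cos(x) on [0,1],
# so beta(x) = integral_0^x t^2 d(t^3) = 3*x^5/5 in closed form.
a, b, n = 0.0, 1.0, 20000
f = lambda x: x * x
alpha = lambda x: x ** 3
beta = lambda x: 3 * x ** 5 / 5
g = math.cos

xs = [a + (b - a) * j / n for j in range(n + 1)]

# Left-endpoint Riemann-Stieltjes sum for the integral of g d(beta).
lhs = sum(g(xs[j]) * (beta(xs[j + 1]) - beta(xs[j])) for j in range(n))

# Left-endpoint Riemann-Stieltjes sum for the integral of g*f d(alpha).
rhs = sum(g(xs[j]) * f(xs[j]) * (alpha(xs[j + 1]) - alpha(xs[j]))
          for j in range(n))

print(abs(lhs - rhs))  # shrinks toward 0 as n grows
```

With $n = 20000$ the two sums agree to roughly four decimal places, consistent with the $O(1/n)$ error of left-endpoint tags.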

  • 1
    There is some issue here which can perhaps be fixed. You seem to assume $\alpha(x) =x$ because apparently the function $\alpha$ is not being used here.2017-02-11
  • 0
    @Paramanand Singh: You are absolutely correct. I was too hasty. I will fix it. Thanks.2017-02-11
  • 1
    I think it's a minor fix. $\beta$ is not differentiable yet using mean value theorem for integrals we can write $$\beta(x_{j}) - \beta(x_{j-1})=f(\eta_{j})(\alpha(x_{j})-\alpha(x_{j-1}))$$2017-02-11
  • 0
    @ParamanandSingh: Thanks again!2017-02-11
  • 0
    My pending +1 delivered.2017-02-11
3

First suppose $g$ is a step function, that is, one can write $g=\sum_{i=1}^k c_i\chi_{A_i}$, where $A_i=[a_i, b_i]$ and $[a,b]=\bigcup_{i=1}^k A_i$. So

$$\int_a^b g\,d\beta=\sum_{i=1}^k c_i[\beta(b_i)-\beta(a_i)]=\sum_{i=1}^k c_i\int_{a_i}^{b_i}f\,d\alpha=$$

$$=\sum_{i=1}^k\int_{a_i}^{b_i}c_i f\,d\alpha=\int_a^b\sum_{i=1}^k c_i\chi_{A_i}f\,d\alpha=\int_a^b gf\,d\alpha.$$

So the statement holds for step functions.

Now let $g$ be continuous and let $(s_n)$ be a sequence of step functions converging uniformly to $g$ (possible since $g$ is continuous on the compact interval $[a,b]$). Since the convergence is uniform, you can pass the limit under the integral sign and get $$\lim\int_a^b s_n\,d\beta = \int_a^b g\,d\beta$$ and $$\lim\int_a^b s_n f\,d\alpha = \int_a^b gf\,d\alpha.$$ So it follows from what we just proved for step functions that $$\int_a^b g\,d\beta = \lim\int_a^b s_n\,d\beta = \lim\int_a^b s_n f\,d\alpha = \int_a^b gf\,d\alpha.$$
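The step-function case can be illustrated concretely. A minimal sketch, again with the hypothetical choices $\alpha(x)=x^3$, $f(x)=x^2$, $\beta(x)=3x^5/5$ on $[0,1]$, and the step function $g = 2\,\chi_{[0,1/2]} + 5\,\chi_{[1/2,1]}$: the left side $\sum_i c_i(\beta(b_i)-\beta(a_i))$ is evaluated exactly, and the right side is approximated by a fine Stieltjes sum.

```python
# Hypothetical example (not from the original post): on [0,1] take
# alpha(x) = x^3, f(x) = x^2, so beta(x) = integral_0^x f d(alpha) = 3*x^5/5.
alpha = lambda x: x ** 3
f = lambda x: x * x
beta = lambda x: 3 * x ** 5 / 5

# Step function g = 2 on [0, 1/2] and 5 on [1/2, 1]: pieces (a_i, b_i, c_i).
pieces = [(0.0, 0.5, 2.0), (0.5, 1.0, 5.0)]

# Integral of g d(beta) = sum of c_i * (beta(b_i) - beta(a_i)), exactly.
lhs = sum(c * (beta(bi) - beta(ai)) for ai, bi, c in pieces)

# Integral of g*f d(alpha), approximated piecewise by left-endpoint
# Riemann-Stieltjes sums on a fine grid.
n = 50000
rhs = 0.0
for ai, bi, c in pieces:
    xs = [ai + (bi - ai) * j / n for j in range(n + 1)]
    rhs += sum(c * f(xs[j]) * (alpha(xs[j + 1]) - alpha(xs[j]))
               for j in range(n))

print(abs(lhs - rhs))  # tiny: the two sides agree, as the identity predicts
```

The exact-versus-approximate agreement here is exactly the content of the chain of equalities proved above for step functions.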

  • 0
    From your answer it appears that $f\geq 0$ is not needed. The condition on $f$ was perhaps given only to ensure that $\beta$ is non-decreasing and perhaps a proof based on upper lower sums could be given using this assumption.2017-02-11
  • 0
    @ParamanandSingh Did you mean $g$? Yes, you are right. I will edit the answer.2017-02-11
  • 0
    Actually I meant $f$, because that condition is in the question. But yes, your edit is OK because there is no need to consider the positive and negative parts of $g$. The logic works for any continuous $g$. The condition on $f$ given in the question is unnecessary as far as your method of proof goes; I am saying that perhaps it was given to suggest a proof which does not use advanced theorems like Arzelà's. BTW you had my +1 from the start.2017-02-11
  • 0
    @ParamanandSingh Thank you! P.S.: Note that I removed the application of Arzelà's theorem, and yet we don't need $f\geq 0$ (in fact, I only noticed this hypothesis after you pointed it out).2017-02-11
  • 0
    There appears to be no way to give another +1 for the improvement you have done.2017-02-11