
Let $f,g:(0,\infty)\to(0,\infty)$ be smooth ($C^\infty$), decreasing and convex, let $0<a<b$, and suppose $f(a)>g(b)$ and $f'(a)<g'(b)$. I'm looking for a function $h:(0,\infty)\to (0,\infty)$ which is smooth, decreasing and convex, with $h(x)=f(x)$ for $x\in (0,a)$ and $h(x)=g(x)$ for $x\in (b,\infty)$.

I have no doubt that such a function exists (it's "obvious" by drawing a picture), but somehow I can't find a simple formula for it. By that I mean something along the lines of: let $\eta$ be a smooth function which is $0$ on $(0,a)$ and $1$ on $(b,\infty)$, and set $h=(1-\eta)f+\eta g$; of course this does not work, since such an $h$ will not be decreasing and convex in general. So, what is the simplest formula that works?
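The failure of the naive blend is easy to see numerically. Here is a minimal sketch, assuming a hypothetical pair $f(x)=10/x$, $g(x)=0.1/x$ with $a=1$, $b=2$ and the standard $\exp(-1/t)$-based smooth step for $\eta$ (none of these choices come from the question itself):

```python
import numpy as np

# Hypothetical example: f(x) = 10/x and g(x) = 0.1/x are smooth,
# decreasing and convex on (0, inf), with f(a) > g(b); a = 1, b = 2.
a, b = 1.0, 2.0
f = lambda x: 10.0 / x
g = lambda x: 0.1 / x

def eta(x):
    """C-infinity step: 0 on (0, a], 1 on [b, inf)."""
    t = np.clip((x - a) / (b - a), 0.0, 1.0)
    A = lambda s: np.where(s > 0, np.exp(-1.0 / np.where(s > 0, s, 1.0)), 0.0)
    return A(t) / (A(t) + A(1.0 - t))

x = np.linspace(0.5, 3.0, 5001)
h = (1.0 - eta(x)) * f(x) + eta(x) * g(x)

# h agrees with f left of a and with g right of b, but a finite-difference
# second derivative goes negative inside (a, b): the blend is not convex.
d2h = np.diff(h, 2) / (x[1] - x[0]) ** 2
print(d2h.min())   # strictly negative
```

The culprit is the term $-\eta''\,(f-g)$ in $h''$: wherever $\eta''>0$ and $f-g$ is large, it overwhelms the convexity of $f$ and $g$.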

PS: I would be satisfied if "decreasing" and "convex" are replaced by "first derivatives negative", "second derivatives positive".

Edit: In addition we assume that $(-f'(a))(b-a)>f(a)-g(b)$ (it doesn't work if this is not satisfied).

  • Do you want a proof that such a function exists (I may have one), or a simple formula for it? (2012-03-08)

2 Answers


Here is my attempt; however, it does not yield any simple formula. The idea behind it is simple: while it is not easy to ensure that a sum (with varying, but non-negative, coefficients) of increasing functions is increasing, a sum (with varying, but non-negative, coefficients) of non-negative functions is always non-negative. So we work at the level of second derivatives.

We need the following natural assumption (which is a necessary condition):

$$f(a) + (b-a)\, g'(b) > g(b) > f(a) + (b-a)\, f'(a).$$

Let $\varepsilon \in (0, b-a)$. Let $\eta_{1, \varepsilon}$ and $\eta_{2, \varepsilon}$ be $\mathcal{C}^\infty$ functions from $\mathbb{R}_+^*$ to $[0,1]$ such that:

  • $\eta_{1, \varepsilon} \equiv 1$ on $(0,a]$;
  • $\eta_{1, \varepsilon} \equiv 0$ on $[a+\varepsilon,+ \infty)$;
  • $\eta_{2, \varepsilon} \equiv 0$ on $(0,b-\varepsilon]$;
  • $\eta_{2, \varepsilon} \equiv 1$ on $[b,+ \infty)$.

Let us put $H_\varepsilon := \eta_{1, \varepsilon} f'' + \eta_{2, \varepsilon} g''$. Let $h_\varepsilon$ be the unique second antiderivative of $H_\varepsilon$ such that $h_\varepsilon \equiv f$ on $(0,a]$; such a function exists since $h_\varepsilon'' = f''$ on $(0,a]$. Then $h_\varepsilon$ is convex, since its second derivative is non-negative. We now have to modify it so as to make it coincide with $g$ on $[b,+ \infty)$.
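Numerically, building $h_\varepsilon$ amounts to two cumulative integrations of $H_\varepsilon$. A sketch, assuming the hypothetical data $f(x)=1/x$, $g(x)=0.8/x$, $a=1$, $b=2$ (which satisfy the displayed necessary condition) and $\varepsilon = 0.05$:

```python
import numpy as np

a, b, eps = 1.0, 2.0, 0.05

# Hypothetical data satisfying f(a) + (b-a) g'(b) > g(b) > f(a) + (b-a) f'(a):
f   = lambda x: 1.0 / x
df  = lambda x: -1.0 / x**2
d2f = lambda x: 2.0 / x**3
g   = lambda x: 0.8 / x
dg  = lambda x: -0.8 / x**2
d2g = lambda x: 1.6 / x**3

def smoothstep(t):
    """C-infinity step: 0 for t <= 0, 1 for t >= 1."""
    t = np.clip(t, 0.0, 1.0)
    A = lambda s: np.where(s > 0, np.exp(-1.0 / np.where(s > 0, s, 1.0)), 0.0)
    return A(t) / (A(t) + A(1.0 - t))

x = np.linspace(a, b, 20001)           # the gluing interval [a, b]
eta1 = 1.0 - smoothstep((x - a) / eps)       # 1 near a, 0 past a + eps
eta2 = smoothstep((x - (b - eps)) / eps)     # 0 before b - eps, 1 at b
H = eta1 * d2f(x) + eta2 * d2g(x)            # H_eps >= 0 everywhere

dx = x[1] - x[0]
cum = lambda y: np.concatenate(([0.0], np.cumsum((y[1:] + y[:-1]) / 2 * dx)))

# Second antiderivative of H_eps with h(a) = f(a), h'(a) = f'(a):
hp = df(a) + cum(H)      # h_eps'
h  = f(a) + cum(hp)      # h_eps

print(h[-1] < g(b), hp[-1] < dg(b))   # True True: room left to match g at b
```

For this small $\varepsilon$, both $h_\varepsilon(b) < g(b)$ and $h_\varepsilon'(b) < g'(b)$ hold, exactly as the answer requires before the bump is added.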

First, notice that $|h_\varepsilon' (b) - f' (a)| \leq (\sup_{[a,b]} f'' + \sup_{[a,b]} g'')\, \varepsilon$, and that $|h_\varepsilon (b) - f (a) - f'(a) (b-a)| \leq (\sup_{[a,b]} f'' + \sup_{[a,b]} g'')\, (b-a)\, \varepsilon$. Hence, since $f(a) + (b-a) f'(a) < g(b)$ and $f'(a) < g'(b)$ (both follow from the assumption), we can take $\varepsilon$ small enough that $h_\varepsilon (b) < g(b)$ and $h_\varepsilon' (b) < g'(b)$.

We will obtain our final function $h$ by adding a small bump at the right place to $h_\varepsilon''$. We must add $g'(b) - h_\varepsilon' (b)$ to the derivative of $h_\varepsilon$ at $b$. Let $\delta > 0$, and let $\nu_\delta$ be a smooth, non-negative bump function, supported on $[-\delta, \delta]$, whose total mass is $g'(b) - h_\varepsilon' (b)$. For $x_0$ in $(a+\delta, b-\delta)$, let $h_{\varepsilon, \delta, x_0}$ be the unique second antiderivative of $H_\varepsilon + \nu_\delta (\cdot - x_0)$ which coincides with $f$ on $(0,a]$. Then $h_{\varepsilon, \delta, x_0}$ is convex, and $h_{\varepsilon, \delta, x_0}' (b) = g'(b)$. For small enough $\varepsilon$ and $\delta$, we can ensure that $h_{\varepsilon, \delta, x_0} (b) < g(b)$ if $x_0$ is close enough to $b$, and that $h_{\varepsilon, \delta, x_0} (b) > g(b)$ if $x_0$ is close enough to $a$ (the latter uses the assumption $f(a) + (b-a)\, g'(b) > g(b)$).

Since the function $x_0 \mapsto h_{\varepsilon, \delta, x_0} (b)$ is continuous, by the intermediate value theorem, there exists a value $x$ of $x_0$ such that $h_{\varepsilon, \delta, x} (b) = g(b)$. Then $h_{\varepsilon, \delta, x}$ is smooth, convex, equal to $f$ on $(0,a)$ and equal to $g$ on $(b, + \infty)$.
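The whole argument can be carried out on a grid; in the following sketch, a bisection plays the role of the intermediate value theorem. The data ($f(x)=1/x$, $g(x)=0.8/x$, $a=1$, $b=2$, $\varepsilon=\delta=0.05$) are hypothetical choices satisfying the necessary condition:

```python
import numpy as np

a, b, eps, delta = 1.0, 2.0, 0.05, 0.05
f   = lambda x: 1.0 / x;  df = lambda x: -1.0 / x**2;  d2f = lambda x: 2.0 / x**3
g   = lambda x: 0.8 / x;  dg = lambda x: -0.8 / x**2;  d2g = lambda x: 1.6 / x**3

def smoothstep(t):
    """C-infinity step: 0 for t <= 0, 1 for t >= 1."""
    t = np.clip(t, 0.0, 1.0)
    A = lambda s: np.where(s > 0, np.exp(-1.0 / np.where(s > 0, s, 1.0)), 0.0)
    return A(t) / (A(t) + A(1.0 - t))

x = np.linspace(a, b, 40001)
dx = x[1] - x[0]
cum = lambda y: np.concatenate(([0.0], np.cumsum((y[1:] + y[:-1]) / 2 * dx)))

# H_eps = eta_1 f'' + eta_2 g'' >= 0 on [a, b]:
eta1 = 1.0 - smoothstep((x - a) / eps)
eta2 = smoothstep((x - (b - eps)) / eps)
H = eta1 * d2f(x) + eta2 * d2g(x)

mass = dg(b) - (df(a) + cum(H)[-1])   # g'(b) - h_eps'(b) > 0

def bump(x0):
    """Smooth bump at x0, supported on [x0-delta, x0+delta], integral = mass."""
    s = (x - x0) / delta
    core = np.where(np.abs(s) < 1,
                    np.exp(-1.0 / np.maximum(1.0 - s * s, 1e-12)), 0.0)
    return mass * core / cum(core)[-1]

def h_at_b(x0):
    hp = df(a) + cum(H + bump(x0))    # second antiderivative: h'(b) = g'(b)
    return f(a) + cum(hp)[-1]         # value h(b), with h(a) = f(a)

# h_at_b decreases as the bump moves right, and brackets g(b); bisect on x0:
lo, hi = a + delta, b - delta
for _ in range(60):
    mid = (lo + hi) / 2
    lo, hi = (mid, hi) if h_at_b(mid) > g(b) else (lo, mid)

x0 = (lo + hi) / 2
hp = df(a) + cum(H + bump(x0))
h  = f(a) + cum(hp)
print(abs(h[-1] - g(b)), abs(hp[-1] - dg(b)))   # both tiny: h glues f to g
```

Since $H_\varepsilon + \nu_\delta(\cdot - x_0) \ge 0$, the resulting $h$ is convex, and by construction it matches $f$ at $a$ and $g$ at $b$ up to first order.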


Below is my attempt, which failed; still, someone may be able to fix it, or it may give you some ideas (not necessarily correct ones...). Let
$$F_a^b(x) = \exp\left(-\frac{1}{(x-a)(b-x)}\right) \cdot \chi_{[a,b]}(x)$$
and
$$G_a^b(x) = \frac{\int_a^x F_a^b(y)\, dy}{\int_a^b F_a^b(y)\, dy}$$
be our main tool functions; below are some auxiliaries. The S-shaped function:
$$H_2(x) = G_0^1(x)\,(g'(b)-f'(a)) + f'(a),$$
the S-shape adjusted for $f(a)-g(b)$, with $\alpha$ and $\beta$ chosen so that $\int_a^b H_3(x)\, dx = g(b) - f(a)$ (such numbers exist because $|f'(a)|(b-a) > f(a) - g(b)$):
$$H_3(x) = H_2(\alpha x + \beta),$$
and finally our intermediate function:
$$H_4(x) = \int_a^x H_3(y)\, dy + f(a).$$
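The two tool functions are easy to evaluate on a grid. A minimal sketch (the grid size and the clamping constant are my own hypothetical choices, not part of the answer):

```python
import numpy as np

def F(x, a, b):
    """Bump supported on [a, b]: exp(-1/((x-a)(b-x))) inside, 0 outside."""
    u = (x - a) * (b - x)
    return np.where(u > 0, np.exp(-1.0 / np.maximum(u, 1e-12)), 0.0)

def G(x, a, b, n=20001):
    """Smooth step G_a^b: normalized antiderivative of F_a^b; 0 at a, 1 at b."""
    y = np.linspace(a, b, n)
    Fy = F(y, a, b)
    dy = y[1] - y[0]
    # cumulative trapezoid integral of F from a, then normalize:
    parts = np.concatenate(([0.0], np.cumsum((Fy[1:] + Fy[:-1]) / 2 * dy)))
    # np.interp clamps to 0 left of a and to 1 right of b:
    return np.interp(x, y, parts / parts[-1])

# G rises smoothly from 0 to 1 across [a, b]:
xs = np.array([0.9, 1.0, 1.5, 2.0, 2.5])
print(G(xs, 1.0, 2.0))   # ~[0, 0, 0.5, 1, 1]
```

Since $F_a^b$ vanishes to all orders at $a$ and $b$, all derivatives of $G_a^b$ vanish there too, which is what makes the gluing smooth.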

Let $z_i = \frac{(3-i)a+ib}{3}$ for $i=0,\dots,3$ (so $z_0=a$ and $z_3=b$). After gluing everything together, the final result looks like:

$h(x) = (1-G_{z_0}^{z_1}(x))\,f(x) + (G_{z_0}^{z_1}(x)-G_{z_2}^{z_3}(x))\,H_4(x) + G_{z_2}^{z_3}(x)\,g(x)$

I am not sure that every formula is correct (there may be some typos), but the main idea should be understandable. The weak point is that this weighted sum need not be convex; it is under some conditions, but in general, unfortunately, it is not. Hope that helps, even if just a bit ;-)

Edit: It seems that if there were $a'$ and $b'$ with properties similar to those of $a$ and $b$, then using those instead of $a$ and $b$, and setting $z_0 = a'-\varepsilon$, $z_1 = a'+\varepsilon$, $z_2 = b'-\varepsilon$ and $z_3 = b'+\varepsilon$, might just work ;-)