Here is my attempt; it does not yield any simple closed formula, but the idea behind it is simple: while a sum (with varying, but non-negative, coefficients) of increasing functions need not be increasing, a sum (with varying, but non-negative, coefficients) of non-negative functions is always non-negative. So we work at the level of second derivatives.
We need the following natural assumption (which is a necessary condition):
$$f(a) + (b-a)\, g'(b) > g(b) > f(a) + (b-a)\, f'(a).$$
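To see why this is necessary: any convex interpolant $h$ (with $h \equiv f$ on $(0,a]$ and $h \equiv g$ on $[b,+\infty)$) lies above its tangent lines at $a$ and at $b$, which gives the two inequalities in weak form:
$$g(b) = h(b) \geq h(a) + (b-a)\, h'(a) = f(a) + (b-a)\, f'(a),$$
$$f(a) = h(a) \geq h(b) + (a-b)\, h'(b) = g(b) - (b-a)\, g'(b).$$
(The strict form excludes the degenerate case where $h$ is affine on $[a,b]$.)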
Let $\varepsilon \in (0, b-a)$. Let $\eta_{1, \varepsilon}$ and $\eta_{2, \varepsilon}$ be $\mathcal{C}^\infty$ functions from $\mathbb{R}_+^*$ to $[0,1]$ such that:
- $\eta_{1, \varepsilon} \equiv 1$ on $(0,a]$;
- $\eta_{1, \varepsilon} \equiv 0$ on $[a+\varepsilon,+ \infty)$;
- $\eta_{2, \varepsilon} \equiv 0$ on $(0,b-\varepsilon]$;
- $\eta_{2, \varepsilon} \equiv 1$ on $[b,+ \infty)$.
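Such cutoffs can be built explicitly from the classical $\mathcal{C}^\infty$ step based on $x \mapsto e^{-1/x}$. A minimal sketch in Python (the sample values of $a$, $b$, $\varepsilon$ are hypothetical, chosen only for illustration):

```python
import math

# Hypothetical sample values for illustration only.
a, b, eps = 1.0, 2.0, 0.1

def psi(x):
    """Smooth step: identically 0 for x <= 0, identically 1 for x >= 1."""
    u = math.exp(-1.0 / x) if x > 0 else 0.0
    v = math.exp(-1.0 / (1.0 - x)) if x < 1 else 0.0
    return u / (u + v)  # denominator never vanishes

def eta1(x):
    # equals 1 on (0, a], equals 0 on [a + eps, +inf)
    return psi((a + eps - x) / eps)

def eta2(x):
    # equals 0 on (0, b - eps], equals 1 on [b, +inf)
    return psi((x - (b - eps)) / eps)
```

The helper `psi` inherits smoothness from $e^{-1/x}$, and `eta1`, `eta2` are just rescaled and reflected copies of it.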
Let us put $H_\varepsilon := \eta_{1, \varepsilon} f'' + \eta_{2, \varepsilon} g''$. Let $h_\varepsilon$ be the unique second antiderivative of $H_\varepsilon$ such that $h_\varepsilon \equiv f$ on $(0,a]$; such a function exists since $h_\varepsilon'' = f''$ on $(0,a]$. Then $h_\varepsilon$ is convex, since its second derivative is non-negative. We now have to modify it so as to make it coincide with $g$ on $[b,+\infty)$.
First, notice that $|h_\varepsilon'(b) - f'(a)| \leq (\sup_{[a,b]} f'' + \sup_{[a,b]} g'')\, \varepsilon$, and that $|h_\varepsilon(b) - f(a) - f'(a)(b-a)| \leq (\sup_{[a,b]} f'' + \sup_{[a,b]} g'')\, (b-a)\, \varepsilon$. Hence, since the assumption forces $g'(b) > f'(a)$, we can take $\varepsilon$ small enough that $h_\varepsilon(b) < g(b)$ and $h_\varepsilon'(b) < g'(b)$.
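These two bounds come from writing $h_\varepsilon$ at $b$ by Taylor expansion with integral remainder from $a$. As a sanity check, here is a self-contained numerical sketch with hypothetical data $f(x) = x^2$, $g(x) = 2x^2 - 4$ on $[a,b] = [1,2]$ (which satisfy $9 = f(a) + (b-a)g'(b) > g(b) = 4 > f(a) + (b-a)f'(a) = 3$):

```python
import math
from scipy.integrate import quad

# Hypothetical data satisfying the necessary condition.
a, b, eps = 1.0, 2.0, 0.1
f, fp, fpp = (lambda x: x**2), (lambda x: 2*x), (lambda x: 2.0)
g, gp, gpp = (lambda x: 2*x**2 - 4), (lambda x: 4*x), (lambda x: 4.0)

def psi(x):  # smooth step: 0 for x <= 0, 1 for x >= 1
    u = math.exp(-1.0 / x) if x > 0 else 0.0
    v = math.exp(-1.0 / (1.0 - x)) if x < 1 else 0.0
    return u / (u + v)

# H_eps = eta_1 f'' + eta_2 g'', with the cutoffs built from psi.
H = lambda t: psi((a + eps - t) / eps) * fpp(t) + psi((t - (b - eps)) / eps) * gpp(t)

# Taylor with integral remainder, starting from h_eps = f on (0, a]:
#   h_eps'(b) = f'(a) + int_a^b H,
#   h_eps(b)  = f(a) + f'(a)(b-a) + int_a^b (b-t) H(t) dt.
hp_b = fp(a) + quad(H, a, b)[0]
h_b = f(a) + fp(a) * (b - a) + quad(lambda t: (b - t) * H(t), a, b)[0]
```

With $\varepsilon = 0.1$ one indeed finds $h_\varepsilon'(b) < g'(b)$ and $h_\varepsilon(b) < g(b)$.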
We will obtain our final function $h$ by adding a small bump, at the right place, to $h_\varepsilon''$. We must add $g'(b) - h_\varepsilon'(b)$ to $h_\varepsilon'(b)$. Let $\delta > 0$, and let $\nu_\delta$ be a smooth, non-negative bump function, supported on $[-\delta, \delta]$, and whose total mass is $g'(b) - h_\varepsilon'(b)$. For $x_0$ in $(a+\delta, b-\delta)$, let $h_{\varepsilon, \delta, x_0}$ be the unique second antiderivative of $H_\varepsilon + \nu_\delta(\cdot - x_0)$ which coincides with $f$ on $(0,a]$. Then $h_{\varepsilon, \delta, x_0}$ is convex, and $h_{\varepsilon, \delta, x_0}'(b) = g'(b)$. For small enough $\varepsilon$ and $\delta$, we can ensure that $h_{\varepsilon, \delta, x_0}(b) < g(b)$ if $x_0$ is close enough to $b$, and that $h_{\varepsilon, \delta, x_0}(b) > g(b)$ if $x_0$ is close enough to $a$ (the latter uses the assumption $f(a) + (b-a)\, g'(b) > g(b)$).
Since the function $x_0 \mapsto h_{\varepsilon, \delta, x_0} (b)$ is continuous, by the intermediate value theorem, there exists a value $x$ of $x_0$ such that $h_{\varepsilon, \delta, x} (b) = g(b)$. Then $h_{\varepsilon, \delta, x}$ is smooth, convex, equal to $f$ on $(0,a]$ and equal to $g$ on $[b, + \infty)$.
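The whole construction can be run end to end numerically. A self-contained sketch, with hypothetical data $f(x) = x^2$, $g(x) = 2x^2 - 4$ on $[a,b] = [1,2]$; the bump profile and all parameter values are illustrative choices, and the intermediate value step is carried out by a root finder:

```python
import math
from scipy.integrate import quad
from scipy.optimize import brentq

# Hypothetical data; the condition f(a)+(b-a)g'(b) > g(b) > f(a)+(b-a)f'(a)
# reads 9 > 4 > 3 here.
a, b, eps, delta = 1.0, 2.0, 0.1, 0.05
f, fp, fpp = (lambda x: x**2), (lambda x: 2*x), (lambda x: 2.0)
g, gp, gpp = (lambda x: 2*x**2 - 4), (lambda x: 4*x), (lambda x: 4.0)

def psi(x):  # smooth step: 0 for x <= 0, 1 for x >= 1
    u = math.exp(-1.0 / x) if x > 0 else 0.0
    v = math.exp(-1.0 / (1.0 - x)) if x < 1 else 0.0
    return u / (u + v)

H = lambda t: psi((a + eps - t) / eps) * fpp(t) + psi((t - (b - eps)) / eps) * gpp(t)

hp_b = fp(a) + quad(H, a, b)[0]                                      # h_eps'(b)
h_b = f(a) + fp(a) * (b - a) + quad(lambda t: (b - t) * H(t), a, b)[0]  # h_eps(b)

# Bump nu_delta: smooth, non-negative, supported on [-delta, delta],
# total mass g'(b) - h_eps'(b).
shape = lambda s: math.exp(-1.0 / (1.0 - s * s)) if s * s < 1.0 else 0.0
mass = gp(b) - hp_b
norm = quad(shape, -1.0, 1.0)[0] * delta  # mass of t -> shape((t - x0)/delta)

def value_at_b(x0):
    """h_{eps,delta,x0}(b): the bump adds int_a^b (b - t) nu_delta(t - x0) dt."""
    bump = lambda t: (mass / norm) * shape((t - x0) / delta)
    return h_b + quad(lambda t: (b - t) * bump(t), x0 - delta, x0 + delta)[0]

# x0 -> value_at_b(x0) is continuous, > g(b) near a, < g(b) near b:
# the intermediate value theorem gives a root, found here by Brent's method.
x_star = brentq(lambda x0: value_at_b(x0) - g(b), a + delta, b - delta)
```

At `x_star` one has $h_{\varepsilon, \delta, x_\star}(b) = g(b)$, and the interpolant itself is recovered at any point by integrating $H_\varepsilon + \nu_\delta(\cdot - x_\star)$ twice from the data at $a$.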