
I came across this question

Does there exist a function $f: \mathbb{R}^+ \to \mathbb{R}^+$ such that $f(x+y)>f(x)(1+yf(x))$ for all $x,y\in \mathbb{R}^+$?

and I didn't know how to begin on it.

  • Actually, since $f(x+y)-f(x)>yf(x)^2>0$ shows that $f$ is monotone, it must be a.e. differentiable, and so a.e. continuous.

3 Answers


Here's a slightly different approach.

Suppose for contradiction that such an $f$ exists. In fact we will reach the same contradiction even if $f$ only satisfies $f(x+y) \geq f(x)(1+yf(x))$ for $x,y>0$.

Define a sequence $x_0 < x_1 < x_2 < \ldots$ via the recurrence $x_0 := 2011$ and $x_{n+1} := x_n + 1/f(x_n)$ for $n \ge 0$.

Making repeated use of the bound $f(x + 1/f(x)) \geq f(x)(1+f(x)/f(x)) = 2f(x)$, we find that $f(x_n) \geq 2^n f(x_0)$ for all $n$.

Taking the reciprocal of the latter inequality and summing a geometric series enables us to establish $ x_n = x_0 + \sum_{i=0}^{n-1} (x_{i+1} - x_i) = x_0 + \sum_{i=0}^{n-1} 1/f(x_i) \leq x_0 + \sum_{i=0}^{n-1} 1/(2^i f(x_0)) < x_0 + 2/f(x_0) =: M.$

But now, the sequence $x_0,x_1,\ldots$ is bounded (by $M$) while the sequence $f(x_0),f(x_1),\ldots$ diverges to infinity. It follows that there is some integer $N$ such that $f(x_N) > f(M)$ even though $x_N < M$. However, as observed in the comments above, such an $f$ must be strictly increasing, so we have a contradiction.
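
For concreteness, here is a small numerical sketch of this bookkeeping (a sketch only, since no such $f$ exists: the starting value $f(x_0)$ below is a hypothetical placeholder). We track only the forced lower bounds $f(x_n) \geq 2^n f(x_0)$; the points $x_n$ never pass $M$, while those lower bounds explode.

```python
# Numerical sketch of the doubling argument. The value f0 = f(x_0) is a
# hypothetical placeholder (no valid f exists); we track only the forced
# lower bound on f(x_n), which makes the steps 1/f(x_n) as large as allowed.
x0, f0 = 2011.0, 0.5
M = x0 + 2.0 / f0            # bound obtained from summing the geometric series

x, f_lower = x0, f0
for n in range(30):
    x += 1.0 / f_lower       # x_{n+1} = x_n + 1/f(x_n)
    f_lower *= 2.0           # f(x_{n+1}) >= 2 f(x_n)
    assert x < M             # the x_n stay below M ...
print(f_lower / f0)          # ... while f(x_n)/f(x_0) >= 2^30
```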

  • @picakhu: I'm glad you're happy with it! It occurred to me late at night (a mosquito had woken me up >:[ ) that the inequality could be interpreted as saying "increasing $x$ by $y$ results in multiplying the value of $f$ at $x$ by at least $1+yf(x)$". Then I just worked out what $y$ would be needed if one wished to at least double the value, and the rest was easy!

There exists no such $f$. My solution is formally not based on analysis, but it is inspired by my analysis "solution" in the comments. Hopefully there's no bug here :-).

Pick a sufficiently small $\epsilon > 0$. Then, for $n \in \mathbb N_{0}$, we have: $ f(1+(n+1)\epsilon) - f(1+n\epsilon) > \epsilon f(1+ n\epsilon)^2. $ I find it easier to work with a related sequence $u_n$ defined by $u_n = \epsilon f(1+n\epsilon)$. Plugging this into the above inequality, we get the pleasant recursive inequality $u_{n+1} > u_n + u_n^2$, with the base condition $u_0 = \epsilon f(1)$.

Proposition. If $u_k \geq \eta$, then $u_{k + \lceil 1/\eta \rceil} \geq 2 \eta$.

Proof. We have $u_{k+1} - u_k > \eta^2$, $u_{k+2}-u_{k+1} > \eta^2$, and so on. (I am implicitly using the fact that $u_n$ is increasing, so each $u_{k+i} \geq \eta$ and hence each increment exceeds $\eta^2$.) Adding all these $\lceil 1/\eta \rceil$ inequalities, we get the claim. $\Box$
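
As a sanity check (not part of the proof), one can iterate the borderline recurrence $u_{n+1} = u_n + u_n^2$ and confirm the Proposition numerically; the values of $\eta$ below are arbitrary.

```python
# Check of the Proposition: starting from u = eta and applying
# u -> u + u^2 exactly ceil(1/eta) times always yields a value >= 2*eta.
import math

for eta in [0.9, 0.5, 0.1, 0.01, 0.003]:
    u = eta
    for _ in range(math.ceil(1 / eta)):
        u += u * u
    assert u >= 2 * eta
```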

Number of iterations before reaching $1$. For $\eta \leq 1$, we have the simplification $\lceil 1/\eta \rceil \leq 2/\eta$. Also, I find it convenient to use the informal language "$u$-value at iteration $i$" to denote $u_i$. The above observation says that if the $u$-value at the current iteration is at least $\eta \leq 1$, then after at most $2/\eta$ iterations, the $u$-value becomes at least $2\eta$. Now, if $2\eta \geq 1$, then we stop; otherwise, after an additional $2/(2\eta)$ iterations, the $u$-value is at least $2^2\eta$. Using this argument repeatedly, the number of iterations before the $u$-value becomes at least $1$ is at most $ \frac{2}{\eta} + \frac{2}{2\eta} + \frac{2}{2^2 \eta} + \ldots $ (this is actually a finite sum with $\approx \log_2(1/\eta)$ terms), which is smaller than the sum of the corresponding infinite geometric series, namely $4/\eta$.
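
The $4/\eta$ bound can also be checked numerically (again a sketch only, using the borderline recurrence $u_{n+1} = u_n + u_n^2$ with arbitrary starting values $u_0$):

```python
# Count how many iterations of u -> u + u^2 are needed to reach 1,
# and compare with the 4/u_0 bound derived above.
for u0 in [0.5, 0.1, 0.01, 0.001]:
    u, steps = u0, 0
    while u < 1.0:
        u += u * u
        steps += 1
    print(u0, steps, 4.0 / u0)   # observed step counts stay below 4/u_0
```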

Final contradiction. Now, since $u_0 = \epsilon f(1)$, the above bound gives an index $N \leq \frac{4}{\epsilon f(1)}$ with $u_N \geq 1$, i.e. $ 1 \leq u_N = \epsilon \cdot f\left(1+N\epsilon\right) \leq \epsilon \cdot f\left(1+\frac{4}{f(1)} \right), $ where the last step uses $N\epsilon \leq \frac{4}{f(1)}$ and the monotonicity of $f$. This is a contradiction, since the $\epsilon > 0$ in the above argument was arbitrary, whereas the second factor is a fixed positive quantity depending only on $f$.

Note. While no $f$ exists satisfying the requirements of the problem, we can do better if we relax the domain to some interval of the form $(0,A)$. In this case, just the function $f(x) = \frac{1}{A-x}$ works: for $0 < x < x+y < A$, the required inequality reduces to $(A-x)^2 > (A-x)^2 - y^2$.
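
A quick random check of this example (a sketch; the choice $A = 5$ is arbitrary):

```python
# Verify numerically that f(x) = 1/(A - x) satisfies
# f(x+y) > f(x) * (1 + y * f(x)) for 0 < x < x + y < A.
import random

A = 5.0
f = lambda x: 1.0 / (A - x)
for _ in range(100000):
    x = random.uniform(0.01, A - 0.02)
    y = random.uniform(0.01, A - x - 0.005)   # keeps y away from 0 and x + y < A
    assert f(x + y) > f(x) * (1 + y * f(x))
```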


Another approach: Rewrite the inequality as $(*): f(x+y)-f(x)>yf(x)^2$. So we see that $f$ is a strictly increasing function. Fixing $x$ and letting $y\rightarrow+\infty$, we also see that $f$ is unbounded. Hence we can pick a number $a$ such that $f(a)>1$.

Now, for any $x\ge 0$, define $g(x)=f(x+a)-f(a)$. Clearly $g(0)=0$ and $g$ is strictly increasing. Also, it is easy to show that $(**): g(x+y) - g(x)>yg(x)^2$ whenever $x\ge 0$ and $y>0$ (since $g(x+y)-g(x) = f(x+a+y)-f(x+a) > yf(x+a)^2 \ge yg(x)^2$). So, the inequalities for $f$ and $g$ are similar, except that we allow $x=0$ in the inequality for $g$. However, the function $g$ has a nicer property: by putting $x=a$ in $(*)$ and using $f(a)>1$, we have $g(y)>y$ for all $y>0$.

Let $b>0$ and $h=b/n$. By $(**)$, we get $g(ih)-g((i-1)h)>hg((i-1)h)^2$. Summing up for $i=1,2,\ldots,n$ and letting $h\rightarrow 0$, we see that $g(b) \ge \int_0^b g(u)^2 du$. As $g$ is an increasing function, the Riemann integral exists. Now, applying the inequality $g(u)>u$ to the integrand, we get $g(b)\ge\int_0^b g(u)^2 du > \int_0^b u^2 du = \frac{b^3}{3}.$ Applying this to the integrand in turn, we get $g(b)\ge\int_0^b g(u)^2 du > \int_0^b \left(\frac{u^3}{3}\right)^2 du = \frac{b^7}{3^2 \times 7}.$ Continuing in this manner, we get $ g(b) > \frac{b^3}{3},\ \frac{b^7}{3^2 \times 7},\ \frac{b^{15}}{3^4 \times 7^2 \times 15}, \ldots $ In the $n$-th term of this sequence the exponent of $b$ is $2^{n+1}-1$, while the denominator is less than $8^{2^n}$ (each denominator is the square of the previous one times the new exponent, which is less than $2^{n+2}$, and $\sum_{k\ge 1}(k+1)/2^k = 3$). Therefore, when $b^2>8$ (for instance $b=3$), these lower bounds tend to infinity, so $g(b)$, and hence $f(a+b)$, cannot be a finite real number, contradicting the assumption that $f$ is defined on all of $\mathbb{R}^+$.
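
To see the blow-up concretely, one can iterate the recurrence behind these lower bounds, $e_{n+1}=2e_n+1$ and $c_{n+1}=c_n^2/e_{n+1}$ starting from $g(u)>u$ (so $e_0 = 1$, $c_0 = 1$). This is only an illustration; $b=3$ is one value for which the bounds visibly diverge.

```python
# Lower bounds g(b) > c_n * b**e_n from the iterated integral estimate,
# where e_{n+1} = 2*e_n + 1 and c_{n+1} = c_n**2 / e_{n+1}.
def lower_bounds(b, n_terms=8):
    e, c = 1, 1.0            # g(u) > u, i.e. e_0 = 1, c_0 = 1
    out = []
    for _ in range(n_terms):
        e_next = 2 * e + 1
        c = c * c / e_next
        e = e_next
        out.append(c * b ** e)
    return out

print(lower_bounds(3.0))     # for b = 3: 9.0, 34.7, 241.0, 5622, ... keeps growing
```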

The final step above suggests that when the domain of $f$ is restricted to a small neighbourhood of zero, $g$ may not blow up [edit: or the neighbourhood may even be so small that $a$, and hence $g$, is not defined], and hence a solution for $f$ can exist. For example, if $f$ is only defined on some $(0, b)$, we may take $f(x)=e^{\alpha x}$ as a solution, provided that $b$ is small and $\alpha > e^{\alpha b}$: the required inequality reduces to $e^{\alpha y}-1 > y\,e^{\alpha x}$, which holds because $e^{\alpha y}-1 > \alpha y > y\,e^{\alpha b} > y\,e^{\alpha x}$ for $0<x<x+y<b$.
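
A quick numerical check of this partial example (a sketch; the concrete numbers $b=0.1$, $\alpha=2$ are an arbitrary pair satisfying $\alpha > e^{\alpha b}$, since $2 > e^{0.2}\approx 1.22$):

```python
# Verify numerically that f(x) = exp(alpha*x) on (0, b) satisfies
# f(x+y) > f(x) * (1 + y * f(x)) when alpha > exp(alpha*b).
import math
import random

b, alpha = 0.1, 2.0
assert alpha > math.exp(alpha * b)
f = lambda x: math.exp(alpha * x)
for _ in range(100000):
    x = random.uniform(1e-6, b / 2)
    y = random.uniform(1e-6, b / 2)          # then 0 < x and x + y <= b
    assert f(x + y) > f(x) * (1 + y * f(x))
```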

  • Supposing $f$ is continuous at $y$ and taking $x\to 0$ from above shows $f(y) \geq A e^{Ay}$ where $A := \lim_{x \to 0^+} f(x)$. We can drop the assumption that $f$ is continuous at $y$ by considering an increasing sequence of points $y_n$ at which $f$ is continuous with $y_n \to y$. This is of course possible since $f$ is monotone and so has at most countably many discontinuities. The conclusion is that $f(x) \geq Ae^{Ax}$ holds for all $x>0$, so the "partial examples" in your answer are minimal in some sense.