
Does the fact that $\frac{f(x+1)}{f(x)}$ is increasing in $x$ imply that $f(x)$ is decreasing in $x$?

I somewhat feel that this should be the case, but don't see how to show it.

  • The ratio wouldn't notice if we multiplied $f(x)$ by, say, $2+\sin(2\pi x)$, or any other function with period $1$ and no roots (for instance, the constant $-1$, as pointed out in an answer). This poses a problem for any claim that $f$ is increasing or decreasing. (2017-01-07)

2 Answers


For $f(x)=1/x$ and $g(x)=-1/x$, we have $\frac{f(x+1)}{f(x)}=\frac{g(x+1)}{g(x)}=\frac{x}{x+1}$ increasing to $1$, but $f$ is decreasing while $g$ is increasing.
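The answer above can be sanity-checked numerically. This is a quick sketch (the function names `f` and `g` are just for illustration) showing that the two functions share the same ratio $\frac{x}{x+1}$, which increases toward $1$, while $f$ decreases and $g$ increases.

```python
# Counterexample check: f(x) = 1/x and g(x) = -1/x have the same
# ratio f(x+1)/f(x) = g(x+1)/g(x) = x/(x+1), yet opposite monotonicity.
def f(x):
    return 1 / x

def g(x):
    return -1 / x

xs = [1.0, 2.0, 5.0, 10.0, 100.0]
ratios_f = [f(x + 1) / f(x) for x in xs]
ratios_g = [g(x + 1) / g(x) for x in xs]

# The two ratio sequences coincide and increase toward 1...
assert ratios_f == ratios_g
assert all(a < b < 1 for a, b in zip(ratios_f, ratios_f[1:]))
# ...while f is decreasing and g is increasing on these sample points.
assert all(f(a) > f(b) for a, b in zip(xs, xs[1:]))
assert all(g(a) < g(b) for a, b in zip(xs, xs[1:]))
print("counterexample verified")
```

Of course, a few sample points don't prove monotonicity; here the calculus is trivial, and the script only illustrates the claim.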


No: $f$ can instead increase very fast. Restricting to the domain $x \ge 0$ is enough; try $f(x)=e^{x^2}$, for example:
$$\frac{f(x+1)}{f(x)}=\frac{e^{(x+1)^2}}{e^{x^2}}=e^{2x+1},$$
which is increasing in $x$ even though $f$ is increasing.
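This example can likewise be checked numerically. The sketch below (my addition, not part of the answer) verifies that $f(x)=e^{x^2}$ and its ratio $e^{2x+1}$ both increase on a few points with $x \ge 0$.

```python
import math

# f(x) = exp(x**2) is increasing on x >= 0, and so is the
# ratio f(x+1)/f(x), which should equal exp(2x+1).
def f(x):
    return math.exp(x ** 2)

def ratio(x):
    return f(x + 1) / f(x)

xs = [0.0, 0.5, 1.0, 2.0, 3.0]
# f itself increases on these points...
assert all(f(a) < f(b) for a, b in zip(xs, xs[1:]))
# ...and so does the ratio, matching exp(2x+1) up to rounding.
assert all(ratio(a) < ratio(b) for a, b in zip(xs, xs[1:]))
assert all(math.isclose(ratio(x), math.exp(2 * x + 1)) for x in xs)
print("f and its ratio both increase")
```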