Does $\frac{f(x+1)}{f(x)}$ being increasing in $x$ imply that $f(x)$ is decreasing in $x$?
I have a feeling this should be the case, but I don't see how to show it.
No. For $f(x)=1/x$ and $g(x)=-1/x$ on the domain $x>0$, we have $\frac{f(x+1)}{f(x)}=\frac{g(x+1)}{g(x)}=\frac{x}{x+1}$, which increases to $1$, yet $f$ is decreasing while $g$ is increasing. So the monotonicity of the ratio determines nothing about the monotonicity of $f$ itself.
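A quick numerical sanity check of this counterexample (my own addition, not part of the original answer): the ratios for $f$ and $g$ coincide and increase toward $1$, while $f$ decreases and $g$ increases on $x>0$.

```python
# f(x) = 1/x is decreasing on x > 0, g(x) = -1/x is increasing,
# yet both have the same ratio f(x+1)/f(x) = g(x+1)/g(x) = x/(x+1).
f = lambda x: 1 / x
g = lambda x: -1 / x
xs = [0.5 + 0.1 * k for k in range(50)]

ratios_f = [f(x + 1) / f(x) for x in xs]
ratios_g = [g(x + 1) / g(x) for x in xs]

# The two ratios agree, and both increase toward 1.
assert all(abs(rf - rg) < 1e-12 for rf, rg in zip(ratios_f, ratios_g))
assert all(a < b for a, b in zip(ratios_f, ratios_f[1:]))

# But f is decreasing while g is increasing.
assert all(f(a) > f(b) for a, b in zip(xs, xs[1:]))
assert all(g(a) < g(b) for a, b in zip(xs, xs[1:]))
```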
No; $f$ can even be increasing, as long as it increases fast enough. Taking the domain $x \ge 0$ suffices: try $f(x)=e^{x^2}$, for example. Then $$\frac{f(x+1)}{f(x)}=\frac{e^{(x+1)^2}}{e^{x^2}}=e^{2x+1},$$ which is increasing in $x$ even though $f$ itself is increasing.
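A small numerical check of this second counterexample (again my own addition, not from the answer): on $x \ge 0$, $f(x)=e^{x^2}$ is increasing and so is the ratio $f(x+1)/f(x)=e^{2x+1}$.

```python
import math

# f(x) = exp(x^2) is increasing on x >= 0, and so is the ratio
# f(x+1)/f(x) = exp(2x + 1).
f = lambda x: math.exp(x * x)
xs = [0.1 * k for k in range(30)]

ratios = [f(x + 1) / f(x) for x in xs]

# Both f and the ratio are strictly increasing on x >= 0.
assert all(f(a) < f(b) for a, b in zip(xs, xs[1:]))
assert all(a < b for a, b in zip(ratios, ratios[1:]))

# The ratio matches the closed form exp(2x + 1) up to rounding.
assert all(abs(r - math.exp(2 * x + 1)) < 1e-6 * r for r, x in zip(ratios, xs))
```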