Let $g(x)=(1-x^2)^2$ for $x\in[-1,1]$ and $g(x)=0$ for $|x|>1$. One can check that $g$ is differentiable and, in fact, that its derivative is continuous. $g$ is a differentiable "bump" that starts rising at $x=-1$, reaches its peak value of $1$ at $x=0$, and returns to $0$ at $x=1$.
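As a quick numerical sanity check (the helper names `g` and `g_prime` are mine), the two one-sided formulas for the derivative agree at $x=\pm1$, which is where continuity could fail:

```python
def g(x):
    """The bump: (1 - x^2)^2 on [-1, 1], zero outside."""
    return (1 - x*x)**2 if abs(x) <= 1 else 0.0

def g_prime(x):
    """Piecewise derivative: -4x(1 - x^2) on [-1, 1], zero outside."""
    return -4.0 * x * (1 - x*x) if abs(x) <= 1 else 0.0

# Both one-sided formulas vanish at x = +/-1, so g' is continuous there;
# away from the seams each piece is a polynomial (or zero), hence continuous.
print(g(0.0), g(1.0))                         # peak value 1, edge value 0
print(g_prime(1.0), g_prime(0.999), g_prime(1.001))
```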
Now, define $f(x)$ as:
$$f(x)=\sum_{k=1}^{\infty} \frac{1}{k}g(k^2(x-k))$$
$f(x)$ can be seen as a function which is mostly zero except with bumps with a maximum of $1/k$ in the intervals $(k-1/k^2,k+1/k^2)$, where $k$ is a positive integer.
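A small numerical illustration (truncating the series at a cutoff $K$ of my choosing): at each integer $k$ only the $k$-th bump contributes, giving $f(k)=1/k$, and between the bumps the sum vanishes.

```python
def g(x):
    return (1 - x*x)**2 if abs(x) <= 1 else 0.0

def f(x, K=300):
    # truncation of f(x) = sum_{k>=1} g(k^2 (x - k)) / k
    return sum(g(k*k * (x - k)) / k for k in range(1, K + 1))

for k in (1, 5, 10, 100):
    print(k, f(float(k)))   # f(k) = 1/k: only the k-th bump is nonzero at x = k
print(f(10.5))              # between bumps the sum is exactly 0
```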
Claim: $\lim_{x\to\infty} f(x) = 0$ and $\lim_{x\to\infty} f'(x)$ does not exist.
Proof: Now, $f(k)=\frac{1}{k}$ for all positive integers $k$, and if $x>k$, then $0\leq f(x)<\frac{1}{k}$. This shows that $f(x)\to 0$ as $x\to\infty$.
Also $f(k-1/k^2)=0$. By the mean value theorem, this means that for some $c\in (k-1/k^2,k)$, $f'(c)=\frac{f(k)-f(k-1/k^2)}{1/k^2} = k$. This shows that $f'(x)$ does not converge to zero. In fact, it is not even bounded.
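The secant slope in that mean value theorem argument can be computed directly (same truncated `f` as above, helper names mine): from the left edge of the $k$-th bump to its peak, the slope is exactly $k$.

```python
def g(x):
    return (1 - x*x)**2 if abs(x) <= 1 else 0.0

def f(x, K=300):
    return sum(g(k*k * (x - k)) / k for k in range(1, K + 1))

# Secant slope over [k - 1/k^2, k] is (1/k - 0)/(1/k^2) = k, so by the
# mean value theorem f' takes values of size k arbitrarily far out.
for k in (5, 10, 50):
    a, b = k - 1.0 / k**2, float(k)
    print(k, (f(b) - f(a)) / (b - a))
```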
However, if $f(x)$ converges to a finite value, and $f'(x)$ converges at all, then $f'(x)$ converges to zero. You can prove this again with the mean value theorem. See the proof later in the answer.
If $f(x)$ is concave or convex, then $f'(x)$ is monotonic, so if $f'(x)$ does not converge to any value, then for some $N$ we have $f'(x)>1$ for all $x>N$ or $f'(x)<-1$ for all $x>N$. By the mean value theorem this is impossible if $f(x)\to L$, since it would force $|f(x+1)-f(x)|>1$ for $x>N$.
This means that $f'(x)$ converges, and thus it must converge to $0$.
Originally, I wrote that it was true if $f(x)$ was monotonic, but I don't think that's so.
If we define $h(x)=\sum_{k=1}^{\infty} g(k^3(x-k))$ and $f(x)=\int_0^x h(t)\,dt$, then $f'(x)=h(x)$ does not converge to zero (it equals $1$ at every positive integer and $0$ between the bumps), but $f(x)$ is monotonically increasing and bounded by $2\sum_k \frac{1}{k^2}$, so $f(x)$ converges while $f'(x)$ does not.
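To see numerically why this works (truncated sums, helper names mine): $h$ keeps hitting $1$ at every integer, while the total area under the bumps stays bounded, since $\int_{-1}^{1} g = \frac{16}{15}$ and the $k$-th bump is compressed by a factor $k^3$.

```python
def g(x):
    return (1 - x*x)**2 if abs(x) <= 1 else 0.0

def h(x, K=300):
    # h(x) = sum_k g(k^3 (x - k)): bumps of height 1 and width 2/k^3 at x = k
    return sum(g(k**3 * (x - k)) for k in range(1, K + 1))

print(h(5.0), h(200.0))   # h(k) = 1 at every integer k, so h does not tend to 0

# The k-th bump integrates to (1/k^3) * (16/15), so f(x) = int_0^x h is
# increasing and bounded by sum_k 16/(15 k^3) < 2 sum_k 1/k^2.
bound = sum(16.0 / (15.0 * k**3) for k in range(1, 301))
print(bound)
```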
We could even make $h(x)$ unbounded by taking $h(x)=\sum_{k=1}^{\infty} kg(k^4(x-k))$ instead.
If you want a counterexample with $f$ strictly increasing, use $1-e^{-x} + \int_{0}^x h(t)\,dt$.
One last thing I didn't prove: If $f'(x)$ and $f(x)$ both converge to finite values, then $f'(x)$ converges to $0$.
The key is the result:
Lemma: If $f$ is differentiable and $\lim_{x\to\infty} f(x)=L$, then $\liminf_{x\to\infty} |f'(x)|=0$.
Proof:
This follows from the mean value theorem.
Since $f(x)\to L$, there is an $N$ so that for $x>N$, $|f(x)-L|<1/2$. Thus, for $x>N$ and any $D>0$, $|f(x+D)-f(x)|<1$.
Now, given any $\epsilon>0$ and any $M$, pick $x>\max(M,N)$ and set $D=\frac{1}{\epsilon}$.
By the mean value theorem, there must be a $c\in[x,x+D]$ such that $$f'(c)=\frac{f(x+D)-f(x)}{D}=\epsilon(f(x+1/\epsilon)-f(x))$$ and thus $|f'(c)|<\epsilon$.
So $\liminf_{x\to\infty} |f'(x)|=0$.
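The lemma can be illustrated on the $f$ built above ($f\to 0$ but $f'$ unbounded): for any $\epsilon$ and $M$ of my choosing, the secant slope over $[x,x+1/\epsilon]$ with $x>M$ is already below $\epsilon$, so the mean value theorem produces a point $c>M$ with $|f'(c)|<\epsilon$ (truncated series, helper names mine).

```python
def g(x):
    return (1 - x*x)**2 if abs(x) <= 1 else 0.0

def f(x, K=300):
    return sum(g(k*k * (x - k)) / k for k in range(1, K + 1))

# Given eps and M, take D = 1/eps and any x > M; since f is eventually
# within 1/2 of its limit 0, the secant slope on [x, x+D] is below eps.
eps, M = 0.01, 100.0
x, D = 101.0, 1.0 / eps
quotient = abs(f(x + D) - f(x)) / D   # = |1/201 - 1/101| / 100
print(quotient, quotient < eps)
```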
Corollary:
If $f(x)$ and $f'(x)$ converge to finite values as $x\to\infty$, then $\lim_{x\to\infty} f'(x)=0$.
Proof: If $f'(x)$ converges, then $|f'(x)|$ converges, and its limit equals $\liminf_{x\to\infty} |f'(x)|=0$ by the lemma. Hence $|\lim_{x\to\infty} f'(x)|=0$, which gives the result.
Finally, if $f(x)\to\infty$, then $\alpha=\liminf |f'(x)|$ can take any value. For example, $f(x)=\alpha x$ for $\alpha\neq 0$ gives $f'(x)=\alpha$, while $$f(x)=\begin{cases}\log x&x\geq1\\
x-1&x<1\end{cases}$$
has $f(x)\to\infty$ as $x\to\infty$ but $f'(x)\to 0$.
For concave $f$, $\lim f'(x)$ always exists in $[-\infty,\infty]$, but without concavity you can get any values $\alpha\leq\beta$ for $\alpha=\liminf f'(x)$ and $\beta=\limsup f'(x)$.