There is an exercise in Stephen Abbott's Understanding Analysis that states:
Exercise 5.3.7 (b). Show that the function $g(x)=\begin{cases} x/2+x^2\sin(1/x)&\text{ if }x\neq0\\ 0&\text{ if }x=0 \end{cases}$ is differentiable on $\mathbb{R}$ and satisfies $g'(0)\geq0$. Now, prove that $g$ is not increasing over any open interval containing $0$.
First of all, I know that for $x=0$,
$$g'(0)=\lim_{x\to0}\frac{x/2+x^2\sin(1/x)}{x}=\lim_{x\to0}\left[\frac{1}{2}+x\sin\left(\frac{1}{x}\right)\right]=\frac{1}{2}\geq0,$$
since $|x\sin(1/x)|\leq|x|\to0$, and for $x\neq0$,
$$g'(x)=\frac{1}{2}+2x\sin\left(\frac{1}{x}\right)-\cos\left(\frac{1}{x}\right).$$
Hence,
$$g'(x)=\begin{cases} 1/2+2x\sin(1/x)-\cos(1/x)&\text{ if }x\neq0\\ 1/2&\text{ if }x=0, \end{cases}$$
and $g$ is differentiable on $\mathbb{R}$.
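As a quick sanity check on this formula (a small numerical sketch, assuming nothing beyond the definition of $g$ above, and not part of any proof), a central difference approximation of $g'$ agrees with the closed-form expression away from $0$:

```python
import math

def g(x):
    # the function from the exercise
    return x/2 + x**2 * math.sin(1/x) if x != 0 else 0.0

def gprime(x):
    # the closed-form derivative for x != 0
    return 0.5 + 2*x*math.sin(1/x) - math.cos(1/x)

h = 1e-6
for x in (0.3, 0.05, 0.01):
    central = (g(x + h) - g(x - h)) / (2*h)  # central difference approximation
    print(f"x = {x}: formula = {gprime(x):+.6f}, central diff = {central:+.6f}")
```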
However, I do not know how to formally show that if $(a,b)$ is an open interval containing $0$, then $g$ is not increasing on it. The idea I have is that as $x$ approaches $0$, the term $\cos(1/x)$ oscillates faster and faster, so you can always find two points $x,y\in(a,b)$ such that $g'(x)>0$ and $g'(y)<0$.
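To illustrate this picture numerically (a minimal Python sketch; the sample points $x_n=1/(2n\pi)$ and $y_n=1/((2n+1)\pi)$ are my own choice, made so that the cosine term equals $1$ and $-1$ respectively), the sign of $g'$ keeps flipping arbitrarily close to $0$:

```python
import math

def gprime(x):
    # g'(x) = 1/2 + 2x*sin(1/x) - cos(1/x) for x != 0
    return 0.5 + 2*x*math.sin(1/x) - math.cos(1/x)

for n in range(1, 6):
    x = 1/(2*n*math.pi)        # here cos(1/x) = 1,  so g'(x) = -1/2
    y = 1/((2*n + 1)*math.pi)  # here cos(1/y) = -1, so g'(y) = 3/2
    print(f"n = {n}: g'({x:.5f}) = {gprime(x):+.4f}, g'({y:.5f}) = {gprime(y):+.4f}")
```

Is this a valid assertion? If so, how can I go about showing it? Thanks in advance.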