Let $U\subset \mathbb R^m$ and $V\subset \mathbb R^n$ be open sets, let $f:U\to \mathbb R^n$ be a Lipschitz map, and let $g:V\to \mathbb R^p$ be a differentiable function. Suppose $f(U)\subset V$, $a\in U$, and $b=f(a)$.
I want to prove that if $g'(b)=0$, then $g\circ f:U\to \mathbb R^p$ is differentiable at the point $a$ with $(g\circ f)'(a)=0$.
If $f$ were differentiable, this would be a straightforward application of the chain rule. Instead, $f$ is only Lipschitz, i.e., there is a constant $c\ge 0$ such that for all $x,y\in U$
$$|f(x)-f(y)|\le c|x-y|$$
In other words, for every $x\in U$ and every $u\in \mathbb R^m$ small enough that $x+u\in U$, we have:
$$|f(x+u)-f(x)|\le c|u|$$
We also know that $\frac{g(b+v)-g(b)}{|v|}\to 0$ as $v\to 0$ (this is exactly what $g'(b)=0$ says), and we want to prove that $\lim_{v\to 0}\frac{g(f(a+v))-g(f(a))}{|v|}=0$.
My attempt was to use the triangle inequality and the squeeze theorem, but I could not see how to relate the expressions above.
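Not a proof, of course, but as a sanity check I tried the claim numerically with a concrete Lipschitz, non-differentiable $f$. The particular choices $f(x)=|x|$, $g(y)=y^2$, $a=0$ below are mine, just for illustration ($f$ is Lipschitz with $c=1$ and not differentiable at $0$, and $g'(b)=g'(0)=0$):

```python
import math

# Sanity check (not a proof): pick f Lipschitz but NOT differentiable at a,
# and g with g'(b) = 0, then watch the difference quotient of g∘f at a.
f = abs                     # f(x) = |x|, Lipschitz constant c = 1
g = lambda y: y * y         # g(y) = y^2, so g'(0) = 0
a = 0.0
b = f(a)                    # b = 0

for k in range(1, 8):
    v = 10.0 ** (-k)        # shrink the increment toward 0
    q = (g(f(a + v)) - g(f(a))) / abs(v)   # difference quotient of g∘f
    print(f"v = {v:.0e}, quotient = {q:.3e}")
```

The quotient here is $v^2/|v| = |v|$, which visibly goes to $0$, consistent with the statement I am trying to prove.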