Suppose I know that a function $f$ is strictly increasing on an interval $(a, b)$ and continuous at $a$ and $b$. Can I show that $f(a) < f(x)$ and $f(b) > f(x)$ for all $x \in (a, b)$ using the following argument?
Suppose there is a point $\gamma \in (a, b)$ such that $f(a) > f(\gamma)$. By continuity of $f$ at $a$, there exists some $\delta > 0$ such that $f(x) > f(\gamma)$ for all $x \in (a, a + \delta)$. Choosing such an $x$ with $x < \gamma$, we have $x < \gamma$ and $f(x) > f(\gamma)$, which contradicts the assumption that $f$ is strictly increasing on $(a, b)$. So we must have $f(a) \leq f(\gamma)$ for all $\gamma \in (a, b)$.
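To make the continuity step explicit (the particular choice of $\varepsilon$ here is mine, but it seems to be the natural one): apply the $\varepsilon$–$\delta$ definition of continuity at $a$ with
$$\varepsilon = f(a) - f(\gamma) > 0,$$
which produces a $\delta > 0$ such that, for all $x \in (a, a + \delta)$,
$$|f(x) - f(a)| < \varepsilon \quad\Longrightarrow\quad f(x) > f(a) - \varepsilon = f(\gamma).$$
Shrinking $\delta$ if necessary so that $a + \delta \leq \gamma$ guarantees that every such $x$ also satisfies $x < \gamma$.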
Now suppose there is some $\mu \in (a, b)$ such that $f(\mu) = f(a)$. Pick any $\mu'$ with $a < \mu' < \mu$. Because $\mu' < \mu$ and $f$ is strictly increasing on $(a, b)$, we must have $f(\mu') < f(\mu)$, and hence $f(\mu') < f(a)$. This contradicts the previous paragraph, which gives $f(a) \leq f(\mu')$. Hence $f(a) < f(\mu)$ for all $\mu \in (a, b)$.
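Chaining the inequalities makes the contradiction immediate:
$$f(a) \leq f(\mu') < f(\mu) = f(a),$$
which is impossible.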
The argument for $f(b) > f(x)$ for all $x \in (a, b)$ follows by a similar line of reasoning.
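For completeness, here is how I would set up the mirrored step (a sketch, using the same kind of $\varepsilon$ choice as at $a$): suppose $f(b) < f(\gamma)$ for some $\gamma \in (a, b)$. Applying continuity at $b$ with
$$\varepsilon = f(\gamma) - f(b) > 0$$
yields a $\delta > 0$ such that $f(x) < f(b) + \varepsilon = f(\gamma)$ for all $x \in (b - \delta, b)$. Shrinking $\delta$ so that $b - \delta \geq \gamma$, every such $x$ satisfies $x > \gamma$ but $f(x) < f(\gamma)$, contradicting strict monotonicity. Hence $f(b) \geq f(\gamma)$ for all $\gamma \in (a, b)$, and equality is ruled out exactly as in the argument at $a$.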