
I say no, because if $f'(a)>0$, then $f$ is increasing at $a$, so $a$ cannot be the maximum on the interval.

Is this correct, and how may I start a formal proof of it? As always, much appreciated.

  • 2
    What does it mean for a function to be increasing at a single point? (2017-01-31)
  • 3
    Hint for a formal proof: recall that the definition of the derivative involves $(f(a+h)-f(a))/h$. What happens for small positive $h$? (2017-01-31)
  • 1
    @UmbertoP.: A function $f$ defined in some neighborhood of a point $a$ is said to be *increasing at point $a$* if there is a neighborhood $I$ of $a$ such that $x\in I, x < a \Rightarrow f(x)\leq f(a)$ and $x \in I, x > a\Rightarrow f(x) \geq f(a)$. The definition can be adapted to the strict versions by using strict inequalities. If $f'(a) > 0$, then $f$ is strictly increasing at $a$, but the converse need not be true. (2017-02-01)
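
To make the last comment concrete, here is a standard example (my own illustration, not from the thread) of the converse failing: $f(x)=x^3$ is strictly increasing at $0$ (indeed on all of $\mathbb{R}$), yet

$$f'(0)=\lim_{h\to 0}\frac{h^3-0}{h}=\lim_{h\to 0}h^2=0,$$

so strict increase at a point does not force $f'(a)>0$.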

2 Answers

4

Assume, for contradiction, that $a$ is a (local) maximum. Then there exists $\epsilon>0$ such that $f(a+h) \leq f(a)$ for all $0 < h < \epsilon$. In particular

$$\frac{f(a+h)-f(a)}{h} \leq 0$$

and thus, since $f'(a)$ exists, letting $h \to 0^+$ gives

$$f'(a)=\lim_{h\to 0^+}\frac{f(a+h)-f(a)}{h} \leq 0,$$

contradicting the assumption $f'(a)>0$.
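
As a quick sanity check (my own example, not part of the answer), take $f(x)=1-(x-2)^2$, which has its maximum at $a=2$: for $h>0$,

$$\frac{f(2+h)-f(2)}{h}=\frac{-h^2}{h}=-h\leq 0,$$

and letting $h\to 0^+$ gives $f'(2)=0\leq 0$, consistent with the inequality obtained above.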

2

Take $\delta>0$ such that $0<h<\delta$ implies $\frac{f(a+h)-f(a)}{h}>0$ (such a $\delta$ exists because $f'(a)>0$), which implies $f(a+h)>f(a)$ for all $h\in ]0,\delta[$. Hence $f(a)$ cannot be a (local) maximum value.
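
For a concrete instance of this argument (my own example, not from the answer), take $f(x)=x^2$ and $a=1$, so $f'(1)=2>0$. For every $h>0$,

$$\frac{f(1+h)-f(1)}{h}=\frac{2h+h^2}{h}=2+h>0,$$

hence $f(1+h)>f(1)$ for all $h>0$, so $1$ is not a local maximum of $f$.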