3

Using the Mean Value Theorem, prove that if $f'(x)>0$ at some point in $[a, b]$, then the set of points in $[a, b]$ for which $f'(x)>0$ is infinite.

Edit:

The function $f(x)$ is differentiable in $[a, b]$.

I am stuck on this problem and can't find a solution. Please help me.

  • 5
I think you need a stronger condition than what you have. I believe that $f'$ needs to be continuous; otherwise it could be $0$ at just a single point. (2017-01-22)
  • 0
It's not necessary for $f$ to be differentiable on $[a,b]$. (2017-01-22)

2 Answers

3

If $f'(x)>0$ for all $x\in[a,b]$, we are done. Otherwise, suppose there is a point $x_0$ with $f'(x_0)\le 0$. $f'$ has the intermediate value property, so if it takes a value $y_0>0$ at some point $x_1$, it takes every value $y$ with $0<y<y_0$ at some point between $x_0$ and $x_1$. Distinct values are attained at distinct points, so $f'>0$ at infinitely many points.
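To make the "infinitely many points" step explicit, here is a sketch filling in the Darboux argument above (the names $x_1$, $t_n$ and the index $n$ are mine, not from the answer):

```latex
Suppose $f'(x_1)=y_0>0$ and $f'(x_0)\le 0$. For each integer $n\ge 1$ put
$y_n = \frac{y_0}{n+1}$, so that $0 < y_n < y_0$. By Darboux's theorem there is
a point $t_n$ strictly between $x_0$ and $x_1$ with $f'(t_n) = y_n$. The values
$y_n$ are pairwise distinct, so the points $t_n$ are pairwise distinct, and
$f'(t_n) = y_n > 0$ for every $n$: infinitely many points with $f' > 0$.
```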

  • 1
Why does the intermediate value theorem apply to $f'$? (2017-01-22)
  • 0
@Neal it will apply if $f'$ is continuous; otherwise the theorem isn't actually true. I suspect that we're missing some of the question. (2017-01-22)
  • 0
All functions which arise as derivatives have the intermediate value property; see [here](https://en.wikipedia.org/wiki/Darboux's_theorem_(analysis)). (2017-01-22)
  • 0
@lordoftheshadows the derivative always has the IVP regardless of any assumptions. It is called the Darboux theorem. (2017-01-22)
  • 0
Yes, [Darboux's Theorem](https://en.wikipedia.org/wiki/Darboux's_theorem_(analysis)). (2017-01-22)
  • 1
The OP does not state that $f$ is differentiable on $[a,b]$, only at one point. That excludes Darboux's theorem absent additional hypotheses. (2017-01-22)
  • 1
@vadim123 There are functions which are differentiable at only one point and discontinuous everywhere else. So if you say that $f$ is differentiable only at one point, the conclusion is not true. Take for example $f$ to be $x+x^2$ for rational $x$ and $x-x^2$ for irrational $x$ on $[-1,1]$. It has derivative $1$ at $x=0$ but is discontinuous everywhere else. (2017-01-22)
  • 0
@vadim123 The function $f(x)$ is differentiable in $[a, b]$. (2017-01-22)
  • 0
OK, it seems it has been clarified by the OP. But I still have 0 votes :) (2017-01-22)
0

(Proof without the MVT)

Let $c\in[a,b]$ be such that $f'(c)>0$. For the limit $$f'(c)=\lim_{h\to0}\frac{f(c+h)-f(c)}{h}>0$$ to exist, $f$ must be defined on a neighborhood $N_\delta(c)$ for some $\delta>0$. This says that $f$ is continuous for $x\in N_\delta(c)$. Since $f'(c)>0$, there exists $\alpha$ with $f'(c)>\alpha>0$, and by continuity there exists $c_0\in N_\delta(c)$ with $f(c_0)=\alpha$. Then $f'(x)>0$ on $(c,c_0)$ or $(c_0,c)$.
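For comparison, the Mean-Value-Theorem route the question actually asks for can be sketched as follows (my sketch, assuming, per the OP's edit, that $f$ is differentiable on all of $[a,b]$, and that $c<b$; if $c=b$, use negative $h$ instead):

```latex
Since $\lim_{h\to 0^+}\frac{f(c+h)-f(c)}{h}=f'(c)>0$, there is $\delta>0$ with
$\frac{f(c+h)-f(c)}{h}>0$ for all $0<h<\delta$. Pick $h_1<\delta$; by the MVT
there is $\xi_1\in(c,c+h_1)$ with $f'(\xi_1)=\frac{f(c+h_1)-f(c)}{h_1}>0$.
Next pick $h_2<\xi_1-c$; the MVT gives $\xi_2\in(c,c+h_2)$ with $f'(\xi_2)>0$
and $\xi_2<\xi_1$. Iterating produces a strictly decreasing sequence
$\xi_1>\xi_2>\cdots$ of points in $[a,b]$ with $f'(\xi_n)>0$.
```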

  • 0
I'm afraid I lost the thread at $f(c_0)=\alpha$. (2017-01-22)
  • 0
@Momo We pick $c_0$ from the interval on which $f$ is continuous. (2017-01-22)
  • 0
What I mean is that you may add or subtract any constant from $f$ without changing $f'$. So $f$ can be made negative on a neighborhood of $c$ WLOG, and I don't understand where $f(c_0)=\alpha>0$ is coming from. (2017-01-22)