
I have a homework question to answer, which is:

True or False: Between two sequential roots of $f'(x)$ there is at most one root of $f(x)$.

I think this is true, since $f(x)$ would be monotone unless $f'(x)$ is equal to $0$ on an entire interval - but in that case, what would two sequential roots even be?

Can someone help me with this question?

Thanks :)

  • Not true if $f$ isn't differentiable at every point of the interval, even if it is continuous. But it is true if $f$ is differentiable everywhere on the interval.

4 Answers

2

Suppose $f$ has two roots $x=a$ and $x=b$. That is $f(a)=f(b)=0$. Assuming $f'$ exists on $[a,b]$, by the Mean Value Theorem (or Rolle's Theorem if you prefer), there is a $c\in(a,b)$ with $f'(c)=0$.

This implies that if $c$ and $d$ are sequential roots of $f'$ and if $f'$ exists on $[c,d]$, then $f$ has at most one root between $c$ and $d$.
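
To spell out the "at most one" step (a sketch of the contrapositive, in my wording rather than the answerer's): if $f$ had two roots $p<q$ strictly between the sequential roots $c$ and $d$ of $f'$, then by Rolle's Theorem

$$f(p)=f(q)=0 \quad\Longrightarrow\quad \text{there is } r\in(p,q)\subset(c,d) \text{ with } f'(r)=0,$$

contradicting the assumption that $f'$ has no roots strictly between $c$ and $d$.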


Added:

I am interpreting "between two sequential roots of $f'$" to mean that there is a $c$ and a $d$ with $f'(c)=f'(d)=0$ and $f'(x)$ is non-zero for all $x$ between $c$ and $d$.

  • Thanks :) I get this, mostly - but only visually. How do you explain the implication mathematically?
  • @Jason I thought I just did :) Simply put, the MVT implies that $f'$ has a root between any two roots of $f$. If $a<b$ and $f(a)=f(b)=0$, then there exists a $c$ with $a<c<b$ and $f'(c)=0$.
  • Yeah, that's great, but why does it imply the "at most one" part?
  • I think this answer is false, beginning at the line "This implies that ..."
  • @Jason If $f'$ had roots at 1 and 5, and if $f$ had roots at 2 and 4, then $f'$ would have a root between 2 and 4; and thus the roots 1 and 5 of $f'$ would not be *sequential*. In the problem, you are considering two roots of $f'$, and you know that $f'$ has no roots in between these two (that's why they are called sequential).
  • Oh OK, that clears it up. Thanks :)
2

When $f'$ is not continuous, very strange things may occur. So let's assume that $f'$ is continuous, that $f'(a)=f'(b)=0$, and that $f'(x)\ne0$ for $a<x<b$. Then, by the intermediate value theorem, either $f'(x)>0$ for all $x\in\ ]a,b[\ $ or $f'(x)<0$ for all $x\in\ ]a,b[\ $. Assuming the former, it follows that $f$ is strictly increasing on $[a,b]$, so there is at most one zero of $f$ in this interval.
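
A concrete check (my example, not part of the original answer): take $f(x)=\sin x$, so that

$$f'(x)=\cos x,\qquad f'\!\left(\tfrac{\pi}{2}\right)=f'\!\left(\tfrac{3\pi}{2}\right)=0,\qquad f'(x)<0 \text{ for } x\in\ \left]\tfrac{\pi}{2},\tfrac{3\pi}{2}\right[\ .$$

Here $f$ is strictly decreasing on $\left[\tfrac{\pi}{2},\tfrac{3\pi}{2}\right]$ and has exactly one root there, namely $x=\pi$.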

0

Hint: http://en.wikipedia.org/wiki/Mean_value_theorem
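
For reference, the theorem states: if $f$ is continuous on $[a,b]$ and differentiable on $(a,b)$, then there is some $c\in(a,b)$ with

$$f'(c)=\frac{f(b)-f(a)}{b-a}.$$

In the special case $f(a)=f(b)$ this gives $f'(c)=0$, which is Rolle's Theorem.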

  • Hmm, I was thinking of using this somehow - I just have not quite figured it out yet, but it's good to see I'm in the right direction. Do feel free to hint some more :P
0

It must be true, because a root of $f'(x)$ is a maximum or minimum of $f(x)$, so on the interval between two of them the function is either decreasing or increasing; therefore there can't be more than one root of $f(x)$.

Indeed, it can be a curving point instead. But even if it is a curving point, there is still only one root in this interval.
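
A concrete instance of the extremum case (my example, not the answerer's): take

$$f(x)=x^3-3x,\qquad f'(x)=3x^2-3=3(x-1)(x+1).$$

The sequential roots $x=-1$ and $x=1$ of $f'$ are a local maximum and a local minimum of $f$, and between them $f$ has exactly one root, $x=0$ (its other roots $\pm\sqrt{3}$ lie outside $\ ]-1,1[\ $).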

  • The root of $f'(x)$ is not necessarily a max or min point - it can be a curving point, for example as in $x^3$.
  • Just for info: it's called a "point of inflexion".
  • @TonyK Thanks :) I am translating these terms, as I am not learning in English - but thanks for the extra knowledge.