
*(image of the original question, with Case II, not shown)*

Hello,

I need your help in clearing up my concepts. I have a problem with Case II of the above question.

It is given that $f''(x) > 0$, from which it can be deduced that $f'(x)$ is strictly increasing, i.e. $f'(a) > f'(b)$ for every $a > b$ in the domain of the function. Then how can $f'(x) - f'(1-x) < 0$ hold for all $x$ in $(0,1)$?

Please point out where I am going wrong.

Thank you

1 Answer

It is not claimed that this is true for all $x$ in $(0,1)$. It's just saying that if $g(x)$ is decreasing on some subinterval $(a,b)\subseteq(0,1)$, then $g'(x)<0$ for all $x\in (a,b)$ and hence $f'(x)-f'(1-x)<0$ for all $x\in (a,b)$. As you have observed, this cannot be true for all $x\in (0,1)$ since $f'$ is increasing, and indeed the text ends up concluding that this only happens for $x\in(0,1/2)$.
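Since the original image is not available, here is a sketch of the step in question. It assumes, consistently with the derivative quoted above, that the function under study is $g(x) = f(x) + f(1-x)$ (this choice of $g$ is an assumption, not taken from the original problem):

```latex
% Assumption: g(x) = f(x) + f(1-x), so that g'(x) = f'(x) - f'(1-x),
% matching the expression used in the answer above.
\begin{align*}
g'(x) &= f'(x) - f'(1-x), \\
f''(x) > 0 \;&\Longrightarrow\; f' \text{ is strictly increasing}, \\
g'(x) < 0 \;&\Longleftrightarrow\; f'(x) < f'(1-x)
           \;\Longleftrightarrow\; x < 1-x
           \;\Longleftrightarrow\; x < \tfrac{1}{2}.
\end{align*}
```

So $g$ is decreasing precisely on $(0, 1/2)$, not on all of $(0,1)$, which is exactly the conclusion the text reaches.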
