Let $u:\mathbb{R}\to\mathbb{R}^3$, $u(t)=(u_1(t),u_2(t),u_3(t))$, be a function that satisfies $\frac{d}{dt}|u(t)|^2+|u(t)|^2\le 1,\tag{1}$ where $|\cdot|$ is the Euclidean norm. According to inequality (2.10) in Section 2.2, page 32, of Temam's book, inequality (1) implies $|u(t)|^2\le|u(0)|^2\exp(-t)+1-\exp(-t),\tag{2}$ but I do not understand why (1) implies (2).
Inequality in a differential equation
1 Answer
The basic argument goes like this. Let $f(t) = |u(t)|^2$, so that inequality (1) says $f'(t) + f(t) \leq 1$. Assuming for the moment that $f(t) < 1$, we can rewrite this as $\frac{f'(t)}{1-f(t)}\leq 1.$ Let $g(t) = \log(1 - f(t))$. Since $g'(t) = -\frac{f'(t)}{1-f(t)}$, the inequality above says exactly that $-g'(t)\leq 1.$ It follows that $g(t) = g(0) + \int_0^t g'(s)\,ds\geq g(0) - t.$ Plugging $\log(1 - f(t))$ back in for $g(t)$, we see that $\log(1-f(t)) \geq \log(1 - f(0)) - t.$ Exponentiating both sides gives $1 - f(t) \geq e^{-t}(1 - f(0)),$ which rearranges to $f(t) \leq f(0)e^{-t} + 1 - e^{-t},$ exactly the inequality (2) you are looking for.
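If it helps to see the bound in action, here is a quick numerical sanity check (a minimal sketch; the forcing term $0.3\sin^2 t$, the initial value, and the step size are arbitrary choices of mine, not from Temam):

```python
# Sanity check: integrate f' = 1 - f - 0.3*sin(t)^2, so f' + f <= 1 holds,
# then verify that f(t) stays below the claimed bound f(0)e^{-t} + 1 - e^{-t}.
import numpy as np

f0, dt = 4.0, 1e-4          # f0 > 1 also exercises the case in the Edit below
ts = np.arange(0.0, 5.0, dt)
f = np.empty_like(ts)
f[0] = f0
for i in range(1, len(ts)):
    # forward Euler step for f' = 1 - f - 0.3*sin(t)^2
    f[i] = f[i - 1] + dt * (1.0 - f[i - 1] - 0.3 * np.sin(ts[i - 1]) ** 2)

bound = f0 * np.exp(-ts) + 1.0 - np.exp(-ts)
print("bound violated anywhere?", bool((f > bound + 1e-6).any()))  # expect: False
```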
Edit: In the case when $f(t)>1$, the argument above doesn't apply because $g(t)$ is not defined. Instead, take $g(t) = \log(f(t) - 1)$; then $g'(t) = \frac{f'(t)}{f(t)-1} \leq \frac{1-f(t)}{f(t)-1} = -1.$ It follows that $g(t) \leq g(0) - t$ (at least for $t$ small enough that $g$ remains defined), and exponentiating gives $f(t) - 1 \leq e^{-t}(f(0) - 1),$ which again rearranges to the desired inequality (2).
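Alternatively, both cases (and the boundary case $f(t)=1$, where neither logarithm is defined) can be handled at once with the standard integrating-factor trick: multiplying $f'(t) + f(t) \le 1$ by $e^t$ gives $\frac{d}{dt}\big(e^t f(t)\big) = e^t\big(f'(t) + f(t)\big) \le e^t.$ Integrating from $0$ to $t$ yields $e^t f(t) - f(0) \le e^t - 1,$ and multiplying through by $e^{-t}$ gives (2) directly.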
@Fabian: Edited to include this case. – 2012-05-12