I recently saw the following inequality for complex numbers:
If $a,b\in\mathbb C$ and $|a + b|$ and $|a-b|$ are each less than or equal to 1, then
$|a| + |b|^2/2 \leq 1.$
How can one prove this?
First, $|b| \leq 1$, since
$ 2|b| = |b + a + b - a| \leq |b + a| + |b - a| \leq 2, $
as noted by DonAntonio.
The conditions $|a + b| \leq 1$ and $|a - b| \leq 1$ say that $a$ lies in the intersection of the closed disks of radius $1$ centered at $b$ and $-b$, a lens-shaped region symmetric about the origin.
Since we only care about the maximum possible modulus of $a$, we may rotate this region about the origin so that $b$ lies on the imaginary axis, placing the centers of the two disks at $\pm i|b|$.
The two circles intersect at $z = \pm \sqrt{1 - |b|^2}$, and these are the points of the lens farthest from the origin, so $|a| \leq \sqrt{1 - |b|^2}$.
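The same bound can also be checked without the picture: by the parallelogram law,
$ |a + b|^2 + |a - b|^2 = 2|a|^2 + 2|b|^2 \leq 2, $
so $|a|^2 + |b|^2 \leq 1$, i.e. $|a| \leq \sqrt{1 - |b|^2}$.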
Then, since the map $x \mapsto \sqrt{1-x^2} + x^2/2$ decreases from $1$ to $1/2$ in the interval $[0,1]$, we have
$ |a| + |b|^2/2 \leq \sqrt{1 - |b|^2} + |b|^2/2 \leq 1. $
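(To see the monotonicity claim: for $x \in (0,1)$,
$ \frac{d}{dx}\left(\sqrt{1-x^2} + \frac{x^2}{2}\right) = x\left(1 - \frac{1}{\sqrt{1-x^2}}\right) \leq 0, $
since $\sqrt{1-x^2} \leq 1$, and the values at the endpoints $x = 0$ and $x = 1$ are $1$ and $1/2$.)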
$2|a| = |a+b+a-b| \leq |a+b| + |a-b| \leq 2 \Longrightarrow |a| \leq 1$, and of course the same is true for $|b|$, so now the inequality's trivial.
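For what it's worth, here is a quick numerical sanity check of the claimed inequality (a minimal Python sketch; sampling $u = a+b$ and $v = a-b$ uniformly from the closed unit disk is just one convenient way to produce admissible pairs):

```python
import math
import random

def random_unit_disk():
    """Uniform random point in the closed unit disk."""
    r = math.sqrt(random.random())
    theta = 2 * math.pi * random.random()
    return complex(r * math.cos(theta), r * math.sin(theta))

worst = 0.0
for _ in range(100_000):
    u = random_unit_disk()          # u = a + b, so |a + b| <= 1
    v = random_unit_disk()          # v = a - b, so |a - b| <= 1
    a, b = (u + v) / 2, (u - v) / 2
    worst = max(worst, abs(a) + abs(b) ** 2 / 2)

print(worst)  # never exceeds 1; approaches 1 (e.g. when b = 0 and |a| = 1)
```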