32

I recently saw a lecturer prove the following theorem (assuming the result that every analytic function is locally 1-1 whenever its derivative is nonzero): Let $\Omega \subset \mathbb{C}$ be open, and let $f : \Omega \to \mathbb{C}$ be 1-1 and analytic on $\Omega$. Then $f'(z_0) \not = 0$ for every $z_0 \in \Omega$.

I got the basic idea behind the proof: we assume for contradiction that $f'(z_0) = 0$, and, assuming without loss of generality that $z_0 = f(z_0) = 0$, we have (from the power-series expansion) that $f(z) = z^kg(z)$ for some analytic $g$ in some disk at the origin (i.e., $z_0$) and some $k \ge 2$. Since $z^k$ is not 1-1 in any such disk (because $z$ and $\zeta z$ have the same $k$th power whenever $\zeta^k = 1$), $f$ isn't either.

However, the proof he gave was rather awkward and technical: it involved defining three different auxiliary functions, even though the idea was simple, and I've since forgotten exactly how it worked. In any case, I'm convinced there's a better way.

The problem is that I'm having trouble turning the idea into a real proof: I know that it obviously follows if $g$ is 1-1, but I'm also pretty sure that is too strong an assumption. Am I missing something, or does the argument just have to be more complicated?

  • 0
It may be hard to answer whether there's a less complicated argument if we don't know what the argument was. One proof is in Theorem 7.4 of J.B. Conway's complex analysis text: http://books.google.com/books?id=9LtfZr1snG0C&pg=PA98#v=onepage&q&f=false 2011-04-27
  • 0
    Google isn't letting me see the page containing the proof.2011-04-27
  • 1
(Apparently, enter posts comments.) I found some notes that I took: the proof involved dividing both sides by $g(0)$ (which seems unnecessary), then defining a new function $\psi$ by $\psi(z) = g(z)/g(0)$, so that $f(z)/g(0) = z^k\psi(z)$. $\psi$ takes values in a disc away from $0$, so its log is well-defined. We can then put $\phi(z) = z \exp(\log(\psi(z)) /k)$, so that $\phi(z)^k = z^k \psi(z)$, and then it's easy to show that $f$ isn't 1-1, since $\phi$ is, as $\phi'(0) = 1$. That actually isn't as bad as I remembered; I think it's because he didn't assume $z_0 = f(z_0) = 0$.2011-04-27
  • 0
    Isn't this just the inverse function theorem?2011-04-27
  • 0
    In complex-variable form, yes.2011-04-27
  • 4
    @gary: No, the inverse function theorem implies the converse, that if the derivative is nonzero then the function is locally injective.2011-04-27
  • 0
    I'd argue that although the idea is simple, any proof will have to be at least a little "awkward" or "technical" for the reason that the statement is analogous to other statements that are false. For example $f: \mathbb{R} \to \mathbb{R}$ given by $x \mapsto x^3$ is one-to-one and in many senses it's as nice a function from $\mathbb{R}$ to $\mathbb{R}$ as you might want (e.g. it is real analytic) but $f'(0) = 0$. Of course the $'$ in $f'$ means something different here, but it's certainly analogous. So any proof of your statement will really have to do *something*. Just my two cents.2011-04-27
  • 0
Jonas: I was referring to what I think is the result that an analytic injection has an analytic inverse. Then the inverse would necessarily have a specific formula given by the inverse function theorem, which does not allow for $f'(z_0)=0$ in the region.2011-04-27
  • 0
    @JonasMeyer But the converse is true, here, isn't it ?2015-03-31
  • 0
    @Gato: I don't know what you're asking, but I can repeat what I said in my comment: "If the derivative is nonzero then the function is locally injective." That is sort of a converse. Of course it must be qualified with "locally" as $z\mapsto e^z$ demonstrates. This converse direction is where the inverse function theorem is relevant. But I already said that, which is why I don't know what you're asking, and maybe I was just unclear.2015-03-31
  • 0
@JonasMeyer Ok, is it obvious that "if the derivative is nonzero then the function is locally injective"? What argument is needed?2015-03-31
  • 0
    @Gato: Have you looked up the inverse function theorem?2015-03-31

2 Answers

33

I like proving this theorem via its contrapositive rather than by contradiction (though the computations are essentially the same).

Suppose $f:\Omega\to\mathbb{C}$ is analytic with $f'(z_0)=0$. The goal is to show that every disc about $z_0$ contains distinct $z_1,z_2$ with $f(z_1)=f(z_2)$. We may assume that $z_0=f(z_0)=f'(z_0)=0$ (using $f(z+z_0)-f(z_0)$ if necessary, as translation doesn't affect injectivity). Since $f$ is analytic at $z=0$ and $f'(0)=f(0)=0$, $f$ has a power series expansion $$f(z)=a_k z^k + a_{k+1} z^{k+1} + \dots$$ where $k>1$. Pulling out a $z^k$ gives $$f(z)=z^k (a_k + a_{k+1}z + \dots) = z^k g(z)$$ where $g$ is analytic with $g(0)\neq0$. Since $g$ is nonzero on a sufficiently small disc centered at the origin, we can define an appropriate branch of its log there, so that its $k$th root is well-defined. Call this root $h$, so that $h$ is analytic with $h(z)^k=g(z)$ near the origin. Hence $$f(z)=\left(zh(z)\right)^k.$$ Note that $\phi(z)=zh(z)$ is analytic near $z=0$. Therefore, for any sufficiently small $\epsilon>0$, $\phi(D(0,\epsilon))$ is open (by the Open Mapping Theorem) and hence contains a disc $D(0,2\delta)$. In particular, there exist $z_1,z_2\in D(0,\epsilon)$ with $\phi(z_1)=\delta$ and $\phi(z_2)=\delta \exp\left(\frac{2\pi i}{k}\right)$; since $\phi(z_1)\neq\phi(z_2)$, the points $z_1$ and $z_2$ are distinct. Therefore $$f(z_2)=\delta^k \exp\left(\frac{2\pi i}{k}\right)^k = \delta^k = f(z_1)$$ as desired.


The proof does look a bit clunky, especially with all the auxiliary functions. However, it's actually fairly simple, and the extra functions are really just there to show why each step is valid. In fact, the gist of the proof is:

  1. Show that $f$ is the $k$th power of some analytic function $\phi$.

  2. Show that you can always find distinct $z_1,z_2$ where $\phi(z_1)$ and $\phi(z_2)$ lie on the same circle and their arguments differ by $\frac{2\pi}{k}$, so that their $k$th powers are equal.
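The two steps above can be sanity-checked numerically. The following Python sketch is my own illustration, not part of the answer: the concrete example $f(z)=z^2+z^3=z^2(1+z)$ (so $k=2$ and $g(z)=1+z$) is an assumption chosen for concreteness. It builds $h=\exp(\log g/k)$ and $\phi(z)=zh(z)$, then finds distinct $z_1,z_2$ with $\phi(z_2)=\phi(z_1)e^{2\pi i/k}$ and checks that $f(z_1)=f(z_2)$:

```python
# Illustration (hypothetical example, not from the answer): f(z) = z^2 + z^3,
# so f'(0) = 0, k = 2, g(z) = 1 + z, and h = exp(log(g)/k) is a branch of
# g^(1/k) near 0 (valid since g(0) = 1 != 0), giving f(z) = (z*h(z))^k.
import cmath

k = 2
f = lambda z: z**2 + z**3
g = lambda z: 1 + z
h = lambda z: cmath.exp(cmath.log(g(z)) / k)   # branch of the k-th root of g
phi = lambda z: z * h(z)                       # f(z) = phi(z)**k near 0

def solve_phi(w, z=0j, steps=50):
    """Newton's method for phi(z) = w; phi'(0) = h(0) = 1, so phi is locally invertible."""
    for _ in range(steps):
        eps = 1e-8
        dphi = (phi(z + eps) - phi(z)) / eps   # numerical complex derivative
        z = z - (phi(z) - w) / dphi
    return z

delta = 0.01
z1 = solve_phi(delta)                               # phi(z1) = delta
z2 = solve_phi(delta * cmath.exp(2j * cmath.pi / k))  # phi(z2) = delta*e^{2*pi*i/k}

print(abs(z1 - z2) > 1e-6)    # True: z1 and z2 are distinct...
print(abs(f(z1) - f(z2)))     # ...but their f-values agree (up to roundoff)
```

Both $z_1$ and $z_2$ lie in a small disc about $0$, yet $f(z_1)=\phi(z_1)^k=\delta^k=f(z_2)$, exactly as in step 2.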

  • 0
Thanks. I think the essential part is saying things like "$g$ is nonzero on a sufficiently small disc centered at the origin, so we can define an appropriate branch of its log so that its $k$-th root is well-defined", instead of proving it in detail.2011-04-28
  • 1
I don't understand the last step. How do we know there exist $z_1,z_2$ such that $\phi(z_2)=\delta \exp\left(\frac{2\pi i}{k}\right)$?2015-02-09
  • 0
    What ensures $z_{1}\neq z_{2}$?2016-04-05
  • 0
Very nice and neat solution. May I ask you a question: is the converse of this result true?2017-08-04
  • 0
    @DiegoFonseca well they map to different elements, hence they must be different2018-01-05
31

Suppose $f$ is analytic at $z_0$ and non-constant, but $f'(z_0) = 0$. Then the order of the zero of $f(z) - f(z_0)$ at $z_0$ is some integer $k > 1$. Take some circle $\Gamma$ around $z_0$ so that $f$ is analytic on and inside $\Gamma$ and there are no other zeros of $f(z) - f(z_0)$ or $f'(z)$ on or inside $\Gamma$. Now the sum of the orders of the zeros of $f(z) - p$ inside $\Gamma$ is $\dfrac{1}{2\pi i}\int_\Gamma \frac{f'(z)}{f(z) - p}\, dz$, which is equal to $k$ for $p$ in a neighbourhood of $f(z_0)$. But for $p \ne f(z_0)$ in that neighbourhood, since $f'(z)$ has no other zeros inside $\Gamma$, those zeros of $f(z) - p$ are all simple, i.e. there are $k$ distinct solutions to $f(z) = p$ inside $\Gamma$, so $f$ is not 1-1.
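The counting integral in this answer can be illustrated numerically. The following Python sketch is my own; the choices $f(z)=z^2$ (so $z_0=0$, $k=2$) and the unit circle as $\Gamma$ are assumptions for the example. A Riemann sum approximates $\frac{1}{2\pi i}\int_\Gamma \frac{f'(z)}{f(z)-p}\,dz$ and gives $2$ both for $p=0$ (one double root) and for small $p\neq 0$ (two simple roots $\pm\sqrt{p}$):

```python
# Illustration (my own example): the winding-number integral counts the
# solutions of f(z) = p inside Gamma with multiplicity.
import cmath

def zero_count(f, df, p, radius=1.0, n=2000):
    """Riemann-sum approximation of (1/(2*pi*i)) * integral over |z| = radius
    of f'(z)/(f(z) - p) dz, i.e. the number of zeros of f - p inside."""
    total = 0j
    for j in range(n):
        t = 2 * cmath.pi * j / n
        z = radius * cmath.exp(1j * t)
        dz = 1j * z * (2 * cmath.pi / n)   # tangent step along the circle
        total += df(z) / (f(z) - p) * dz
    return total / (2j * cmath.pi)

f  = lambda z: z**2    # double zero of f - f(0) at 0, so k = 2
df = lambda z: 2 * z

print(round(zero_count(f, df, 0.0).real))   # 2: one double root at 0
print(round(zero_count(f, df, 0.1).real))   # 2: two simple roots +-sqrt(0.1)
```

The count is locally constant in $p$, which is exactly why the double root at $p=f(z_0)$ splits into $k$ distinct preimages for nearby $p$.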

  • 0
    Robert Israel, which theorem did you use for counting the sum of the orders of zeros of $f(z)-p$?2014-01-14
  • 0
    I'm not aware that the theorem has an official name. It's just from computing the residue of $f'(z)/(f(z)-p)$ at $z = r$ where $f(z) - p = g(z) (z - r)^m$ and $g(r) \ne 0$.2014-01-14
  • 5
    Thanks, I think that this is the argument principle.2014-01-18
  • 0
    In fact, Rouche's theorem is proven along similar lines.2018-12-07