In *Calculus* by Spivak (1994), the author states in Chapter 18, p. 341, that $\exp(x+y) = \exp(x)\exp(y)$ implies $\exp(x) = [\exp(1)]^x$. He refers to the discussion at the beginning of the chapter, where a function satisfying $f(x + y) = f(x)f(y)$ with $f(1) = 10$ is considered; there it follows that $f(x) = [f(1)]^x$. But I don't understand that argument either. Can anyone please explain this? Many thanks!
How does $\exp(x+y) = \exp(x)\exp(y)$ imply $\exp(x) = [\exp(1)]^x$?
-
0See also http://math.stackexchange.com/questions/141293/continuous-functions-on-mathbb-r-such-that-gxy-gxgy and http://math.stackexchange.com/questions/293371/how-do-i-prove-that-fxfy-fxy-implies-that-fx-ecx-assuming-f-is. – 2016-09-01
5 Answers
The $n$-th ($n\in{\mathbb N}_{\geq1}$) power of a real number $a>0$ is defined in elementary terms as the product of $n$ factors equal to $a$. It is then natural to define $a^0:=1$ and $a^n:=1/a^{-n}$ for $n<0$. One then has $a^{m+n}=a^m\cdot a^n\ ,\qquad a^{m\,n}=\bigl(a^m\bigr)^n \qquad(*)$ for all $m$, $n\in{\mathbb Z}$. The aim now is to extend the definition of the $x$-th power of $a$ from $x\in{\mathbb Z}$ to $x\in{\mathbb Q}$ and $x\in{\mathbb R}$ in such a way that the law $(*)$ is preserved.
The first step is easy: For $p\in{\mathbb Z}$ and $q\in{\mathbb N}_{\geq1}$ put $a^{p/q}\ :=\ \root q \of {a^p}\ .$ Using the "rules of algebra" one then can verify that $(*)$ is true for arbitrary $x$, $y\in{\mathbb Q}$ instead of $m$, $n\in{\mathbb Z}$. That is as far as one can get using the algebraic notions of $n$-th power and $q$-th root.
Now analysis provides a wonderful (in particular, continuous) function $x\mapsto\exp (x)$ with the property that $\exp(x+y)=\exp(x)\cdot\exp(y)$ for all $x$, $y\in{\mathbb C}$. Put $\exp(1)=:e\doteq 2.718$. Then it follows by an easy induction that in fact $\exp\Bigl({p\over q}\Bigr)\ =\ e^{p/q}$ for all ${p\over q} \in{\mathbb Q}$. As $\exp$ is continuous it is only natural to define arbitrary real powers of the number $e$ by $e^x\ :=\ \exp(x)\qquad(x\in{\mathbb R})\ .$ Via the logarithm function (which I won't explain here) we can define arbitrary real powers not only of $e$, but of an arbitrary number $a>0$: $a^x\ :=\ \exp\bigl(x\log(a)\bigr)\qquad (x\in{\mathbb R})\ .$ Using the functional equations of $\exp$ and $\log$ one then easily proves that the rule $(*)$ remains true for arbitrary $x$, $y\in{\mathbb R}$ instead of $m$, $n\in{\mathbb Z}$.
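As a numerical sanity check (a sketch, not part of the answer itself), one can verify in Python that the definition $a^x := \exp\bigl(x\log(a)\bigr)$ reproduces the law $(*)$ for real exponents; the values of `a`, `x`, `y` below are arbitrary choices:

```python
import math

def power(a, x):
    # a^x defined via the exponential and logarithm, for a > 0
    return math.exp(x * math.log(a))

a, x, y = 2.5, 0.3, 1.7

# (*): a^(x+y) = a^x * a^y
assert math.isclose(power(a, x + y), power(a, x) * power(a, y))

# (*): a^(x*y) = (a^x)^y
assert math.isclose(power(a, x * y), power(power(a, x), y))

# agrees with Python's built-in ** for positive bases
assert math.isclose(power(a, x), a ** x)
```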
As Paul Pichaureau has mentioned, it is easy to see that the formula holds for $n$ natural. You can also easily go further: what should $\exp(-n)$ be? Well, $\exp(0) = 1 = \exp(-n+n) = \exp(-n) \exp(n)$, so $\exp(-n) = 1/\exp(n)$. Furthermore, it is easily seen that $\exp(nx) = \exp(x)^{n}$, so that $\exp(1) = \exp(n\cdot 1/n) = \exp(1/n)^n,$ and hence $\exp(1/n) = \exp(1)^{1/n}$ by the usual rules for powers. Combining these, we see that $\exp(p/q) = \exp(1)^{p/q},$ so the formula holds for any rational number $x$. I think this is about as far as you can get using only that formula (in addition to the requirement that $\exp(0) = 1$). If we, in addition, assume that $\exp$ is continuous on $\mathbb{R}$, then we can choose a sequence $(r_{n})_{n=1}^{\infty}$ of rational numbers such that $\lim_{n \to \infty} r_{n} = x$, and compute $\lim_{n\to \infty} \exp(1)^{r_n} = \lim_{n \to \infty} \exp(r_{n}) = \exp(\lim_{n \to \infty} r_{n}) = \exp(x).$ The problem now is what $\exp(1)^x$ is supposed to be. If we already know that the power function $x \mapsto \exp(1)^x$ is continuous, then certainly the left hand side is equal to $\exp(1)^x$, but as Paul mentions, the power functions are often defined in terms of the exponential function. However, I don't think you have to do it that way (you could define rational powers as usual, and prove that a continuous extension to the reals exists), so it all depends on what Spivak has done prior to that chapter (you also didn't specify what $x$ is, so it is conceivable that he assumes it is rational, in which case it doesn't matter much).
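The rational-exponent steps above are easy to check numerically; a minimal Python sketch (the values of `n` and `p/q` are arbitrary choices):

```python
import math
from fractions import Fraction

e = math.exp(1)
n = 5

# exp(-n) = 1/exp(n)
assert math.isclose(math.exp(-n), 1 / math.exp(n))

# exp(1/n) = exp(1)^(1/n): the n-th root of e
assert math.isclose(math.exp(1 / n), e ** (1 / n))

# exp(p/q) = exp(1)^(p/q) for a rational p/q
r = Fraction(7, 3)
assert math.isclose(math.exp(float(r)), e ** float(r))
```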
-
0excellent answer Martin, thorough and comprehensive without abstruosity – 2013-12-23
I think you should assume $x$ is a positive integer (since for real $x$, $a^x$ is itself defined using $\exp$, which would be circular). You can write $\exp(x) = \exp(\underbrace{1+1+1+\cdots+1}_{x\text{ times}})$.
Using the property of $\exp$, you find that $\exp(x) = \exp(1)\exp(1)\dots\exp(1) = (\exp(1))^x$.
-
0Note that $a^x = \exp( x \log(a))$ and is defined for all complex $x$ and $a$, s.t. $a \not= 0$, where the principal branch of $\log$ is taken. – 2012-02-05
I like (as I have done here before) to start with a functional equation and derive properties of the function.
If $f(x+y) = f(x) f(y)$ and $f$ is differentiable (and non-zero somewhere), then $f(0) = 1$ (from $f(0) = f(0)^2$ and $f \not\equiv 0$), and $f(x+h)-f(x) = f(x)f(h)-f(x) = f(x)(f(h)-1) = f(x)(f(h)-f(0))$, so $(f(x+h)-f(x))/h = f(x)(f(h)-f(0))/h.$ Letting $h \to 0$, $f'(x) = f'(0) f(x)$. From this, $f(x) = \exp(f'(0)\,x)$.
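The limit $f'(x) = f'(0) f(x)$ can be illustrated numerically; the sketch below takes $f = \exp$ (so $f'(0) = 1$) and a small step $h$, both arbitrary choices:

```python
import math

def f(x):
    return math.exp(x)  # satisfies f(x+y) = f(x)f(y)

h = 1e-6
f_prime_0 = (f(h) - f(0)) / h  # approximates f'(0) = 1

for x in [0.5, 1.0, 2.0]:
    # (f(x+h) - f(x))/h should approach f'(0) * f(x)
    lhs = (f(x + h) - f(x)) / h
    assert math.isclose(lhs, f_prime_0 * f(x), rel_tol=1e-4)
```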
This approach also works for $\ln$, $\arctan$, $\sin$, and $\cos$, starting from their respective functional equations.
Functional equations are fun!
$ \begin{gathered} \exp (x + y) = \exp (x)\exp (y)\quad \Rightarrow \hfill \\ \Rightarrow \quad \exp \left( {\sum\limits_j {x_j } } \right) = \prod\limits_j {\exp (x_j )} \quad \Rightarrow \hfill \\ \Rightarrow \quad \exp \left( {\sum\nolimits_{k\, = \,0}^{\,x} 1 } \right) = \prod\nolimits_{k\, = \,0}^{\,x} {\exp (1)} \quad \Rightarrow \hfill \\ \Rightarrow \quad \exp \left( x \right) = \exp (1)^{\,x} \hfill \\ \end{gathered} $
where
- $\sum\limits_j {} $ and $\prod\limits_j {} $ indicate the sum and product over a set of values for $j$
- $\sum\nolimits_{k\, = \,0}^{\,x} {} $ and $\prod\nolimits_{k\, = \,0}^{\,x} {} $ indicate the indefinite sum/product, computed between the given limits
So the above also holds for complex $x$ and $y$.