111

I was wondering the following, and I probably already know the answer: no.

Is there another number with properties similar to those of $e$? That is, is there another base $a$ for which the derivative of $a^x$ is the same as the function itself?

I would guess not, because otherwise $e$ wouldn't be that special, but is there a proof of it?

  • 0
    see another answer: http://math.stackexchange.com/a/1292586/72031 (2016-05-23)

9 Answers

185

Of course $C e^x$ has the same property for any $C$ (including $C = 0$). But these are the only ones.

Proposition: Let $f : \mathbb{R} \to \mathbb{R}$ be a differentiable function such that $f(0) = 1$ and $f'(x) = f(x)$ for all $x$. Then it must be the case that $f(x) = e^x$.

Proof. Let $g(x) = f(x) e^{-x}$. Then

$g'(x) = -f(x) e^{-x} + f'(x) e^{-x} = (f'(x) - f(x)) e^{-x} = 0$

by assumption, so $g$ is constant. But $g(0) = 1$, so $g(x) = 1$ identically.
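
A quick sanity check, separate from the proof: one can ask a computer algebra system for the general solution of $f' = f$. A minimal sketch, assuming SymPy is available:

```python
from sympy import Function, Eq, dsolve, symbols

x = symbols('x')
f = Function('f')

# General solution of f'(x) = f(x); SymPy returns f(x) = C1*exp(x),
# matching the claim that the functions C*e^x are the only solutions.
print(dsolve(Eq(f(x).diff(x), f(x)), f(x)))  # Eq(f(x), C1*exp(x))
```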

N.B. Note that it is also true that $e^{x+c}$ has the same property for any $c$. Thus there exists a function $g(c)$ such that $e^{x+c} = g(c) e^x$; setting $x = 0$, we conclude that $g(c) = e^c$, hence $e^{x+c} = e^x e^c$.

This observation generalizes to any differential equation with translation symmetry. Apply it to the differential equation $f''(x) + f(x) = 0$ and you get the angle addition formulas for sine and cosine.
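
To see both pieces symbolically (a sketch, again assuming SymPy): the general solution of $f'' + f = 0$ is a combination of $\sin$ and $\cos$, and expanding the translated solution $\sin(x+c)$ recovers the addition formula.

```python
from sympy import Function, Eq, dsolve, symbols, sin, expand_trig

x, c = symbols('x c')
f = Function('f')

# General solution of f''(x) + f(x) = 0: C1*sin(x) + C2*cos(x).
print(dsolve(Eq(f(x).diff(x, 2) + f(x), 0), f(x)))

# sin(x + c) is again a solution (translation symmetry); expanding it
# gives the angle addition formula.
print(expand_trig(sin(x + c)))  # sin(x)*cos(c) + sin(c)*cos(x)
```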

  • 2
    This is me again. I've helped you twice to earn a gold badge for Great Answer. Congrats! ♥(ˆ⌣ˆԅ) (2014-09-29)
45

Let $f(x)$ be a differentiable function such that $f'(x)=f(x)$. This implies that the $k$-th derivative, $f^{(k)}(x)$, is also equal to $f(x)$ for every $k\geq 1$. In particular, $f(x)$ is $C^\infty$ and we can write a Taylor expansion for $f$ centered at $x=0$:

$T_f(x) = \sum_{k=0}^\infty c_k x^k.$

Notice that the fact that $f(x)=f^{(k)}(x)$ for all $k\geq 0$ implies that the Taylor series $T_f(x_0)$ converges to $f(x_0)$ for every $x_0\in \mathbb{R}$ (more on this later), so we may write $f(x)=T_f(x)$. Since $f'(x) = \sum_{k=0}^\infty (k+1)c_{k+1}x^k = f(x)$, we conclude that $c_{k+1} = c_k/(k+1)$. Since $c_0 = f(0)$, it follows that $c_k = f(0)/k!$ for all $k\geq 0$. Hence:

$f(x) = f(0) \sum_{k=0}^\infty \frac{x^k}{k!} = f(0) e^x,$

as desired.
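
The coefficient recurrence is easy to watch numerically; a small sketch (the value $f(0)=1$ is just a sample choice):

```python
import math

# Build Taylor coefficients from the recurrence c_{k+1} = c_k / (k + 1),
# starting from the sample value c_0 = f(0) = 1, so that c_k = 1/k!.
f0 = 1.0
coeffs = [f0]
for k in range(20):
    coeffs.append(coeffs[-1] / (k + 1))

x = 2.0
partial_sum = sum(c * x**k for k, c in enumerate(coeffs))
print(partial_sum)       # ~7.389056...
print(f0 * math.exp(x))  # f(0) * e^x, which the series should match
```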

Addendum: About the convergence of the Taylor series. Let us use Taylor's remainder theorem to show that the Taylor series for $f(x)$ centered at $x=0$, denoted by $T_f(x)$, converges to $f(x)$ for all $x\in\mathbb{R}$. Let $T_{f,n}(x)$ be the $n$th Taylor polynomial for $f(x)$, also centered at $x=0$. By Taylor's theorem, we know that

$|R_n(x_0)|\leq |f^{(n+1)}(\xi)|\frac{|x_0 - 0|^{n+1}}{(n+1)!},$

where $R_n(x_0)=f(x_0) - T_{f,n}(x_0)$ and $\xi$ is a number between $0$ and $x_0$. Let $M=M(x_0)$ be the maximum value of $|f(x)|$ on the interval $I=[-|x_0|,|x_0|]$, which exists because $f$ is differentiable (therefore continuous) on $I$. Since $f(x)=f^{(n+1)}(x)$ for all $n\geq 0$, we have

$|R_n(x_0)|\leq |f^{(n+1)}(\xi)|\frac{|x_0|^{n+1}}{(n+1)!} = |f(\xi)|\frac{|x_0|^{n+1}}{(n+1)!}\leq M \frac{|x_0|^{n+1}}{(n+1)!} \longrightarrow 0 \ \text{ as } \ n\to \infty.$

The limit is $0$ because $M$ is a constant (once $x_0$ is fixed) and $A^n/n! \to 0$ for all $A\geq 0$. Therefore, $T_{f,n}(x_0) \to f(x_0)$ as $n\to \infty$ and, by definition, this means that $T_f(x_0)$ converges to $f(x_0)$.
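
Numerically, the bound collapses quickly even for a large $|x_0|$; a sketch with sample values:

```python
import math

# Remainder bound M * |x0|**(n+1) / (n+1)! for growing n.
# M is a fixed constant once x0 is chosen; take M = 1 for illustration.
x0, M = 10.0, 1.0
for n in (5, 10, 20, 40, 80):
    print(n, M * abs(x0)**(n + 1) / math.factorial(n + 1))
# The bound tends to 0, since A**n / n! -> 0 for any fixed A >= 0.
```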

  • 3
    +1 for submitting this answer, even though it may not get as many upvotes as Yuan's simple answer. (2011-08-19)
30

Yet another way: by the chain rule, $\frac{d}{dx} \ln|f(x)| = \frac{f'(x)}{f(x)} = 1$. Integrating, you get $\ln |f(x)| = x + C$. Exponentiating both sides, you obtain $|f(x)| = e^{x + C} = C'e^x$, where $C' > 0$. As a result, $f(x) = C''e^x$, where $C''$ is an arbitrary constant.

If you are worried about $f(x)$ being zero, the above shows $f(x)$ is of the form $C''e^x$ on any interval on which $f(x)$ is nonzero. Since $f(x)$ is continuous, this implies $f(x)$ is always of that form, unless $f(x)$ is identically zero (in which case we can just take $C'' = 0$ anyway).
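
A quick symbolic check of the log-derivative step; a sketch, assuming SymPy (the positivity assumption just sidesteps the $|\cdot|$ bookkeeping):

```python
from sympy import symbols, Function, log, simplify

x = symbols('x')
f = Function('f', positive=True)

# d/dx log(f(x)) = f'(x)/f(x); if f' = f, this ratio is identically 1.
expr = log(f(x)).diff(x)
print(expr)                                     # Derivative(f(x), x)/f(x)
print(simplify(expr.subs(f(x).diff(x), f(x))))  # 1
```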

  • 1
    Wow, I like this. It's so simple :D (2017-01-29)
25

Hint $\rm\displaystyle\:\ \begin{align} f{\:'}\!\! &=\ \rm a\ f \\ \rm \:\ g'\!\! &=\ \rm a\ g \end{align} \iff \dfrac{f{\:'}}f = \dfrac{g'}g \iff \bigg(\!\!\dfrac{f}g\bigg)' =\ 0\ \iff W(f,g) = 0\:,\ \ W = $ Wronskian

This is a very special case of the uniqueness theorem for linear differential equations, esp. how the Wronskian serves to measure linear independence of solutions. See here for a proof of the less trivial second-order case (that generalizes to n'th order). See also the classical result below on Wronskians and linear dependence from one of my old sci.math posts.

Theorem $\ \ $ Suppose $\rm\:f_1,\ldots,f_n\:$ are $\rm\:n-1\:$ times differentiable on interval $\rm\:I\subset \mathbb R\:$ and suppose they have Wronskian $\rm\: W(f_1,\ldots,f_n)\:$ vanishing at all points in $\rm\:I\:.\:$ Then $\rm\:f_1,\ldots,f_n\:$ are linearly dependent on some subinterval of $\rm\:I\:.$

Proof $\ $ We employ the following easily proved Wronskian identity:

$\rm\qquad\ W(g\ f_1,\ldots,\:g\ f_n)\ =\ g^n\ W(f_1,\ldots,f_n)\:.\ $ This immediately implies

$\rm\qquad\quad\ \ \ W(f_1,\ldots,\: f_n)\ =\ f_1^{\:n}\ W((f_2/f_1)',\ldots,\:(f_n/f_1)'\:)\quad $ if $\rm\:\ f_1 \ne 0 $

Proceed by induction on $\rm\:n\:.\:$ The Theorem is clearly true if $\rm\:n = 1\:.\:$ Suppose that $\rm\: n > 1\:$ and $\rm\:W(f_1,\ldots,f_n) = 0\:$ for all $\rm\:x\in I.\:$ If $\rm\:f_1 = 0\:$ throughout $\rm\:I\:$ then $\rm\: f_1,\ldots,f_n\:$ are dependent on $\rm\:I.\:$ Else $\rm\:f_1\:$ is nonzero at some point of $\rm\:I\:$ so also throughout some subinterval $\rm\:J \subset I\:,\:$ since $\rm\:f_1\:$ is continuous (being differentiable by hypothesis). By above $\rm\:W((f_2/f_1)',\ldots,(f_n/f_1)'\:)\: =\: 0\:$ throughout $\rm\:J,\:$ so by induction there exists a subinterval $\rm\:K \subset J\:$ where the arguments of the Wronskian are linearly dependent, i.e.

on $\rm\ K:\quad\ \ \ c_2\ (f_2/f_1)' +\:\cdots\:+ c_n\ (f_n/f_1)'\: =\ 0,\ \ $ all $\rm\:c_i'\:=\ 0\:,\ $ some $\rm\:c_j\ne 0 $

$\rm\qquad\qquad\: \Rightarrow\:\ \ ((c_2\ f_2 +\:\cdots\: + c_n\ f_n)/f_1)'\: =\ 0\ \ $ via $({\phantom m})'\:$ linear

$\rm\qquad\qquad\: \Rightarrow\quad\ \ c_2\ f_2 +\:\cdots\: + c_n\ f_n\ =\ c_1 f_1\ \ $ for some $\rm\:c_1,\ c_1'\: =\: 0 $

Therefore $\rm\ f_1,\ldots,f_n\:$ are linearly dependent on $\rm\:K \subset I\:.\qquad$ QED

This theorem has as immediate corollaries the well-known results that the vanishing of the Wronskian on an interval $\rm\: I\:$ is a necessary and sufficient condition for linear dependence of

$\rm\quad (1)\ $ functions analytic on $\rm\: I\:$
$\rm\quad (2)\ $ functions satisfying a monic homogeneous linear differential equation
$\rm\quad\phantom{(2)}\ $ whose coefficients are continuous throughout $\rm\: I\:.\:$
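
The Wronskian identity used in the proof can be spot-checked symbolically in the $n = 2$ case; a sketch with sample functions, assuming SymPy:

```python
from sympy import symbols, sin, cos, exp, Matrix, simplify

x = symbols('x')

def wronskian2(f1, f2):
    # 2x2 Wronskian: W(f1, f2) = f1*f2' - f2*f1'.
    return Matrix([[f1, f2], [f1.diff(x), f2.diff(x)]]).det()

f1, f2, g = sin(x), cos(x), exp(x)

# Identity W(g*f1, g*f2) = g**2 * W(f1, f2), the n = 2 case above.
print(simplify(wronskian2(g * f1, g * f2) - g**2 * wronskian2(f1, f2)))  # 0

# And W(e^x, C*e^x) = 0, matching the hint at the top of this answer.
C = symbols('C')
print(simplify(wronskian2(exp(x), C * exp(x))))  # 0
```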

  • 3
    @Mic I presumed the "is constant" inference was clear. The point of mentioning the Wronskian is to emphasize that this is a special case of more general results (for those who may know such). On Wronskians see also [my sci.math post here.](http://groups.google.com/groups?selm=y8zfznjojwo.fsf@nestle.ai.mit.edu) (2011-08-17)
15

Here is a different take on the question. There is a whole spectrum of discrete "calculi" which converge to the continuous case, each of which has its own special "$e$".

Pick some $t>0$. Consider the equation $f(x)=\frac{f(x+t)-f(x)}{t}$. It is not hard to show by induction that there is a function $C_t:[0,t)\to \mathbb{R}$ such that $f(x)=C_t(\{\frac{x}{t}\})(1+t)^{\lfloor\frac{x}{t}\rfloor}$, where $\{\cdot\}$ and $\lfloor\cdot\rfloor$ denote the fractional and integer part, respectively. If I take Qiaochu's answer for comparison, then $C_t$ plays the role of the constant $C$ and $(1+t)^{\lfloor\frac{x}{t}\rfloor}$ the role of $e^x$. Therefore for such a discrete calculus the right value of "$e$" is $(1+t)^{1/t}$. Now it is clear that as $t\to 0$ the equation becomes $f(x)=f'(x)$, and $(1+t)^{1/t}\to e$.
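
Watching the discrete "$e$" converge takes two lines; a numeric sketch:

```python
import math

# The "e" of the discrete calculus with step t is (1 + t)**(1/t);
# it approaches e = 2.718281... as t -> 0.
for t in (1.0, 0.1, 0.01, 0.001, 1e-6):
    print(t, (1 + t) ** (1 / t))
print(math.e)
```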

  • 1
    Very cool... thank you. (2013-04-29)
12

The solutions of $f'(x) = f(x)$ are exactly $f(x) = f(0) e^x$. But you can also write them as $b\,a^x$ if you want a different base. Then $f'(x) = b \log(a) a^x$, and so if you want $f'=f$ you need $\log(a)=1$, i.e. $a=e$ (except for the trivial case $b=0$).
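
A one-line symbolic confirmation; a sketch, assuming SymPy:

```python
from sympy import symbols, diff, E

x = symbols('x')
a, b = symbols('a b', positive=True)

# d/dx (b * a**x) = b * log(a) * a**x, so f' = f forces log(a) = 1, i.e. a = e.
print(diff(b * a**x, x))  # a**x*b*log(a)
print(diff(b * E**x, x))  # b*exp(x), i.e. the function itself
```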

11

Here is the proof they use at high school; it is not as deep or instructive, but it doesn't require as much knowledge.

$\begin{eqnarray*} \frac{dy}{dx} &=& y\\ \frac{dx}{dy} &=& \frac 1 y\\ x &=& \log |y| + C\\ y &=& A\exp(x) \end{eqnarray*} $

  • 0
    This method does fail to find the solution $y=0$ which is an indication of its lack of rigor! (More formally you'd need to split it into cases before inverting; if there is one place where $y=0$ it is intuitively clear that $y=0$ everywhere, but one has to be careful, since e.g. $y=x^2$ has $y=0$ and $dy/dx=0$ at $x=0$ without vanishing everywhere!) Nevertheless this approach seems to be the only one so far that does not rely on knowing the answer before setting out the strategy. Effectively, it's just treating $dy/dx=y$ as a separable differential equation & applying the standard techniques. (2012-07-04)
3

Let $x(t)$ be a $C^1$ function on the whole line solving $\dot{x}(t) = x(t)$, $x(0) = 1$. Using the Taylor expansion with remainder, we show that necessarily $x(t) = e^t$.

We have that $\dot{x} = x$ implies $x^{(n)} = x^{(n-1)}$ for all $n \ge 1$, and by induction on $n$, $x(t)$ is $C^\infty$ with $x^{(n)} = x$ for all $n$. Thus, if $x(0) = 1$ and $\dot{x} = x$, Taylor's theorem gives

$x(t) = \left( \sum_{k=0}^{N-1} {{t^k}\over{k!}}\right) + {{x^{(N)}(t_1)}\over{N!}}t^N,$

for some $t_1$ between $0$ and $t$. But $x^{(N)} = x$, so if

$M = \max_{|t_1| \le |t|} |x(t_1)|,$

which exists by compactness of $[-|t|, |t|]$, then

$\left| x(t) - \sum_{k=0}^{N-1} {{t^k}\over{k!}}\right| \le {{M|t|^N}\over{N!}}.$

The right-hand side tends to $0$ as $N \to \infty$, so the series for $e^t$ converges to $x(t)$.
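
The partial sums converge quickly; a numeric sketch:

```python
import math

# Partial sums sum_{k < N} t**k / k! approach e**t as N grows.
t = 1.5
for N in (5, 10, 15, 20):
    s = sum(t**k / math.factorial(k) for k in range(N))
    print(N, s, abs(s - math.exp(t)))
```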

-2

Note that $e$ is defined by the following limit: $e=\lim_{n \rightarrow \infty}(1+ \frac{1}{n})^n$. Then: $e^x=\lim_{n \rightarrow \infty}(1+ \frac{1}{n})^{nx}$. Applying the definition of the derivative $f'(x) = \lim_{h \rightarrow 0} \frac{f(x+h)-f(x)}{h}$, one obtains: $(e^x)'=\lim_{h \rightarrow 0} \frac{ \lim_{n \rightarrow \infty}((1+ \frac{1}{n})^{n(x+h)}-(1+ \frac{1}{n})^{nx})}{h} = \lim_{h \rightarrow 0}( \lim_{n \rightarrow \infty}(1+\frac{1}{n})^{nx} \lim_{n \rightarrow \infty}(\frac{(1+\frac{1}{n})^{nh}-1}{h}))$

$= e^x \lim_{h \rightarrow 0} \lim_{n \rightarrow \infty}(\frac{(1+\frac{1}{n})^{nh}-1}{h})$.

Now one can couple $h$ and $n$ via the relation $h= \frac{C}{n}$ with a finite constant $C$, because if $n \rightarrow \infty$ then $h$ tends to zero. Hence:

$\lim_{h \rightarrow 0} \lim_{n \rightarrow \infty}(\frac{(1+\frac{1}{n})^{nh}-1}{h}) = \lim_{h \rightarrow 0} (\frac{(1+\frac{h}{C})^{C}-1}{h}) = \lim_{h \rightarrow 0} (\frac{1+C \frac{h}{C} + \frac{C(C-1)}{2}(\frac{h}{C})^2+O(h^3)-1}{h}) = \lim_{h \rightarrow 0} (1 + \frac{C(C-1)}{2}\frac{h}{C^2}+O(h^2)) = 1$

Therefore $(Ce^x)'=C(e^x)'=Ce^x$.

q.e.d.
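
For what it's worth, the value of the key inner limit can be checked numerically: for large $n$, $(1+\frac{1}{n})^{nh} \approx e^h$, and $(e^h-1)/h \to 1$ as $h \to 0$. A sketch:

```python
import math

# For large n, (1 + 1/n)**(n*h) ~ e**h, and (e**h - 1)/h -> 1 as h -> 0.
n = 10**7
for h in (0.1, 0.01, 0.001):
    print(h, ((1 + 1 / n) ** (n * h) - 1) / h, (math.exp(h) - 1) / h)
```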

  • 0
    Late to the party but all this proves is that $\left(Ce^x\right)'=Ce^x$, not that $f(x)=f'(x)\iff f(x)=Ce^x$. (2016-09-07)