5

I'm studying for a math exam, and one of the questions that often appears concerns the derivative of a product of two functions.

The theorem says that $(f(x)g(x))'=f'(x)g(x)+f(x)g'(x)$.

The proof goes like this: $f(x+h)g(x+h)-f(x)g(x)=(f(x+h)-f(x))g(x)+(g(x+h)-g(x))f(x)$

After that we divide the equation by h and let h approach 0.

Now what I don't understand is how they got the right-hand side of that equation.

Same problem with the quotient rule:

Theorem says: $\left(\frac{f(x)} {g(x)}\right)'=\frac{f'(x)g(x)-f(x)g'(x)} {g^2(x)}$

The above comes from $\frac {f(x+h)} {g(x+h)} - \frac {f(x)} {g(x)} = \frac{(f(x+h)-f(x))g(x)-(g(x+h)-g(x))f(x)} {g(x+h)g(x)}$

I can see where the $f(x+h)g(x)$ term comes from, but I can't see where $f(x)g(x)$ comes from.

  • 0
    I have a little problem with answers here. @damiano wrote in a comment what I wanted to see, so I cannot accept that as an answer. On the other hand there are lots of really nice answers. I'd really like to accept damiano's answer, but the answer by @Grigory M is the most upvoted one. Should I wait for the moment (which may or may not come) when damiano posts his comment as an answer, or should I accept Grigory M's answer?2010-08-28

9 Answers

1

(Adding and subtracting the same quantity, like $f(x)g(x)$, is very common in mathematics. Think of what you do when you solve a linear equation: you add and subtract, multiply and divide.)

Here is a geometrical proof of the product rule.

Denote by $T_f$ the tangent of $f(x)$ at $x_0$. It is given by $T_f(x)=f'(x_0)(x-x_0) + f(x_0)$.

Denote by $T_g$ the tangent of $g(x)$ at $x_0$. It is given by $T_g(x)=g'(x_0)(x-x_0) + g(x_0)$.

The slope of $f(x)g(x)$ at $x_0$ is $(fg)'(x_0)$, which we want to find. That slope is the same as the slope of the product of the tangents at $x_0$: that is, $(fg)'(x_0)=(T_fT_g)'(x_0)$. Multiplying out and differentiating, we get $(T_fT_g)'(x) = ((f'(x_0)(x-x_0) + f(x_0))\cdot(g'(x_0)(x-x_0) + g(x_0)))' = 2(x-x_0)f'(x_0)g'(x_0)+ f'(x_0)g(x_0) + f(x_0)g'(x_0)$. The first term vanishes at $x=x_0$, so $(fg)'(x_0)=(T_fT_g)'(x_0) =f'(x_0)g(x_0) + f(x_0)g'(x_0).$
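
For anyone who wants to sanity-check this computation, here is a minimal symbolic sketch (using sympy; the symbols `f0, df0, g0, dg0` are my own placeholders for $f(x_0)$, $f'(x_0)$, $g(x_0)$, $g'(x_0)$):

```python
# Minimal symbolic check of the tangent-line argument (sympy assumed available).
import sympy as sp

# f0, df0, g0, dg0 stand in for f(x0), f'(x0), g(x0), g'(x0).
x, x0, f0, df0, g0, dg0 = sp.symbols('x x0 f0 df0 g0 dg0')

Tf = f0 + df0*(x - x0)          # tangent line of f at x0
Tg = g0 + dg0*(x - x0)          # tangent line of g at x0

slope = sp.diff(Tf*Tg, x)       # equals 2*(x - x0)*df0*dg0 + df0*g0 + f0*dg0
print(sp.expand(slope))
print(slope.subs(x, x0))        # df0*g0 + dg0*f0, i.e. f'(x0)g(x0) + f(x0)g'(x0)
```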

10

The derivative product rule is a special case of the congruence product rule for rings, i.e.

Product Rule $\rm\quad\ a\equiv a_1,\; b\equiv b_1 \;\Rightarrow\;\; ab \equiv a_1 b_1$

Proof: $\rm\quad\; ab-a_1 b_1 \equiv a(b-b_1)+(a-a_1)b_1 \equiv 0 \quad$ QED

Thus $\rm\ \ f(x+t) \equiv \: f(x) + f'(x) \; t \;\pmod {t^2}$

and $\rm\ \ \ \,g(x+t) \equiv g(x) + g'(x) \; t \;\pmod {t^2}$

$\rm\ \ \Rightarrow\ \ f(x+t)g(x+t) \:\equiv\: f(x)g(x) + (f'(x)g(x) + f(x)g'(x)) \; t \:\;\pmod {t^2}$

$\rm\displaystyle\ \Rightarrow\ \ \frac{f(x+t)g(x+t)\: - \:f(x)g(x)}{t} \equiv\: f'(x)g(x) + f(x)g'(x) \quad\:\pmod t$

In fact this is how one universally defines derivatives in formal polynomial rings $\rm R[x]$. This yields a purely algebraic approach to polynomial derivatives, devoid of limits or other topological notions.

The ring $\rm R[t]/t^2 \;$ is known as the algebra of dual numbers over the ring $\rm R$. This ring and its higher order analogs $\;\rm R[t]/t^n \;$ prove quite useful when studying (higher) derivations algebraically since they provide convenient algebraic models of tangent / jet spaces. E.g. as above, they permit easy transfer of properties of homomorphisms to derivations -- see for example section 8.15 in Jacobson, Basic Algebra II.
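
To make this concrete, here is a bare-bones sketch of dual-number arithmetic in Python (the class and the particular choices of $f$, $g$, $x$ are mine, not taken from the references below): multiplying two elements of $\rm R[t]/t^2$ produces exactly the product rule in the coefficient of $t$.

```python
# A bare-bones model of R[t]/t^2 (dual numbers); names are my own.
class Dual:
    """Represents a + b*t with t**2 = 0: 'a' is the value, 'b' the derivative part."""
    def __init__(self, a, b=0.0):
        self.a, self.b = a, b

    def __add__(self, other):
        return Dual(self.a + other.a, self.b + other.b)

    def __mul__(self, other):
        # (a + b t)(c + d t) = ac + (ad + bc) t, since t^2 = 0:
        # the coefficient of t is exactly the product rule.
        return Dual(self.a*other.a, self.a*other.b + self.b*other.a)

# f(x + t) = f(x) + f'(x) t.  At x = 2 take f(x) = x^2 and g(x) = x^3:
f = Dual(4.0, 4.0)     # f(2) = 4,  f'(2) = 4
g = Dual(8.0, 12.0)    # g(2) = 8,  g'(2) = 12
fg = f*g
print(fg.a, fg.b)      # 32.0 80.0 -> (fg)(2) = 32, (fg)'(2) = f'(2)g(2) + f(2)g'(2) = 80
```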

Dual numbers have been applied in many contexts, e.g. deformation theory [2], numerical analysis [3] (along with Levi-Civita fields), where they're viewed simply as truncated Taylor power series, and also in Synthetic Differential Geometry (SDG) [1], another rigorization of infinitesimals based on work of Lawvere and Kock. Note that SDG employs these nilpotent infinitesimals, unlike Abraham Robinson's nonstandard analysis, which employs invertible infinitesimals (hence contains infinite elements).

[1] Bell, J. L. Infinitesimals. Synthese 75 (1988) #3, 285--315.
http://www.jstor.org/stable/20116534

[2] Szendroi, B. The unbearable lightness of deformation theory, a tutorial introduction.
http://people.maths.ox.ac.uk/szendroi/defth.pdf

[3] M. Berz, "Differential Algebraic Techniques", in Handbook of Accelerator Physics and Engineering, A. Chao, M. Tigner (Eds.), World Scientific, 1998.
http://bt.pa.msu.edu/cgi-bin/display.pl?name=dahape
http://bt.pa.msu.edu/NA/
http://bt.pa.msu.edu/pub/papers/

  • 2
    @BillDubuque I very much appreciate the way you consistently try to give an overarching feel for the interconnectedness of mathematics rather than answering the question in a more pointillistic manner. Certainly it would be easier to answer the question in such a manner, and thus I have more reason to appreciate your hints at the beautiful "bigger picture." $(+1)$2012-11-07
8

There is a more comprehensible (I hope) proof coming from thinking of derivative as a linear approximation.

By the definition of derivative$^1$, $f(x)=f(x_0)+f'(x_0)(x-x_0)+o(x-x_0)$ and $g(x)=g(x_0)+g'(x_0)(x-x_0)+o(x-x_0)$. Multiplying the equalities we get \begin{multline} f(x)g(x)=f(x_0)g(x_0)+(f'(x_0)g(x_0)+f(x_0)g'(x_0))(x-x_0)+\\ +f'(x_0)g'(x_0)(x-x_0)^2+o(x-x_0) \end{multline} and since $f'(x_0)g'(x_0)(x-x_0)^2=o(x-x_0)$, we can rewrite it as \[(fg)(x)=(fg)(x_0)+(f'g+fg')(x_0)\cdot(x-x_0)+o(x-x_0)\] — which exactly means that $(fg)'=f'g+fg'$.

1 $\lim_{x\to x_0}g(x)=a$ iff $g(x)=a+o(1)$, hence $\lim_{x\to x_0}\frac{f(x)-f(x_0)}{x-x_0}=f'(x_0)$ iff $\frac{f(x)-f(x_0)}{x-x_0}=f'(x_0)+o(1)$ which can be rewritten as $f(x)-f(x_0)=f'(x_0)(x-x_0)+o(x-x_0)$ (see e.g. wikipedia if "o(1)" notation is unfamiliar to you).
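
If a numerical sanity check is helpful, the following small sketch (my own choice of $f=\sin$, $g=\exp$, $x_0=1$) shows that the error of the linear approximation of $fg$ is $O((x-x_0)^2)$, hence indeed $o(x-x_0)$:

```python
# Numerical illustration of the bookkeeping above (my choice of f, g and x0).
import math

f, df = math.sin, math.cos      # f = sin, f' = cos
g, dg = math.exp, math.exp      # g = exp, g' = exp
x0 = 1.0

for h in (1e-1, 1e-2, 1e-3):
    x = x0 + h
    linear = f(x0)*g(x0) + (df(x0)*g(x0) + f(x0)*dg(x0))*(x - x0)
    err = f(x)*g(x) - linear
    print(h, err, err/h**2)     # err/h^2 stays bounded, so the error is O(h^2) = o(h)
```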

  • 0
    @AndrejaKo updated the answer with a formal proof (and see J.Mangaldan's comment for an informal explanation); probably someone else can provide a reference (I know nothing about English-language calculus textbooks; but any _good_ one should do, probably)2010-08-23
8

An equivalent way to state the product rule is $\frac{(fg)'}{fg} = \frac{f'}{f} + \frac{g'}{g}$. I prefer this statement because it is more intuitive: it says precisely that the relative instantaneous change in $fg$ is the sum of the relative instantaneous change in $f$ and the relative instantaneous change in $g$. In other words, this is just an expression of the approximation $(1 + a)(1 + b) \approx 1 + a + b$ when $a, b$ are both small ("multiplication near the identity is addition"). In other other words, this is an expression of the familiar fact that if you increase something by 5%, then by 5% again, the total increase is just a little more than 10%.

As for turning this into a formal proof, divide both sides of the expression you're confused about by $fg$. (This is not strictly valid if $f$ or $g$ is equal to zero, but I'm going for clarity here.)
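
For what it's worth, here is a quick symbolic check of the logarithmic-derivative identity, assuming sympy is available (and staying away from zeros of $f$ and $g$):

```python
# Symbolic check of (fg)'/(fg) = f'/f + g'/g (sympy assumed available).
import sympy as sp

x = sp.symbols('x')
f, g = sp.Function('f')(x), sp.Function('g')(x)

lhs = sp.diff(f*g, x) / (f*g)
rhs = sp.diff(f, x)/f + sp.diff(g, x)/g
print(sp.simplify(lhs - rhs))   # 0 (valid wherever f and g are nonzero)
```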

  • 1
    The "logarithmic derivative" form is also useful when reckoning how uncertainties in data might propagate.2010-08-24
5

This answer is really very similar to the others, but I hope that the emphasis is sufficiently different to be worth posting.

Fix a number $x$. The definition of the derivative of $f$ is that $f'(x)=\lim_{h\to 0}\frac{f(x+h)-f(x)}{h}.$ Let's rewrite this as $f'(x)=\lim_{h\to 0}F(h)$ where we define $F(h)=\frac{f(x+h)-f(x)}{h}$ for $h\ne0$. We can rewrite this definition as $f(x+h)=f(x)+hF(h).$ For the same reason, $g(x+h)=g(x)+hG(h)$ where $G(h)=\frac{g(x+h)-g(x)}{h}$ and also $g'(x)=\lim_{h\to 0}G(h).$

We now need to investigate the derivative of $f(x)g(x)$. Now by definition this is $\lim_{h\to0}\frac{f(x+h)g(x+h)-f(x)g(x)}{h}.$ Manipulating this quotient gives $\frac{f(x+h)g(x+h)-f(x)g(x)}{h}=\frac{(f(x)+hF(h))(g(x)+hG(h))-f(x)g(x)}{h}$ $=\frac{hf(x)G(h)+hF(h)g(x)+h^2F(h)G(h)}{h}=f(x)G(h)+F(h)g(x)+hF(h)G(h).$ Now this yields to the "algebra of limits": using $\lim_{h\to 0}F(h)=f'(x)$ and $\lim_{h\to 0}G(h)=g'(x)$ we get $\lim_{h\to0}\frac{f(x+h)g(x+h)-f(x)g(x)}{h}=f(x)g'(x)+f'(x)g(x)+0f'(x)g'(x)$ which simplifies to $f(x)g'(x)+f'(x)g(x)$ which is exactly what you want.

Actually this argument is just the same as your original but expressed in a more pedestrian way, without any cunning sleights of hand. As a follow-up, a good exercise is to obtain the quotient rule in a similar fashion.
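
If it helps to see that the manipulation is a pure identity, valid for every $h$ before any limit is taken, here is a small numerical sketch (the concrete $f$, $g$ and $x$ are my own choices):

```python
# The identity holds for every h, not just as h -> 0.
import math

f, g = math.sin, math.exp       # my choice of concrete f and g
x = 1.0

def F(h): return (f(x + h) - f(x)) / h
def G(h): return (g(x + h) - g(x)) / h

for h in (1e-1, 1e-3, 1e-6):
    lhs = (f(x + h)*g(x + h) - f(x)*g(x)) / h
    rhs = f(x)*G(h) + F(h)*g(x) + h*F(h)*G(h)
    print(h, lhs, rhs, abs(lhs - rhs))   # equal up to floating-point rounding
```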

4

The following is an attempt at an interpolation between (among?) the answers of Grigory M., Pierre-Yves Gaillard and Bill Dubuque (the last of which I confess that I do not completely understand, but this is one possible interpretation of it).

Let $J$ be an open interval in $\mathbb{R}$, and let $x_0 \in J$. Following PYG's suggestion, let me begin by stating exactly what I will prove: if $f,g: J \rightarrow \mathbb{R}$ are differentiable at $x_0$, then so is their product, and

$(fg)'(x_0) = f'(x_0)g(x_0) + f(x_0) g'(x_0)$.

I will prove this "congruentially", as follows:

let $R$ be the ring of all functions $f: J \rightarrow \mathbb{R}$ which are continuous at $x_0$, under the operations of pointwise addition and multiplication. Inside $R$, consider the set $I$ of all functions $f$ such that

$\lim_{x \rightarrow x_0} \frac{f(x)}{x-x_0} = 0$.

I claim that $I$ is an ideal of $R$. This is easy to prove, but I note that it makes use of the fact that every element of $R$ is continuous at $x_0$, hence bounded near $x_0$. Now:

1) For $f \in R$, $f$ lies in $I$ iff: $f(x_0) = 0$, $f$ is differentiable at $x_0$ and $f'(x_0) = 0$.

2) For $f \in R$, $f$ is differentiable at $x_0$ iff there exists $A \in \mathbb{R}$ such that $f \equiv f(x_0) + A(x-x_0) \pmod I$. If so, then necessarily $A = f'(x_0)$; in particular, it is uniquely determined.

3) Thus, if $f$ and $g$ are both differentiable at $x_0$, then

$fg \equiv (f(x_0) + f'(x_0)(x-x_0))(g(x_0) + g'(x_0)(x-x_0))$

$\equiv f(x_0)g(x_0) + (f'(x_0)g(x_0) + f(x_0)g'(x_0))(x-x_0) + f'(x_0)g'(x_0)(x-x_0)^2$

$\equiv f(x_0)g(x_0) + (f'(x_0)g(x_0) + f(x_0)g'(x_0))(x-x_0) \pmod I$.

Using 2), it follows that $fg$ is differentiable at $x_0$ and

$(fg)'(x_0) = f'(x_0)g(x_0) + f(x_0)g'(x_0)$.

3

Here is an answer based on Carathéodory's formulation of differentiability. Let $a\in A\subset\mathbb R$ and $f:A\to\mathbb R$. Assume there is a sequence in $A\setminus\{a\}$ converging to $a$. Then there is at most one function $\varphi:A\to\mathbb R$ which is continuous at $a$ and satisfies $f(x)=f(a)+(x-a)\ \varphi(x)$ for all $x$ in $A$. If such a $\varphi$ exists, we say that $f$ is differentiable at $a$, and we put $f'(a):=\varphi(a)$.

It's easy to see that this definition is equivalent to the usual one, and, using it, it's easy to verify that the product of two functions differentiable at $a$ is differentiable at $a$ and satisfies the famous formula for the derivative of a product. (Same thing for the Chain Rule.)

See "The Derivative à la Carathéodory", Stephen Kuhn, The American Mathematical Monthly, Vol. 98, No. 1 (Jan., 1991), pp. 40-44: http://www.jstor.org/pss/2324035 or http://www.mediafire.com/?xyl23spk3ui0cga

  • 0
    I just tried to explain briefly Carathéodory's formulation. I also tried to be as precise and clear as possible. It seems that I failed. Please tell me why. [I think my answer is not completely unrelated to yours.]2010-08-25
2

I'll let someone else take care of proving the product formula; the quotient formula follows from applying the product formula and the chain rule.

Now, remember that you can express $\frac{f(x)}{g(x)}$ as $f(x)\cdot\left(\frac1{g(x)}\right)$. You can now apply the product formula as follows:

$\frac{\mathrm{d}}{\mathrm{d}x}\frac{f(x)}{g(x)}=\left(\frac{\mathrm{d}}{\mathrm{d}x} f(x)\right)\left(\frac1{g(x)}\right)+f(x)\left(\frac{\mathrm{d}}{\mathrm{d}x}\frac1{g(x)}\right)$

The left addend is easily simplified:

$\frac{\mathrm{d}}{\mathrm{d}x}\frac{f(x)}{g(x)}=\frac{f^{\prime}(x)}{g(x)}+f(x)\left(\frac{\mathrm{d}}{\mathrm{d}x}\frac1{g(x)}\right)$

The right addend requires the chain rule:

$\frac{\mathrm{d}}{\mathrm{d}x}\frac{f(x)}{g(x)}=\frac{f^{\prime}(x)}{g(x)}+f(x)\left(-\frac1{g(x)^2}\right)\left(\frac{\mathrm{d}}{\mathrm{d}x}g(x)\right)$

and the formula follows from there.
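
A quick symbolic check, assuming sympy is available, that the expression above does simplify to the stated quotient rule:

```python
# Symbolic check that the last display collapses to the quotient rule (sympy assumed).
import sympy as sp

x = sp.symbols('x')
f, g = sp.Function('f')(x), sp.Function('g')(x)

step = sp.diff(f, x)/g + f*(-1/g**2)*sp.diff(g, x)      # the expression obtained above
quotient_rule = (sp.diff(f, x)*g - f*sp.diff(g, x))/g**2
print(sp.simplify(step - quotient_rule))                 # 0
```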

If you will be doing the proof using limits, just apply the steps you took in proving the product rule, with the terms being $f(x)$ and $\frac1{g(x)}$

2

As per AndrejaKo's comment, I am posting my comment as an answer. Note that there are very interesting and useful answers to this question motivating the formula for the derivative of a product: my answer is simply a technical explanation of a typo in the argument provided in the question.

You should add and subtract the quantity $f(x)g(x+h)$ on the left-hand side of your equation (and the equality you wrote is indeed incorrect). Once you collect terms appropriately, everything should be clear!