3

We learn trigonometric functions in high school, but their treatment is not rigorous. Then in college we learn that they can be defined by power series. I think there is a gap between the two, and I'd like to fill it in the following way.

Consider the upper right quarter of the unit circle, $C = \{(x, y) \in \mathbb{R}^2 : x^2 + y^2 = 1,\ 0 \leq x \leq 1,\ y \geq 0\}$. Let $\theta$ be the arc length of $C$ from $x = 0$ to $x$, where $0 \leq x \leq 1$. By the arc length formula, $\theta$ can be expressed as a definite integral of a simple algebraic function from $0$ to $x$. Clearly $\sin \theta = x$ and $\cos\theta = \sqrt{1 - \sin^2 \theta}$. How, then, do we prove that the Taylor expansions of $\sin\theta$ and $\cos\theta$ are the usual ones?

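For instance, a quick numerical check (a minimal sketch, assuming SciPy's `quad` is available; purely illustrative) confirms that the arc length integral really does invert to the familiar sine:

```python
# Numerical sanity check of the claim sin(theta) = x, where theta is the arc
# length of the quarter circle measured from (0, 1) to (x, sqrt(1 - x^2)).
import math
from scipy.integrate import quad

def theta(x):
    # Arc length from 0 to x: the integral of 1/sqrt(1 - t^2) dt.
    value, _ = quad(lambda t: 1.0 / math.sqrt(1.0 - t * t), 0.0, x)
    return value

for x in [0.1, 0.5, 0.9]:
    t = theta(x)
    print(x, t, math.sin(t))   # math.sin(theta(x)) should reproduce x
```
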
  • 1
    Would anybody like to post an answer using complex analysis? – 2012-08-22

5 Answers

7

This is what I consider the 'standard proof' for $\sin (x)$, and the proof that I would give tomorrow if suddenly asked of me. ($\cos(x)$ is more or less identical)

  1. Recall Taylor's Theorem. If we want to pay a bit more attention to the basis of the proof, then Taylor's Theorem can be proven from the mean value theorem.
  2. Recall that the derivative of $\sin (x)$ is $\cos (x)$, and that the derivative of $\cos(x)$ is $-\sin (x)$. These are often done geometrically.
  3. Note that successive derivatives of $\sin$ look like $\cos(x), -\sin(x), -\cos(x), \sin(x), ...$ cyclically.
  4. Use Taylor's Theorem to expand $\sin(x)$ around $0$.
  5. We have to check that the expansion we've obtained actually converges to $\sin(x)$. To do this, we'll use the Lagrange form of the remainder, which is obtained during the standard proofs of Taylor's Theorem. The trick here is that the $k$th remainder term is always bounded in absolute value by $\dfrac{|x|^{k+1}}{(k+1)!}$, because the successive derivatives are always just a $\sin$ or $\cos$ (up to sign). It's easy to see that for any fixed $x$, as $k \to \infty$, this error term goes to $0$. Thus the expansion does converge to $\sin(x)$ (see the numerical sketch after this list).
  6. Repeat for $\cos(x)$ if desired.
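
To make step 5 concrete, here is a minimal sketch in plain Python (standard library only; the helper name `sin_taylor` is just for illustration) comparing the degree-$k$ Taylor polynomial with `math.sin` and with the bound $\dfrac{|x|^{k+1}}{(k+1)!}$:

```python
# Illustration of step 5: the Taylor polynomial of sin at 0 versus the
# Lagrange remainder bound |x|^(k+1) / (k+1)!.  Standard library only.
import math

def sin_taylor(x, k):
    """Taylor polynomial of sin about 0, keeping terms of degree <= k."""
    return sum((-1) ** n * x ** (2 * n + 1) / math.factorial(2 * n + 1)
               for n in range(0, k // 2 + 1) if 2 * n + 1 <= k)

x = 2.0
for k in (1, 3, 5, 7, 9, 11):
    approx = sin_taylor(x, k)
    error = abs(approx - math.sin(x))
    bound = abs(x) ** (k + 1) / math.factorial(k + 1)
    print(k, approx, error, bound)   # the actual error stays below the bound
```
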
  • 0
    Ok. Perhaps this will be a theme. Again, it can be. (Not just by drawing the picture, but in that a single clever picture will contain all that's necessary for a rigorous proof.) – 2012-08-24
1

Since $x^2 + y^2 = 1$ and $y \geq 0$, we have $y = \sqrt{1 - x^2}$, so

$y' = \frac{-x}{\sqrt{1 - x^2}}$

By the arc length formula,

$\theta = \int_{0}^{x} \sqrt{1 + y'(t)^2}\, dt = \int_{0}^{x} \frac{dt}{\sqrt{1 - t^2}}$

We consider this integral on the interval $[-1, 1]$ instead of $[0, 1]$. Then $\theta$ is a strictly increasing function of $x$ on $[-1, 1]$, so it has an inverse function defined on $[-\frac{\pi}{2}, \frac{\pi}{2}]$, where $\frac{\pi}{2}$ denotes the value of the integral at $x = 1$, i.e. the arc length of the quarter circle. We denote this inverse function by $\sin\theta$, so that $x = \sin\theta$. We then define $\cos\theta = \sqrt{1 - \sin^2 \theta}$ on $[-\frac{\pi}{2}, \frac{\pi}{2}]$.

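As a sanity check on this definition, one can invert the integral numerically and compare with the familiar sine. This is only a sketch, assuming SciPy's `quad` and `brentq`; the bracket is shrunk slightly inside $[-1, 1]$ to avoid the integrable singularity at $t = \pm 1$:

```python
# Construct sin as the inverse of theta(x) = integral_0^x dt / sqrt(1 - t^2),
# as in the text, and compare with math.sin.  Purely illustrative.
import math
from scipy.integrate import quad
from scipy.optimize import brentq

def theta(x):
    value, _ = quad(lambda t: 1.0 / math.sqrt(1.0 - t * t), 0.0, x)
    return value

def sin_from_arclength(th):
    # Invert theta: find the x in (-1, 1) with theta(x) = th.
    # The bracket stays just inside [-1, 1] to keep the integrand finite.
    return brentq(lambda x: theta(x) - th, -0.9999, 0.9999)

for th in [-1.2, -0.5, 0.0, 0.7, 1.4]:
    x = sin_from_arclength(th)
    print(th, x, math.sin(th))   # x should agree with math.sin(th)
```
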
Since $\frac{d\theta}{dx} = \frac{1}{\sqrt{1 - x^2}}$,

$(\sin \theta)' = \frac{dx}{d\theta} = \frac{1}{d\theta/dx} = \sqrt{1 - x^2} = \cos \theta$.

On the other hand, $(\cos\theta)' = \frac{d\sqrt{1 - x^2}}{d\theta} = \frac{d\sqrt{1 - x^2}}{dx} \frac{dx}{d\theta} = \frac{-x}{\sqrt{1 - x^2}} \sqrt{1 - x^2} = -x = -\sin\theta$

Hence

$(\sin\theta)'' = (\cos\theta)' = -\sin\theta$

$(\cos\theta)'' = -(\sin\theta)' = -\cos\theta$

Hence, by induction on $n$,

$(\sin\theta)^{(2n)} = (-1)^n\sin\theta$

$(\sin\theta)^{(2n+1)} = (-1)^n\cos\theta$

$(\cos\theta)^{(2n)} = (-1)^n\cos\theta$

$(\cos\theta)^{(2n+1)} = (-1)^{n+1}\sin\theta$

Since $\sin 0 = 0, \cos 0 = 1$,

$(\sin\theta)^{(2n)}(0) = 0$

$(\sin\theta)^{(2n+1)}(0) = (-1)^n$

$(\cos\theta)^{(2n)}(0) = (-1)^n$

$(\cos\theta)^{(2n+1)}(0) = 0$

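These formulas are easy to check symbolically. The following sketch assumes SymPy is available and uses its built-in sine and cosine, whose derivatives agree with the ones established above; it verifies the pattern for small $n$:

```python
# Symbolic check of the derivative pattern at 0 claimed above, using SymPy.
import sympy as sp

t = sp.symbols('theta')
for n in range(4):
    assert sp.diff(sp.sin(t), t, 2 * n).subs(t, 0) == 0
    assert sp.diff(sp.sin(t), t, 2 * n + 1).subs(t, 0) == (-1) ** n
    assert sp.diff(sp.cos(t), t, 2 * n).subs(t, 0) == (-1) ** n
    assert sp.diff(sp.cos(t), t, 2 * n + 1).subs(t, 0) == 0
print("derivative pattern at 0 verified for n = 0, 1, 2, 3")
```
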
Note that $|\sin\theta| \leq 1$ and $|\cos\theta| \leq 1$, so the Lagrange remainder of the $n$th-order Taylor polynomial at $0$ is bounded by $\frac{|\theta|^{n+1}}{(n+1)!}$, which tends to $0$ as $n \to \infty$ for every fixed $\theta$.

Hence, by Taylor's theorem,

$\sin\theta = \sum_{n=0}^{\infty} (-1)^n \frac{\theta^{2n+1}}{(2n+1)!}$

$\cos\theta = \sum_{n=0}^{\infty} (-1)^n \frac{\theta^{2n}}{(2n)!}$

QED

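As a quick numerical confirmation of the result (plain Python, purely illustrative), one can truncate both series, compare them with `math.sin` and `math.cos`, and check that they satisfy $\sin^2\theta + \cos^2\theta = 1$, as the construction on the circle requires:

```python
# End-to-end check of the two expansions just derived: truncate each series,
# compare with math.sin / math.cos, and confirm sin^2 + cos^2 = 1.
import math

def sin_series(th, terms=20):
    return sum((-1) ** n * th ** (2 * n + 1) / math.factorial(2 * n + 1)
               for n in range(terms))

def cos_series(th, terms=20):
    return sum((-1) ** n * th ** (2 * n) / math.factorial(2 * n)
               for n in range(terms))

for th in [-1.5, -0.3, 0.0, 0.8, 1.5]:
    s, c = sin_series(th), cos_series(th)
    print(th, abs(s - math.sin(th)), abs(c - math.cos(th)), s * s + c * c)
```
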
Remark:

When you consider the arc length of the lemniscate instead of the circle, you encounter $\int_{0}^{x} \frac{dt}{\sqrt{1 - t^4}}$. You may find interesting functions there, just as we did with $\int_{0}^{x} \frac{dt}{\sqrt{1 - t^2}}$. This was the young Gauss's approach, and it led him to the elliptic functions.

0

One way to fill the gap is by using the theory of differential equations. First, the derivatives of $\sin$ and $\cos$ can be established geometrically. For a fully rigorous treatment, one should start from a careful presentation of elementary geometry, define an angle in a precise way, define a function which measures angles (all this is done in this book), then go on to establish the addition formulae and the derivatives. For details you can consult any good book on calculus, such as the one by Apostol. It can then be verified that $\sin$ satisfies the differential equation $y'' + y = 0$ with $y(0) = 0$, $y'(0) = 1$.

Now, solving this differential equation by the power series method, we find that the solution is the infinite series $\sum_{n=0}^\infty \frac{(-1)^n x^{2n+1}}{(2n+1)!}$. By the existence and uniqueness theorem for differential equations there is only one solution, so $\sin (x) = \sum_{n=0}^\infty \frac{(-1)^n x^{2n+1}}{(2n+1)!}$. Furthermore, since power series expansions are unique, this is also the Taylor series.

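To make the power series step explicit: substituting $y = \sum_{n=0}^\infty a_n x^n$ into $y'' + y = 0$ gives the recurrence $a_{n+2} = -\frac{a_n}{(n+1)(n+2)}$, and the initial conditions give $a_0 = 0$, $a_1 = 1$. Here is a minimal sketch (plain Python, exact `Fraction` arithmetic) that generates the coefficients from this recurrence and compares them with those of the sine series:

```python
# Power series method for y'' + y = 0, y(0) = 0, y'(0) = 1:
# the recurrence a_{n+2} = -a_n / ((n + 1) * (n + 2)) with a_0 = 0, a_1 = 1.
from fractions import Fraction
from math import factorial

a = [Fraction(0), Fraction(1)]          # a_0 = y(0) = 0,  a_1 = y'(0) = 1
for n in range(12):
    a.append(-a[n] / ((n + 1) * (n + 2)))

# Compare with the sine series coefficients: 0, 1, 0, -1/3!, 0, 1/5!, ...
for k, coeff in enumerate(a):
    expected = Fraction(0) if k % 2 == 0 else Fraction((-1) ** (k // 2), factorial(k))
    print(k, coeff, coeff == expected)
```
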
0

There is a simplified elementary derivation of the power series without the use of Taylor series. It can be done through the expansion of the multiple angle formula. See the paper by David Bhatt, “Elementary Derivation of Sine and Cosine Series”, Bulletin of the Marathwada Mathematical Society, 9(2) 2008, 10–12.