
I'm reading an explanation of how to solve first-order differential equations. Part of the way through, I have this:

If $\frac{dR}{dx}=RP$, $$\begin{align*} \frac{dR}{R} &= Pdx, \\ \int \frac{dR}{R} &= \int Pdx, \\ \ln R &=\int Pdx +c. \end{align*} $$

Now, this makes me uneasy: I don't like to separate variables. I do it when I integrate with a substitution, but I'm perfectly aware when I'm doing it that I've slipped out of math for a second to manipulate my symbols for convenience, and I could do it more formally if I had to.

That's what I want to do here, but I can't figure it out. Given the first line, how do I solve for $R$ without separating variables?

I should add that $P$ and $R$ are both functions of $x$.
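For what it's worth, a computer algebra system reproduces the textbook's answer. Here is a quick sympy sketch with the illustrative choice $P(x) = x$ (that choice is mine, not from the text):

```python
import sympy as sp

x = sp.symbols('x')
R = sp.Function('R')

# Illustrative choice (an assumption, not from the text): P(x) = x
P = x
sol = sp.dsolve(sp.Eq(R(x).diff(x), P * R(x)), R(x))
print(sol)  # R(x) = C1*exp(x**2/2), i.e. ln R = ∫ P dx + c with ∫ x dx = x²/2
```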

  • Minor note: $\int\frac{dR}{R} = \ln|R| + C$. (2012-09-06)
  • @Joe That's the other thing that makes me uneasy. I knew that it should be that, but this is straight from the textbook. (2012-09-06)

3 Answers


I had this question before too, and I looked through Art of Problem Solving's Calculus textbook at one point to see if they had a more formal way of doing it. I believe this is more or less what I came across:

We can divide both sides by $R$, which gives

$\dfrac{1}{R} \dfrac{dR}{dx} = P$.

This can be integrated with respect to $x$ which gives

$\displaystyle \int \dfrac{1}{R} \dfrac{dR}{dx} dx = \int P dx$.

Then by the Chain Rule, the LHS equals $\displaystyle \int \dfrac{1}{R} dR$, and the remainder of the integration can be carried out. Wikipedia (http://en.wikipedia.org/wiki/Separation_of_variables) mentions that this last step is due to the "substitution rule for integrals."
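The chain-rule step is easy to sanity-check symbolically. A minimal sympy sketch, with $R(x) = x^2 + 1$ as an illustrative (assumed) choice:

```python
import sympy as sp

x = sp.symbols('x', positive=True)

# Illustrative choice (an assumption): R(x) = x**2 + 1, which is positive,
# so ln R = ln(x**2 + 1) with no absolute value needed
R = x**2 + 1

# ∫ (1/R) (dR/dx) dx, integrated with respect to x as in the answer
lhs = sp.integrate(R.diff(x) / R, x)
print(lhs)  # log(x**2 + 1), i.e. ln R (up to the constant of integration)
```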

  • This is exactly what I wanted. Thank you! I knew it was something like this but couldn't figure it out. (2012-09-06)
  • It is by the substitution rule. If you have $\int y^{-1} \, dy$ and you substitute $y = R(x)$, then $dy = R' \, dx$ and so: $$ \int \frac{dy}{y} \equiv \int \frac{R'}{R} \, dx = \ln|R| + C. $$ (2012-09-06)

You could notice that you have the derivative of a function equaling the function itself scaled by a constant.

Then, you know that only a certain class of functions has that property, namely $e^x$. You could then assume that $R = e^{kx}+C$, and plug it into the equation and solve for $k$.

Somehow, this doesn't seem more satisfying, however.
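The ansatz is at least easy to verify symbolically. A sympy sketch for the constant-coefficient case $R' = kR$ (the special case this answer has in mind), which also shows why an additive constant breaks it:

```python
import sympy as sp

x, k, A, C = sp.symbols('x k A C')

# The ansatz R = A*exp(k*x) satisfies R' = k*R identically
R = A * sp.exp(k * x)
print(sp.simplify(R.diff(x) - k * R))    # 0

# Adding a constant breaks it: (A*exp(k*x) + C)' - k*(A*exp(k*x) + C) = -C*k,
# which is nonzero unless C = 0
R2 = R + C
print(sp.simplify(R2.diff(x) - k * R2))  # -C*k
```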

  • Actually, that's a great idea. (2012-09-06)
  • Yep. That's exactly what you would get if you continue the analysis in your OP out to completion =) (2012-09-06)
  • Actually, wait: $kx$ should end up being $\int P\,dx$. (2012-09-06)
  • As Ed says: this doesn't seem any more satisfying. This "great idea" assumes that $F' = kF$. But how do you know this is a valid assumption? You need to show that $F(x) = e^{kx}$ are the only solutions to $F' = kF$. But how do you do that without separating variables? You're back to your original problem. (2012-09-06)
  • Slight note: $R = Ae^{kx}$; the $+C$ is not valid in this case. (2012-09-06)
  • @FlybyNight: you can at least prove that any such function must be infinitely differentiable, and that any analytic solution must be exponential, by considering power series. I can't, off the top of my head, think how to rule out infinitely differentiable non-analytic solutions... (2012-09-07)
  • @BenMillwood: I agree. I wasn't suggesting that there were other solutions. I just wanted to point out that the "solution" was given by assuming the solution. (2012-09-07)
  • @FlybyNight: This is why I say it's not quite satisfying, but it's a common technique, and one that can be particularly frustrating (especially given its common use in ODEs). Things like assuming an ansatz in perturbation theory, assuming convergence of an infinite series, etc. are commonplace. Usually these assumptions have other underlying mechanics which make them more "valid", but assuming a solution and then showing that it is a solution is common. (2012-09-07)
  • @EdGorcenski: Yeah, I know. I was just agreeing with you and emphasising why it was not satisfactory. I wasn't having a go. (2012-09-07)
  • Oh, I know. I just wanted to clarify the commonality for the OP, just in case :) (2012-09-07)
  • @FlybyNight: I mentioned power series; what's wrong with those as a method? (2012-09-07)

Well, you haven't slipped out of math at all. Actually, this kind of Newtonian manipulation of ${\rm d}x$ works very often and can be made rigorous if necessary.

Here, observe that if the differential equation $${R'(x) \over R(x)} = P(x)$$ holds on some interval $[a, b]$, then it is certainly also the case that $$\int_a^y {R'(x) \over R(x)} \,{\rm d} x = \int_a^y P(x) \,{\rm d}x$$ for any $y \in [a,b]$. Now use the substitution $u = \log(R(x))$, ${\rm d}u = {R' \,{\rm d}x \over R}$, to get $$\log(R(y)) - \log(R(a)) = \int_{\log(R(a))}^{\log(R(y))} {\rm d} u = \int_a^y P(x) \,{\rm d}x,$$ again for all $y \in [a,b]$; this can also be translated into the language of indefinite integrals by the fundamental theorem of calculus.

In other words, that juggling of infinitesimals is actually substitution under the integral sign (and in particular it is valid as long as the hypotheses of the substitution theorem are satisfied).
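The definite-integral identity above can be checked symbolically as well. A minimal sympy sketch with the illustrative (assumed) choices $R(x) = x^2 + 1$ and $a = 0$:

```python
import sympy as sp

x, y = sp.symbols('x y', positive=True)

# Illustrative choices (assumptions): R(x) = x**2 + 1 on [0, y], with P = R'/R
R = x**2 + 1
a = 0

lhs = sp.integrate(R.diff(x) / R, (x, a, y))        # ∫_a^y R'/R dx
rhs = sp.log(R.subs(x, y)) - sp.log(R.subs(x, a))   # log R(y) - log R(a)
print(sp.simplify(lhs - rhs))  # 0, as the substitution u = log(R(x)) predicts
```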

  • Why the downvote? (2012-09-06)