
How can I solve the equation $x = -c_1 e^x + c_2 e^{-x}$, where $0 < c_1, c_2 < 1$? We can substitute $t = e^x$, which gives $t \ln(t) + c_1 t^2 - c_2 = 0$ with the same constraint $0 < c_1, c_2 < 1$, but how can I solve that one?
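This is a transcendental equation, so it does not appear to have an elementary closed form; in practice one finds the root numerically. A minimal sketch in Python (my addition, assuming illustrative values $c_1 = 0.5$ and $c_2 = 0.3$): since $f(t) = t \ln(t) + c_1 t^2 - c_2$ satisfies $f(t) \to -c_2 < 0$ as $t \to 0^+$ and $f(t) \to +\infty$ as $t \to \infty$, a root is bracketed and Brent's method applies.

    import numpy as np
    from scipy.optimize import brentq

    # Illustrative constants (assumed for this sketch); any 0 < c1, c2 < 1 works.
    c1, c2 = 0.5, 0.3

    # f(t) = t ln(t) + c1 t^2 - c2, obtained from the substitution t = e^x.
    def f(t):
        return t * np.log(t) + c1 * t**2 - c2

    # f(t) -> -c2 < 0 as t -> 0+ and f(t) -> +infinity as t -> infinity,
    # so the sign change on (0, 10] brackets a root for Brent's method.
    t_root = brentq(f, 1e-12, 10.0)
    x = np.log(t_root)

    # Check: x should satisfy the original equation x = -c1 e^x + c2 e^{-x}.
    print(x, -c1 * np.exp(x) + c2 * np.exp(-x))

The two printed values should agree to solver tolerance, confirming that $x = \ln t$ solves the original equation.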

  • @Gortaur: I did edit the question and added a constraint: both $c_1$ and $c_2$ must be in the range $(0, 1)$. Thanks. (2011-04-18)

1 Answer


Let's take the first derivative of both sides of the equation:

$x' = (-c_1 e^x)' + (c_2 e^{-x})'$

$1 = -c_1 e^x - c_2 e^{-x}$, and now let's take the first derivative of both sides again:

$(1)' = (-c_1 e^x)' - (c_2 e^{-x})'$

$0 = -c_1 e^x + c_2 e^{-x} \Rightarrow c_1 e^x = c_2 e^{-x}$, and substituting $c_2 e^{-x} = c_1 e^x$ back into the original equation:

$x = -c_1 e^x + c_1 e^x$

$x=0$
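To make the mistake concrete (the comment below acknowledges it): substituting the claimed solution $x = 0$ back into the original equation gives

$$0 = -c_1 e^{0} + c_2 e^{-0} = c_2 - c_1,$$

which holds only when $c_1 = c_2$. The flaw is the very first step: here $x$ is an unknown number, not a function, so the equation holds only at the solution point and cannot be differentiated with respect to $x$.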

  • @J.M., you are right... an obvious logical mistake. (2011-09-18)