
How does one solve a "differential equation" for $\sigma$ of the form $$ \sigma(v)w_i(v)={\partial \over \partial v_j}\left[\sigma(v)A_{ij}(v)\right], \quad i=1,\dots,n, $$ where the summation convention applies?

Here $w$ and $v$ are $n$-dimensional vectors, $\sigma$ is a scalar function, and $A$ is an invertible $n\times n$ matrix.

Perhaps there is a general solution form? References (links) for the treatment of such an equation are also appreciated.

Thank you.

Added:

In light of draks's suggestion, here is a bit more.

Some thoughts:

Might it be friendlier to change variables to $A\sigma$?

Is there a more familiar expression for the index notation ${\partial \over \partial v_j}M_{ij}(v)$, such as one in terms of $\nabla$? It would seem that this takes the divergence of each row of the matrix $M$.
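A quick symbolic computation supports this reading. Below is a minimal SymPy sketch (the $2\times 2$ matrix is a made-up example, not from the question) showing that the vector with components ${\partial \over \partial v_j}M_{ij}$ is exactly the row-wise divergence of $M$:

```python
# Illustration with an arbitrary 2x2 matrix: entry i of the result below
# is (d/dv_j) M_ij summed over j, i.e. the divergence of row i of M.
import sympy as sp

v1, v2 = sp.symbols('v1 v2')
v = (v1, v2)
M = sp.Matrix([[v1*v2, v2],
               [v1**2, sp.sin(v2)]])

divergence_of_rows = sp.Matrix([
    sum(sp.diff(M[i, j], v[j]) for j in range(2)) for i in range(2)
])
print(divergence_of_rows)   # Matrix([[v2 + 1], [2*v1 + cos(v2)]])
```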

Some more thoughts: since the function $\sigma$ appears on both sides of the equation, it is likely an exponential.

A simplified version: What if we suppose that $A$ is a constant matrix?

  • Since you are a new user, here are a few things about the site you should know: 1. To get the best possible answers, it is helpful if you say where the problem originated. 2. You will get a better response if you indicate what you have already tried to answer the question yourself. And finally: Welcome to math.SE! (2012-07-03)
  • @draks: Thank you, I have added something to my post. (2012-07-03)
  • Consider the ODE $\frac{d}{dt} p(t)u(t) = q(t)u(t)$ ($u$ is the unknown). How would you solve it? Can you always "change variables" like $v(t)=p(t)u(t)$? (2012-07-03)
  • @Siminore: Hmm, perhaps not... but then what can I do? (2012-07-03)
  • @Siminore: What if we start with $A$ being a constant matrix? (2012-07-03)
  • If $A$ is constant, and since $\sigma$ is a real-valued function, the equation should be easier. (2012-07-03)
  • @Siminore: Thank you. Is there a general form of solution for such an equation? I suspect it is some exponential function, but I am new to matrix/vector differential equations, so I don't know how the solution should look. Could you please help me out? (2012-07-03)
  • Formally it is like the ODE case. But you need to learn the exponentiation of a matrix. (2012-07-03)
  • If this can help, your equation is equivalent to $$\nabla\sigma(v)=\sigma(v)A^{-1}[w(v)-B(v)]$$ where $B(v):=(\text{div}A_1(v),\ldots,\text{div}A_n(v))$ with $A_i(v)$ the $i$-th row of $A$. (2012-07-06)

1 Answer


$\def\w{{\bf w}} \def\A{{\bf A}} \def\B{{\bf B}} \def\v{{\bf v}} \def\u{{\bf u}} \def\grad{\nabla} \def\darg{{\overleftarrow \nabla}} \def\t{\tau}$There is a notation used in physics that can handle these sorts of operations without indices. The differential equation takes the form $$\begin{equation*} (\A \sigma)\darg = \w \sigma,\tag{1} \end{equation*}$$ where $(\B\darg)_{i} = \frac{\partial}{\partial v_j} B_{ij}$. Then $$(\A\darg + \A\grad)\sigma = \w\sigma,$$ so $\A\grad\sigma = (\w - \A\darg)\sigma$, or $$\begin{equation*} \frac{1}{\sigma} \grad\sigma = \A^{-1}(\w - \A\darg).\tag{2} \end{equation*}$$ This is the equation given by @Mercy in the comments.
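To make the rearrangement concrete, here is a small SymPy sketch (my own check with generic symbolic $2\times 2$ data, not part of the original answer; any $n$ works the same way) verifying the product-rule split $(\A\sigma)\darg = (\A\darg)\sigma + \A\grad\sigma$ that leads from (1) to (2):

```python
# Sketch: verify (A*sigma)<-div = (A<-div)*sigma + A*grad(sigma) in 2-D,
# using arbitrary undetermined functions sigma(v) and A_ij(v).
import sympy as sp

v1, v2 = sp.symbols('v1 v2')
v = (v1, v2)
sigma = sp.Function('sigma')(v1, v2)
A = sp.Matrix(2, 2, lambda i, j: sp.Function(f'A{i}{j}')(v1, v2))

# Left-hand side of (1): component i is d/dv_j [sigma * A_ij], summed over j.
lhs = sp.Matrix([sum(sp.diff(sigma * A[i, j], v[j]) for j in range(2))
                 for i in range(2)])

# Product-rule split: (A darg)_i = d/dv_j A_ij, plus A acting on grad(sigma).
Adarg = sp.Matrix([sum(sp.diff(A[i, j], v[j]) for j in range(2))
                   for i in range(2)])
grad_sigma = sp.Matrix([sp.diff(sigma, vi) for vi in v])
rhs = Adarg * sigma + A * grad_sigma

assert all(sp.simplify(e) == 0 for e in (lhs - rhs))
```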

A natural ansatz is $\sigma = e^\t$, since $e^{-\t}\grad e^\t = \grad \t$. Thus, we must solve $$\grad \t = \A^{-1}(\w - \A\darg),$$ to which we can apply the gradient theorem (this presumes the right-hand side is a conservative field, so the line integral below is path independent). We find $$\t(\v) - \t(\v_0) = \int_{\v_0}^{\v} d\u^T\, \A^{-1}(\u)\left(\w(\u) - \A(\u)\darg_\u\right).$$ Therefore, \begin{equation*} \sigma(\v) = \sigma(\v_0)\exp \int_{\v_0}^{\v} d\u^T\, \A^{-1}(\u)\left(\w(\u) - \A(\u)\darg_\u\right).\tag{3} \end{equation*} Let's make sure we can unwind this expression. It is shorthand for $$\sigma(\v) = \sigma(\v_0) \exp \int_{\v_0}^{\v} d u_i\, (A^{-1}(\u))_{ij}\left(w_{j}(\u) - \frac{\partial}{\partial u_k} A_{jk}(\u)\right).$$ Note that the exponent is a scalar. It is the line integral of the vector field $\A^{-1}(\w - \A\darg)$.
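As a sanity check that formula (3) really reproduces solutions, here is a minimal SymPy sketch in one dimension (my own example; $A(v)=1+v^2$ and $w(v)=v$ are arbitrary choices, with $\v_0 = 0$ and $\sigma(\v_0)=1$):

```python
# 1-D sanity check of formula (3): sigma(v) = sigma(0) * exp(line integral).
# A(v) = 1 + v**2 and w(v) = v are arbitrary test data (an assumption).
import sympy as sp

v, u = sp.symbols('v u', positive=True)
A = 1 + u**2
w = u

integrand = (w - sp.diff(A, u)) / A           # A^{-1}(w - A') when n = 1
tau = sp.integrate(integrand, (u, 0, v))      # integral from v0 = 0 to v
sigma = sp.exp(tau)                           # take sigma(0) = 1

# Verify the original equation sigma*w = d/dv [sigma*A] at the endpoint v.
residual = sp.simplify(sigma * w.subs(u, v) - sp.diff(sigma * A.subs(u, v), v))
assert residual == 0
```

Here the integrand works out to $-u/(1+u^2)$, so $\sigma(v) = (1+v^2)^{-1/2}$, which indeed satisfies $\sigma w = (\sigma A)'$.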

Special case

Suppose $\A$ and $\w$ are constant and $\v_0 = 0$. The solution is then $$\sigma(\v) = \sigma(0) \exp \left(\v^T\A^{-1}\w\right),$$ which satisfies the differential equation (1) since $(\A \sigma)\darg = \A\grad \sigma = \A \A^{-1}\w \sigma = \w\sigma$. (We choose the gradient to be a column vector so $\grad(\v^T\A^{-1}\w) = \A^{-1}\w$.)
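For completeness, a short SymPy sketch of this special case (the particular $\A$ and $\w$ below are arbitrary invertible test data, not from the answer, and $\sigma(0)=1$):

```python
# Check sigma(v) = sigma(0) * exp(v^T A^{-1} w) for constant A, w in 2-D.
import sympy as sp

v1, v2 = sp.symbols('v1 v2')
v = sp.Matrix([v1, v2])
A = sp.Matrix([[2, 1], [0, 3]])    # arbitrary constant invertible matrix
w = sp.Matrix([1, -1])             # arbitrary constant vector

sigma = sp.exp((v.T * A.inv() * w)[0, 0])   # with sigma(0) = 1

# Verify sigma * w_i = d/dv_j [sigma * A_ij] (summed over j) for each i.
for i in range(2):
    lhs = sigma * w[i]
    rhs = sum(sp.diff(sigma * A[i, j], (v1, v2)[j]) for j in range(2))
    assert sp.simplify(lhs - rhs) == 0
```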