
How does one solve a "differential equation" for $\sigma$ of the form $\sigma(v)w_i(v)={\partial \over \partial v_j}\left[\sigma(v)A_{ij}(v)\right], \quad i=1,\dots,n,$ where the summation convention applies?

Here $w$ and $v$ are $n$-dimensional vectors, $\sigma$ is a scalar function, and $A$ is an invertible $n\times n$ matrix.

Perhaps there is a general solution form? References (links) for the treatment of such an equation are also appreciated.

Thank you.

Added:

In light of drak's suggestion, here is a bit more.

Some thoughts:

It might be friendlier to change "variables" to $A\sigma$.

Is there a more familiar expression for the index notation ${\partial \over \partial v_j}M_{ij}(v)$, such as one in terms of $\nabla$? It seems to me that this takes the divergence of each row of the matrix $M$.
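For what it's worth, a quick SymPy check of this reading, with arbitrary placeholder entries for $M$:

```python
import sympy as sp

v1, v2 = sp.symbols('v1 v2')
v = [v1, v2]

# A placeholder 2x2 matrix field M(v); any smooth entries would do.
M = sp.Matrix([[v1**2, v1*v2],
               [sp.sin(v2), v1 + v2]])

# Index form: (d/dv_j) M_ij, summed over j, for each fixed i.
index_form = [sum(sp.diff(M[i, j], v[j]) for j in range(2)) for i in range(2)]

# Divergence of the i-th row, treating (M_i1, M_i2) as a vector field.
row_divergence = [sp.diff(M[i, 0], v1) + sp.diff(M[i, 1], v2) for i in range(2)]

print(index_form == row_divergence)   # True: it is the row-wise divergence
```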

Some more thoughts: since the function $\sigma$ appears on both sides of the equation, it is likely an exponential.

A simplified version: What if we suppose that $A$ is a constant matrix?

  • If this can help, your equation is equivalent to $\nabla\sigma(v)=\sigma(v)A^{-1}[w(v)-B(v)]$, where $B(v):=(\text{div}A_1(v),\ldots,\text{div}A_n(v))$ with $A_i(v)$ the $i$-th row of $A$. – Mercy, 2012-07-06

1 Answer


$\def\w{{\bf w}} \def\A{{\bf A}} \def\B{{\bf B}} \def\v{{\bf v}} \def\u{{\bf u}} \def\grad{\nabla} \def\darg{{\overleftarrow \nabla}} \def\t{\tau}$There is a notation used in physics that can handle these sorts of operations without indices. The differential equation takes the form $\begin{equation*} (\A \sigma)\darg = \w \sigma,\tag{1} \end{equation*}$ where $(\B\darg)_{i} = \frac{\partial}{\partial v_j} B_{ij}$. Expanding the left-hand side with the product rule gives $(\A\darg + \A\grad)\sigma = \w\sigma,$ so $\A\grad\sigma = (\w - \A\darg)\sigma$, or $\begin{equation*} \frac{1}{\sigma} \grad\sigma = \A^{-1}(\w - \A\darg).\tag{2} \end{equation*}$ This is the equation given by @Mercy in the comments.
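As a sanity check on the product-rule step from (1) to (2), here is a small SymPy sketch in $n = 2$; the entries of $\A$ are arbitrary placeholders and $\sigma$ is left as an unspecified function:

```python
import sympy as sp

v1, v2 = sp.symbols('v1 v2')
v = [v1, v2]
sigma = sp.Function('sigma')(v1, v2)        # unspecified scalar sigma(v)

# Placeholder invertible 2x2 matrix field A(v).
A = sp.Matrix([[1 + v1**2, v1*v2],
               [v2, 2 + sp.cos(v1)]])

def left_div(B):
    """(B <-nabla)_i = d B_ij / d v_j, summed over j."""
    return sp.Matrix([sum(sp.diff(B[i, j], v[j]) for j in range(2))
                      for i in range(2)])

grad_sigma = sp.Matrix([sp.diff(sigma, v1), sp.diff(sigma, v2)])

# Product rule: (A sigma) <-nabla  ==  A grad(sigma) + (A <-nabla) sigma
lhs = left_div(A * sigma)
rhs = A * grad_sigma + left_div(A) * sigma
print(sp.simplify(lhs - rhs))               # Matrix([[0], [0]])
```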

A natural ansatz is $\sigma = e^\t$, since $e^{-\t}\grad e^\t = \grad \t$. Thus, we must solve $\grad \t = \A^{-1}(\w - \A\darg),$ to which we can apply the gradient theorem, provided the vector field on the right-hand side is conservative (curl-free), so that the line integral below is path-independent; otherwise the original system is overdetermined and has no nonvanishing solution. We find $\t(\v) - \t(\v_0) = \int_{\v_0}^{\v} d\u^T\, \A^{-1}(\u)\left(\w(\u) - \A(\u)\darg_\u\right).$ Therefore, $\begin{equation*} \sigma(\v) = \sigma(\v_0)\exp \int_{\v_0}^{\v} d\u^T\, \A^{-1}(\u)\left(\w(\u) - \A(\u)\darg_\u\right).\tag{3} \end{equation*}$ Let's make sure we can unwind this expression. It is shorthand for $\sigma(\v) = \sigma(\v_0) \exp \int_{\v_0}^{\v} d u_i\, (A^{-1}(\u))_{ij}\left(w_{j}(\u) - \frac{\partial}{\partial u_k} A_{jk}(\u)\right).$ Note that the exponent is a scalar: it is the line integral of the vector field $\A^{-1}(\w - \A\darg)$.
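To see (3) in action, here is a minimal SymPy sketch of the $n = 1$ case, where $\A$ reduces to a scalar $a(v)$, path-independence is automatic, and the exponent can be computed in closed form; the particular $a$ and $w$ are arbitrary placeholders:

```python
import sympy as sp

u, x = sp.symbols('u x', positive=True)

# n = 1 instance: A(v) reduces to a scalar a(v), invertible for v > 0.
a = 1 + u**2          # placeholder a(v)
w = u**3              # placeholder w(v)

# Formula (3) with v_0 = 1 and sigma(v_0) = 1:
# the exponent is int_1^x (w(u) - a'(u)) / a(u) du.
tau = sp.integrate((w - sp.diff(a, u)) / a, (u, 1, x))
sigma = sp.exp(tau)

# Verify the original equation sigma*w = d/dv [sigma*a] at v = x.
lhs = sigma * w.subs(u, x)
rhs = sp.diff(sigma * a.subs(u, x), x)
print(sp.simplify(lhs - rhs))   # 0
```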

Special case

Suppose $\A$ and $\w$ are constant and $\v_0 = 0$. The solution is then $\sigma(\v) = \sigma(0) \exp \left(\v^T\A^{-1}\w\right),$ which satisfies the differential equation (1) since $(\A \sigma)\darg = \A\grad \sigma = \A \A^{-1}\w \sigma = \w\sigma$. (We choose the gradient to be a column vector so $\grad(\v^T\A^{-1}\w) = \A^{-1}\w$.)
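A direct SymPy check of this special case, with an arbitrary invertible constant $\A$ and constant $\w$ as placeholders:

```python
import sympy as sp

v1, v2 = sp.symbols('v1 v2')
v = sp.Matrix([v1, v2])

# Placeholder constant data: any invertible A and any w would do.
A = sp.Matrix([[2, 1],
               [0, 3]])
w = sp.Matrix([1, -1])

# sigma(v) = sigma(0) exp(v^T A^{-1} w), with sigma(0) = 1.
sigma = sp.exp((v.T * A.inv() * w)[0])

# Check each component: sum_j d/dv_j [sigma A_ij] = w_i sigma.
for i in range(2):
    lhs = sum(sp.diff(sigma * A[i, j], v[j]) for j in range(2))
    print(sp.simplify(lhs - w[i] * sigma))   # 0 for each i
```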