So I have gotten stumped on something that seems like it should be easy. I am trying to find the derivative shown below. I have scoured the wiki page on matrix derivatives, and I think my answer is correct, but I want to make sure.
So let us say we have a square matrix $\boldsymbol{A}$ and a vector $\boldsymbol{\theta}$. (I am assuming here that the dimensionality is 2 for ease.) So:
$\boldsymbol{A} = \begin{bmatrix} a_{11} & a_{12} \\ a_{21} &a_{22}\end{bmatrix}, \boldsymbol{\theta}= \begin{bmatrix} \theta_0 \\ \theta_1\end{bmatrix}$
I am trying to derive how we get:
$ \frac{\partial (\boldsymbol{A}\boldsymbol{\theta})}{\partial \boldsymbol{\theta}} = \boldsymbol{A} $
So first I tried to 'open up' the matrix-vector product, so I now have the following matrix:
$ \begin{bmatrix} a_{11}\theta_{0} + a_{12}\theta_{1} \\ a_{21}\theta_{0} + a_{22}\theta_{1} \end{bmatrix}_{2\times 1} $
... and this is where I am stuck. How do I show from here that the derivative of the above is indeed equal to $\boldsymbol{A}$? I know that I have to take the partials, but I cannot seem to find a rule governing the order in which those partials should be arranged into rows and columns.
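As a sanity check on whatever convention I settle on, here is a quick symbolic verification with SymPy. It assumes the numerator-layout convention, where row $i$ of the Jacobian collects the partials of the $i$-th output component with respect to $\theta_0, \theta_1$ across the columns:

```python
import sympy as sp

# Symbolic 2x2 matrix A and 2x1 vector theta, matching the setup above
a11, a12, a21, a22 = sp.symbols('a11 a12 a21 a22')
t0, t1 = sp.symbols('theta0 theta1')

A = sp.Matrix([[a11, a12], [a21, a22]])
theta = sp.Matrix([t0, t1])

# Form the product A*theta and take its Jacobian with respect to theta.
# jacobian() uses numerator layout: entry (i, j) is d(A*theta)[i] / d theta[j].
J = (A * theta).jacobian(theta)

print(J)       # Matrix([[a11, a12], [a21, a22]])
print(J == A)  # True
```

So under numerator layout, arranging the partials of each output component along a row recovers $\boldsymbol{A}$ exactly.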