According to the Square Root Theorem for tensors (e.g. see here, here, or here), any symmetric positive definite tensor $A$ has a unique symmetric positive definite square root, i.e. a tensor $B$ such that $A=B^2$. (This appears to be a matrix-oriented view, concerned only with rank-2 tensors.)
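For concreteness (this is the standard spectral construction, not something stated in the links): writing the eigendecomposition $A = Q\Lambda Q^T$ with $Q$ orthogonal and $\Lambda$ diagonal with strictly positive entries, the unique SPD square root is
$$ B = Q\,\Lambda^{1/2}\,Q^T, \qquad B^2 = Q\Lambda Q^T = A. $$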
My question is specifically about the inverse metric tensor $g^{ij}$, a symmetric type-(2,0) contravariant tensor; since $g$ is symmetric positive definite (SPD), so is $g^{-1}$.
Thus, pointwise in a given coordinate chart, there exists an SPD $\sigma$ such that $\sigma^2=g^{-1}$. How can I prove that $\sigma$ is a tensor (i.e. derive its transformation law), if it indeed is one?
I know that we have the following transformation law: $$ \bar{g}^{ij} = \frac{\partial \bar{x}^i}{\partial x^a} \frac{\partial \bar{x}^j}{\partial x^b} g^{ab} $$ or, in matrix form, $\bar{g}^{-1} = J^{-1} g^{-1} J^{-T}$, where $J$ is the Jacobian with entries $J^a{}_i = \partial x^a / \partial \bar{x}^i$.
In matrix terms, I can then write $\bar{\sigma}^2 = J^{-1}\sigma\sigma J^{-T}$. Because $\sigma$ is symmetric, this becomes $\bar{\sigma}\bar{\sigma}^T=(J^{-1}\sigma)(J^{-1}\sigma)^T$, and so maybe I can conclude $\bar{\sigma}=J^{-1}\sigma$. But this step feels suspect: factorizations of the form $MM^T$ are not unique, and $J^{-1}\sigma$ is generally not symmetric, whereas $\bar{\sigma}$ must be.
Can anyone shed some light on this? In other words: if $\sigma$ is the matrix field obtained by taking the pointwise matrix square root of the components of $g^{-1}$, do the components of $\sigma$ transform as a tensor?
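For what it's worth, here is a quick numerical probe I can imagine running (pure NumPy; all names are my own, and the matrices are random stand-ins for the components of $g^{-1}$ and the Jacobian at a single point). It tests the candidate law $\bar{\sigma} = J^{-1}\sigma$ against the SPD square root computed independently in each "chart":

```python
# Numerical probe: does the pointwise SPD square root of g^{-1}
# obey the candidate transformation law sigma_bar = J^{-1} sigma?
import numpy as np

def spd_sqrt(M):
    """Unique SPD square root via the spectral decomposition M = Q L Q^T."""
    w, Q = np.linalg.eigh(M)
    return Q @ np.diag(np.sqrt(w)) @ Q.T

rng = np.random.default_rng(0)
n = 3

# A random SPD matrix standing in for the components of g^{-1} at a point.
A = rng.standard_normal((n, n))
g_inv = A @ A.T + n * np.eye(n)

# A generic (non-orthogonal) invertible Jacobian J with entries dx^a/dxbar^i.
J = rng.standard_normal((n, n)) + n * np.eye(n)
J_inv = np.linalg.inv(J)

# Transform g^{-1} as a type-(2,0) tensor: gbar^{-1} = J^{-1} g^{-1} J^{-T}.
g_inv_bar = J_inv @ g_inv @ J_inv.T
g_inv_bar = 0.5 * (g_inv_bar + g_inv_bar.T)  # clean up round-off asymmetry

sigma = spd_sqrt(g_inv)          # sigma computed in the unbarred chart
sigma_bar = spd_sqrt(g_inv_bar)  # sigma_bar computed in the barred chart

# Candidate law sigma_bar = J^{-1} sigma. Note J^{-1} sigma is generally
# not symmetric, while sigma_bar is, so a mismatch is expected here.
print(np.allclose(sigma_bar, J_inv @ sigma))
```

If the printed result is `False` for a generic $J$, that would suggest the componentwise square root does not follow the candidate law, at least not with this naive identification.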