I know that for a set of vectors $\{ v_{1}, v_{2}, \ldots , v_{n} \} \subset \mathbb{R}^{n}$ we can show that the vectors form a basis of $\mathbb{R}^{n}$ by showing that the coefficient matrix $A$ (the matrix whose columns are the $v_{i}$) satisfies $\det(A) \neq 0$: this shows that the homogeneous system $Ax = 0$ has only the trivial solution (so the vectors are linearly independent), and that the non-homogeneous system $Ax = b$ is consistent for every vector $b = (b_{1}, b_{2}, \ldots , b_{n}) \in \mathbb{R}^{n}$ (so the vectors span $\mathbb{R}^{n}$).
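For instance, a quick example in $\mathbb{R}^{2}$ of how I understand this (the numbers here are just my own illustration):

$$v_{1} = \begin{pmatrix} 1 \\ 2 \end{pmatrix}, \quad v_{2} = \begin{pmatrix} 3 \\ 4 \end{pmatrix}, \quad A = \begin{pmatrix} 1 & 3 \\ 2 & 4 \end{pmatrix}, \quad \det(A) = 1 \cdot 4 - 3 \cdot 2 = -2 \neq 0,$$

so $v_{1}, v_{2}$ form a basis of $\mathbb{R}^{2}$.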
Intuitively, the same approach seems applicable to sets of polynomials in $\mathbf{P}_{n}$ and sets of matrices in $M_{nn}$. Can someone confirm whether this is correct?
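To illustrate what I have in mind in the polynomial case (this notation is my own): for polynomials $p_{j}(x) = a_{0j} + a_{1j}x + \cdots + a_{nj}x^{n}$, $j = 1, \ldots, n+1$, I would take

$$A = (a_{ij}),$$

the matrix whose $j$-th column holds the coefficients of $p_{j}$ with respect to $\{1, x, \ldots, x^{n}\}$, and check $\det(A) \neq 0$.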
edit:
I think that to make the intuition hold, $A$ must be defined as follows for $M_{nn}$:
Let $M_{1}, M_{2}, \ldots , M_{k}$ be matrices in $M_{nn}$.
To prove that these form a basis for $M_{nn}$, we must show that $c_{1}M_{1} + c_{2}M_{2} + \cdots + c_{k}M_{k} = 0$ has only the trivial solution, and that every $n \times n$ matrix $B$ can be expressed as $c_{1}M_{1} + c_{2}M_{2} + \cdots + c_{k}M_{k} = B$.
So I believe that for $M_{nn}$, $A$ must be defined as an $n^{2} \times n^{2}$ matrix (which forces $k = n^{2}$), where each row is formed from the $(i, j)$ entries taken from $M_{1}, M_{2}, \ldots , M_{k}$ (in that order).
$\text{e.g. } A = \begin{pmatrix} (M_{1})_{1,1} & (M_{2})_{1,1} & \cdots & (M_{k})_{1,1} \\ (M_{1})_{1,2} & (M_{2})_{1,2} & \cdots & (M_{k})_{1,2} \\ \vdots & \vdots & \ddots & \vdots \\ (M_{1})_{n,n} & (M_{2})_{n,n} & \cdots & (M_{k})_{n,n} \end{pmatrix}$
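To make this concrete, here is a small example with $n = 2$ and $k = 4$ (the matrices are just ones I picked to test the construction): taking

$$M_{1} = \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}, \quad M_{2} = \begin{pmatrix} 1 & 1 \\ 0 & 0 \end{pmatrix}, \quad M_{3} = \begin{pmatrix} 0 & 0 \\ 1 & 0 \end{pmatrix}, \quad M_{4} = \begin{pmatrix} 0 & 0 \\ 1 & 1 \end{pmatrix}$$

gives, with rows corresponding to the positions $(1,1), (1,2), (2,1), (2,2)$,

$$A = \begin{pmatrix} 1 & 1 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 1 \\ 0 & 0 & 0 & 1 \end{pmatrix}, \quad \det(A) = 1 \neq 0,$$

so these four matrices would form a basis of $M_{22}$.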
However, I am not sure about this.