
I know that for a set of vectors $\{ v_{1}, v_{2}, \ldots , v_{n} \} \subset \mathbb{R}^{n}$ we can show that the vectors form a basis of $\mathbb{R}^{n}$ by showing that the coefficient matrix $A$ (the matrix whose columns are these vectors) satisfies $\det(A) \neq 0$, because this shows the homogeneous system has only the trivial solution, and the non-homogeneous system is consistent for every vector $(b_{1}, b_{2}, \ldots , b_{n}) \in \mathbb{R}^{n}$.
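For a concrete check of this in $\mathbb{R}^{3}$, here is a minimal sketch (assuming NumPy; the three vectors are my own illustration, not from the question):

```python
import numpy as np

# Columns of A are the candidate basis vectors v1, v2, v3 in R^3.
A = np.column_stack([
    [1.0, 0.0, 2.0],   # v1
    [0.0, 1.0, 1.0],   # v2
    [1.0, 1.0, 0.0],   # v3
])

# det(A) != 0  <=>  Ax = 0 has only the trivial solution
#              <=>  the columns form a basis of R^3.
print(np.linalg.det(A))   # -3.0 (up to floating point), nonzero => basis
```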

Intuitively, this concept seems applicable to all polynomials in $\mathbf{P}_{n}$ and all matrices in $M_{nn}$. Can someone validate this?

edit:

I think to make the intuition hold, $A$ must be defined as follows in $M_{nn}$:

Let $M_{1}, M_{2}, \ldots , M_{k}$ be matrices in $M_{nn}$.

To prove these form a basis for $M_{nn}$, we must show that $c_{1}M_{1} + c_{2}M_{2} + \cdots + c_{k}M_{k} = 0$ has only the trivial solution, and that every $n \times n$ matrix $B$ can be expressed as $c_{1}M_{1} + c_{2}M_{2} + \cdots + c_{k}M_{k} = B$.

So I believe that for $M_{nn}$, $A$ must be defined as an $n^{2} \times k$ matrix (with $k = n^{2}$ if the $M_{i}$ are to form a basis) where each row vector is formed from the $(i, j)$ entries taken from $M_{1}, M_{2}, \ldots , M_{k}$ (in that order).

$\text{e.g. } A = \begin{pmatrix} M_{1_{1,1}} & M_{2_{1,1}} & \cdots & M_{k_{1,1}} \\ M_{1_{1,2}} & M_{2_{1,2}} & \cdots & M_{k_{1,2}} \\ \vdots & \vdots & \ddots & \vdots \\ M_{1_{n,n}} & M_{2_{n,n}} & \cdots & M_{k_{n,n}} \end{pmatrix}$

However, I am not sure about this.
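As a quick numerical sanity check of this construction, here is a minimal sketch (assuming NumPy; the four $2 \times 2$ matrices are my own example, with $k = n^{2} = 4$):

```python
import numpy as np

# Four candidate basis matrices for M_22 (k = n^2 = 4).
Ms = [
    np.array([[1.0, 0.0], [0.0, 0.0]]),
    np.array([[1.0, 1.0], [0.0, 0.0]]),
    np.array([[0.0, 0.0], [1.0, 0.0]]),
    np.array([[0.0, 1.0], [0.0, 1.0]]),
]

# Flatten each M_i (row-major) into a column; row (i, j) of A then
# collects the (i, j) entries of M_1, ..., M_k, as described above.
A = np.column_stack([M.flatten() for M in Ms])   # shape (4, 4)

print(np.linalg.det(A))   # 1.0, nonzero => the M_i form a basis of M_22
```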


3 Answers


You should be aware that for any given $n$ there's an essentially unique real vector space of dimension $n$, in the sense that any two are isomorphic (although non-canonically). For instance, the space of real polynomials of degree $\leq n$ is a real vector space of dimension $n+1$, hence isomorphic to ${\Bbb R}^{n+1}$; the space $M_n({\Bbb R})$ of square $n\times n$ real matrices is a real vector space of dimension $n^2$, hence isomorphic to ${\Bbb R}^{n^2}$; and so on. Once you prove a particular statement for ${\Bbb R}^{n}$, you have proved it for ALL real vector spaces of dimension $n$.
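For instance, in the polynomial case one concrete (non-canonical) choice of isomorphism is the coefficient map:

$$\varphi : \mathbf{P}_n \to {\Bbb R}^{n+1}, \qquad a_0 + a_1 x + \cdots + a_n x^n \mapsto (a_0, a_1, \ldots, a_n),$$

which is linear and bijective, so any basis question in $\mathbf{P}_n$ transfers to one in ${\Bbb R}^{n+1}$.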

The same holds, more generally, for vector spaces over any field other than $\Bbb R$.

  • @Rob: it is correct as long as you take as the standard basis of the space of matrices the matrices which have a $1$ in one place and $0$ in all the others (which is what you seem to be implicitly doing). (2011-03-23)

I'm not exactly sure what you would mean by the coefficient matrix of a set of vectors in $P_n$ or $M_{nn}$. If you mean, e.g., letting $f \in P_n$ be $a_0 + a_1x + \ldots + a_nx^n$, identifying this with the vector $(a_0, a_1, \ldots, a_n)$, and then considering the matrix formed by $n+1$ such vectors, then they do form a basis if and only if the determinant of the matrix is nonzero. The reason, of course, is that I've implicitly written down the isomorphism between $P_n$ and $\mathbb{R}^{n+1}$. The same goes for $M_{nn}$ once I make some corresponding isomorphism with $\mathbb{R}^{n^2}$.
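As a concrete illustration of this identification, here is a minimal sketch (assuming NumPy; the three polynomials in $P_2$ are my own example):

```python
import numpy as np

# Identify p(x) = a0 + a1*x + a2*x^2 in P_2 with (a0, a1, a2) in R^3.
# Candidate basis: 1 + x,  x + x^2,  1 + x^2.
coeff_vectors = [
    [1.0, 1.0, 0.0],   # 1 + x
    [0.0, 1.0, 1.0],   # x + x^2
    [1.0, 0.0, 1.0],   # 1 + x^2
]

A = np.column_stack(coeff_vectors)
print(np.linalg.det(A))   # 2.0, nonzero => these polynomials form a basis of P_2
```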


In general, a square matrix over a commutative ring is invertible if and only if its determinant is a unit in that ring. (Wikipedia)

By a unit is meant an element which has a multiplicative inverse in the ring. Since zero is the only element of $\mathbf{R}$ without a multiplicative inverse, a matrix over $\mathbf{R}$ is invertible if and only if its determinant is nonzero (note: every field is a ring).

Since the invertibility of a matrix implies that its constituent column vectors span the space, the columns of an $n \times n$ matrix over a field $\mathbf{F}$ are a basis for $\mathbf{F}^n$ iff the determinant of the matrix is a unit in the ring, i.e. nonzero.
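To see the unit condition do real work over a ring that is not a field, here is a minimal sketch over $\mathbb{Z}$ (assuming SymPy; the two matrices are my own example):

```python
from sympy import Matrix

# Over the ring Z, the units are exactly +1 and -1.
M1 = Matrix([[2, 1], [1, 1]])   # det = 1, a unit in Z
M2 = Matrix([[2, 0], [0, 1]])   # det = 2, not a unit in Z

print(M1.det(), M1.inv())   # 1, inverse [[1, -1], [-1, 2]] stays in Z
print(M2.det(), M2.inv())   # 2, inverse [[1/2, 0], [0, 1]] leaves Z:
                            # invertible over Q, but not over Z
```

Accordingly, the columns of $M_2$ do not generate $\mathbb{Z}^2$ (they reach only vectors with even first coordinate), even though $\det(M_2) \neq 0$.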