
My book, professor, and friends make this theorem look very significant:

If A is an m by n matrix, then the following are either all true or all false:

  • Each vector b in R^m is a linear combination of the columns of A.

  • For each b in R^m, the equation Ax = b has a solution.

  • The columns of A span R^m.

  • A has a pivot position in every row.

My question is: isn't that a fairly obvious tautology? I mean, the definition of matrix multiplication simply expands the second equation into a linear combination, so why do people get so excited about this?
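To see concretely why the statements collapse into one another, here is a small numpy sketch (my own illustration, not from the question): `Ax` computed by the definition of matrix multiplication is literally the linear combination of the columns, and "pivot in every row" corresponds to `rank(A) == m`, which is exactly when `Ax = b` is solvable for every `b`:

```python
import numpy as np

# A 2x3 matrix whose columns span R^2 (it has a pivot in every row,
# equivalently rank(A) == m == 2).
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 3.0]])
x = np.array([2.0, -1.0, 0.5])

# The definition of matrix multiplication: Ax is the linear
# combination x[0]*col0 + x[1]*col1 + x[2]*col2.
combo = x[0] * A[:, 0] + x[1] * A[:, 1] + x[2] * A[:, 2]
assert np.allclose(A @ x, combo)

# "Pivot in every row"  <=>  rank(A) == m  <=>  Ax = b solvable for all b.
m = A.shape[0]
assert np.linalg.matrix_rank(A) == m

# So an arbitrary b in R^2 is reachable: solve Ax = b (least squares is
# exact here, since the columns span R^2).
b = np.array([5.0, -7.0])
x_sol, *_ = np.linalg.lstsq(A, b, rcond=None)
assert np.allclose(A @ x_sol, b)
```

The four statements of the theorem really are one computation viewed four ways, which is the point of the question.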

Sometimes I get the feeling that, in linear algebra, we're just finding fifty ways to state the same thing in different words, and getting excited even though they all stem from the same definition. :\

(Apologies in advance about the formatting, I'm not sure how the math formatting works here.)

  • Perhaps another point to make is that the equivalence of the first two statements tells you that using matrices is not merely a convenient book-keeping device, as it might seem at first (based on the definitions of the "coefficient matrix" and "augmented matrix" of a system of linear equations), but actually gives you an algebraic structure to play with. – 2011-01-31

1 Answer


I have seen many, many students (and you have probably also seen them yourself) who would answer a definitive 'No' to your first question.

Of course it is obvious to most of us, and if it is obvious to you as well, then fantastic. As Hans said, you can move on to bigger and better things. If you are looking for more difficult and deeper results in linear algebra (I notice you are worried that linear algebra is a study of invariants under change of notation), then perhaps you can look into eigenvalue problems and, if you have studied some calculus, how they relate to the solutions of ODEs.

Linear algebra is one of those fields which crops up all the time. For example, a standard method of solving a very non-linear system of equations involves 'linearising' the system, applying the nice theorems of linear theory, and then carrying the conclusions back to the non-linear situation. While I'm not saying that linear algebra is all one needs to solve PDEs (there is a fair bit of analysis, after all!), I am saying that linear algebra plays a role, and I am trying to convey the far-reaching nature of linear algebra as a discipline.

I do not know what you are looking for as an answer to this question after the excellent comments you have received already. But one thing to keep in mind is what Willie already mentioned: mathematics is an ever-changing discipline, where 'good' definitions are those which allow one to prove 'obvious' facts. If you find this hard to believe, I can give one model example, again from PDEs.

A large class of solutions to second-order parabolic equations remains positive if initially positive. For fourth (and higher) order equations this is not true. The interesting point is the extent to which it fails: the negative part of the solution is very 'small' when compared with the magnitude of the positive part. In certain situations one can make this precise by speaking of 'average positivity' or 'eventual positivity'. Both are variations on the definition of 'positive' which allow us to quantify the behaviour we see in these equations. Before these definitions were found, this was very hard to show. (In fact, the opposite was a conjecture of Hadamard.) But now, working backward from a modern definition of 'almost positive' or 'eventually positive', the proof appears quite simple---almost elementary!

I hope this answer complements the other comments and instills a little bit of that faith Mariano mentioned ;).