1

Let $A=(a_{ij})\in M_n$ be an arbitrary matrix and let $$A_1=\begin{pmatrix} a_{11}\\ a_{21}\\ \vdots\\ a_{n1} \end{pmatrix},\quad A_2=\begin{pmatrix} a_{12}\\ a_{22}\\ \vdots\\ a_{n2} \end{pmatrix},\quad \ldots,\quad A_n=\begin{pmatrix} a_{1n}\\ a_{2n}\\ \vdots\\ a_{nn} \end{pmatrix}\in M_{n1}$$ be the columns of $A$. Prove that if the set $\{A_1,A_2,\dots,A_n\}$ is linearly dependent in the vector space $M_{n1}$, then $\det A=0$.

I know this already has an answer here, but I don't understand the OP's solution.

That solution starts from $\lambda_1 A_1 + \ldots + \lambda_n A_n = 0$, where not all $\lambda_i$ are zero. Suppose that $\lambda_1 \neq 0$. Then we get \begin{align*} A_1 = - \frac{\lambda_2}{\lambda_1} A_2 - \ldots - \frac{\lambda_n}{\lambda_1} A_n. \end{align*}

What happens after that with the determinant?

  • 1
    What are you allowed to assume about Det? Can you assume that it is invariant under the addition/subtraction of columns? (2017-01-24)
  • 1
    These (columns of $A$ linearly dependent, $\det(A) = 0$) are two conditions on square matrix $A$ of several that are commonly shown to be equivalent in introductory linear algebra courses; the path to proving equivalence might depend on previous "machinery". In particular it is often shown beforehand that elementary row operations do not affect whether $\det(A)=0$, and [this previous Question](http://math.stackexchange.com/questions/2078943/row-operations-do-not-change-the-dependency-relationships-among-columns) shows that such operations conserve linear dependence relations among columns. (2017-01-24)

3 Answers

4

Since exchanging two columns only changes the sign of the determinant, it is not restrictive to assume that the last column is a linear combination of the previous $n-1$ columns: $$ A_n=\alpha_1A_1+\dots+\alpha_{n-1}A_{n-1}. $$ By multilinearity of the determinant in the last column, you have $$ \det A= \det\begin{bmatrix} A_1 & \dots & A_{n-1} & \sum\limits_{i=1}^{n-1}\alpha_iA_i\end{bmatrix}= \sum_{i=1}^{n-1}\alpha_i \det\begin{bmatrix} A_1 & \dots & A_{n-1} & A_i\end{bmatrix}=0, $$ because each matrix in the sum has two equal columns, and a matrix with two equal columns has zero determinant (this follows from the same property that exchanging two columns changes the sign of the determinant).

With $\begin{bmatrix} v_1 & \dots & v_{n-1} & v_n\end{bmatrix}$ I denote the matrix whose columns are the column vectors $v_1,\dots,v_{n-1},v_n$.
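For a concrete illustration (a small example of my own, not part of the original answer): take $n=3$ and suppose $A_3=2A_1+A_2$. Then $$ \det\begin{bmatrix} A_1 & A_2 & 2A_1+A_2\end{bmatrix} =2\det\begin{bmatrix} A_1 & A_2 & A_1\end{bmatrix} +\det\begin{bmatrix} A_1 & A_2 & A_2\end{bmatrix}=0, $$ since each determinant on the right is that of a matrix with a repeated column.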

  • 0
    I don't understand. What is $\begin{bmatrix} A_1 & \dots & A_{n-1} & A_i\end{bmatrix}$? (2017-01-25)
  • 0
    @Lewis The matrix with the stated columns, I added some explanation. (2017-01-25)
2

I'll give a variant of the proof, hoping you'll understand it better.

Suppose there's a non-trivial linear relation between the columns: $$\lambda_1A_1+\lambda_2A_2+\dots+\lambda_nA_n=0.$$ Say $\lambda_1\ne 0$. By linearity w.r.t. the 1st column, $\;\det( \lambda_1A_1,A_2,\dots,A_n)=\lambda_1\det(A_1,A_2,\dots,A_n)$. Also, the determinant is alternating and linear in each column, so $$ \lambda_1\det A=\det(\lambda_1A_1+\lambda_2A_2+\dots+\lambda_nA_n,A_2,\dots,A_n)= \det(0,A_2,\dots,A_n)=0,$$ whence $\det A=0$.
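To unpack the middle equality (my own expansion of that step, not part of the original answer): by linearity in the first column, $$ \det(\lambda_1A_1+\lambda_2A_2+\dots+\lambda_nA_n,\,A_2,\dots,A_n) =\sum_{i=1}^{n}\lambda_i\det(A_i,\,A_2,\dots,A_n). $$ For $i\ge 2$ the matrix $(A_i,A_2,\dots,A_n)$ has two equal columns, so its determinant vanishes by the alternating property; only the $i=1$ term survives, which is $\lambda_1\det A$. On the other hand, the first column is the zero vector by assumption, so the same determinant also equals $\det(0,A_2,\dots,A_n)=0$.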

  • 0
    Can you please explain this part a bit: $ \lambda_1\det A=\det(\lambda_1A_1+\lambda_2A_2+\dots+\lambda_nA_n,A_2,\dots,A_n)= \det(0,A_2,\dots,A_n)=0$ (2017-01-25)
0

If $A$ is converted to row echelon form using only row-replacement operations and row swaps, then $\det A$ equals, up to sign, the product of the diagonal entries of the echelon form. When the columns of $A$ are linearly dependent, the echelon form has fewer than $n$ pivots, so the diagonal entry in the column corresponding to a free variable is $0$, and hence $\det A = 0$.
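As a small check (an example of my own, not from the answer): take $$ A=\begin{pmatrix}1&2&3\\2&4&6\\1&1&2\end{pmatrix}, $$ whose third column is the sum of the first two. Row reduction ($R_2\to R_2-2R_1$, $R_3\to R_3-R_1$, then swapping $R_2$ and $R_3$) gives $$ \begin{pmatrix}1&2&3\\0&-1&-1\\0&0&0\end{pmatrix}, $$ with a $0$ on the main diagonal in the pivot-free third column, so $\det A=0$ (the row swap only changes the sign, which does not affect a zero determinant).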