If the reduced row echelon form of a square matrix $A$ has a row of zeros, then $\det(A) = 0$. I know this is true, but to prove it I was thinking of using cofactor expansion along that row (though I am not sure how to do this), or any other suggested method. I understand it intuitively, but a proper mathematical proof would help.
Determinant of reduced row echelon form
2 Answers
If we have an $n \times n$ matrix $A = (a_{ij})$, then the Laplace expansion formula for the determinant along row $i$ says that $$ \det A = \sum_{j=1}^n a_{ij} C_{ij} $$ where $C_{ij}$ is the $ij$th cofactor. There is a specific formula for $C_{ij}$ in terms of a power of $(-1)$ and the determinant of the matrix you get by deleting the $i$th row and $j$th column of $A$, but that doesn't matter for this question. We know that the matrix has a row of all zeros, so there is an $i$ for which $a_{ij} = 0$ for every $j$. When we expand along that row, each term in the sum is zero, and hence $\det A = 0$. (The same argument applied to the transpose handles a column of zeros.)
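As a quick sanity check of the argument above, here is a small, illustrative sketch (the recursive function and the example matrix are my own, not from the answer) that computes the determinant by Laplace expansion along a chosen row; expanding along a zero row makes every term of the sum vanish:

```python
import numpy as np

def det_by_row_expansion(A, i):
    """Laplace-expand det(A) along row i:
    sum over j of a_ij * (-1)^(i+j) * det(minor_ij)."""
    n = A.shape[0]
    if n == 1:
        return A[0, 0]
    total = 0.0
    for j in range(n):
        # Delete row i and column j to form the (i, j) minor.
        minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
        total += A[i, j] * (-1) ** (i + j) * det_by_row_expansion(minor, 0)
    return total

# A 3x3 matrix whose middle row is all zeros.
A = np.array([[1.0, 2.0, 3.0],
              [0.0, 0.0, 0.0],
              [4.0, 5.0, 6.0]])

# Expanding along the zero row: every a_ij in the sum is 0.
print(det_by_row_expansion(A, 1))  # -> 0.0
```

Each term in the expansion is multiplied by an entry of the zero row, so the whole sum collapses to zero regardless of the cofactor values.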
Suppose you have a square matrix $A$. A row operation on $A$ is the same as multiplying on the left by an elementary matrix ($C_k$ is the identity matrix after applying that row operation to it; elementary matrices are always invertible). Hence if $B$ is the row-reduced version of $A$, obtained by applying operations $C_1, C_2, \ldots, C_n$, we have that $$A = CB$$ where $C = C_1^{-1}C_2^{-1}\cdots C_n^{-1}$ is the product of the inverses of these elementary matrices. Therefore $$\det(A)=\det(C)\det(B) = 0,$$ since $B$ has a row of zeros and so its determinant is zero.
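To see this factorization concretely, here is a minimal sketch with a hypothetical $2 \times 2$ singular matrix (the matrices are my own example, not from the answer). One row operation, $R_2 \to R_2 - 2R_1$, reduces $A$ to a matrix $B$ with a zero row, and $A = CB$ with $C$ the inverse of the corresponding elementary matrix:

```python
import numpy as np

# A singular matrix: its second row is twice its first.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

# The row operation R2 -> R2 - 2*R1, as an elementary matrix.
E1 = np.array([[ 1.0, 0.0],
               [-2.0, 1.0]])

B = E1 @ A              # row-reduced form: bottom row is all zeros
C = np.linalg.inv(E1)   # so A = C @ B

print(B)
print(np.linalg.det(C) * np.linalg.det(B))  # equals det(A), which is 0
```

The elementary matrix has nonzero determinant (here $\det E_1 = 1$), so the zero determinant of the product comes entirely from the zero row of $B$.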