I'm having trouble working out the best way to check whether a set of vectors (written as column matrices) is linearly independent.
When the set can be arranged into a square matrix, such as $ \begin{bmatrix} 1 \\ 0\end{bmatrix}, \begin{bmatrix} 1 \\ 3\end{bmatrix} $, the determinant $ \det\begin{bmatrix} 1 & 1 \\ 0 & 3\end{bmatrix} $ can be used: a non-zero value means the vectors are linearly independent, otherwise they are dependent.
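(For concreteness, this is how I've been checking the square case numerically; just a rough sketch using numpy with the example vectors above, nothing more.)

```python
import numpy as np

# Stack the two column vectors side by side into a square matrix.
A = np.column_stack(([1, 0], [1, 3]))

# A non-zero determinant means the columns are linearly independent.
print(np.linalg.det(A))  # 3.0 -> independent
```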
However, I'm not sure how to handle a set that does not form a square matrix, so the determinant can't be used, such as $ \begin{bmatrix} 1 \\ 0 \end{bmatrix} , \begin{bmatrix} 0 \\ 1 \end{bmatrix} , \begin{bmatrix} 1 \\ 3 \end{bmatrix} $ or $ \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix} , \begin{bmatrix} 1 \\ 2 \\ 0 \end{bmatrix} $.
Another rule that seems relevant is that if there are scalars $ c_1, c_2, \dots, c_n $, not all zero, such that $ c_1 \mathbf v_1 + c_2 \mathbf v_2 + \dots + c_n \mathbf v_n = \mathbf 0 $, then the vectors are linearly dependent. However, applying this rule to the example $ \begin{bmatrix} 1 \\ 0 \end{bmatrix} , \begin{bmatrix} 0 \\ 1 \end{bmatrix} , \begin{bmatrix} 1 \\ 3 \end{bmatrix} $ produces a system of two equations in three unknowns, and I'm not sure what to do with that.
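(My best attempt so far is to check whether that homogeneous system has a non-trivial solution by comparing the rank of the matrix of columns to the number of vectors; a quick sketch with numpy, assuming that's a valid approach:)

```python
import numpy as np

# Columns are the three vectors; we want solutions c of A @ c = 0.
A = np.column_stack(([1, 0], [0, 1], [1, 3]))

# If rank < number of columns, a non-trivial solution exists,
# i.e. the vectors are linearly dependent.
rank = np.linalg.matrix_rank(A)
print(rank, A.shape[1])  # 2 3 -> dependent
```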
What is the proper way of determining whether a set of vectors is linearly dependent or independent when the set does not form a square matrix?