I always have difficulty knowing what actually suffices as a proof and what is obvious enough not to need proving. Here are some problems I have trouble with.
Let $S=\{u_1,u_2,\ldots,u_n\}$ be a finite set of vectors. Prove that $S$ is linearly dependent if and only if $u_1=0$ or $u_{k+1} \in \langle\{u_1,u_2,\ldots,u_k\}\rangle$ for some $k$ $(1 \leq k \lt n)$.
I think it's obvious, since 1. any set containing the zero vector is linearly dependent, and 2. if $u_{k+1}$ is in the mentioned span, then it is a linear combination of other vectors of $S$ and thus $S$ is linearly dependent (though I suppose this only gives one direction of the "if and only if"). So much for my thinking, but how do I appropriately express something like this?
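If it helps, my best attempt at writing step 2 out formally would be something like the following (I'm not sure this is the expected level of detail): if $u_{k+1} \in \langle\{u_1,\ldots,u_k\}\rangle$, then $u_{k+1} = a_1u_1 + \cdots + a_ku_k$ for some scalars $a_i$, and rearranging gives
$$a_1u_1 + \cdots + a_ku_k + (-1)u_{k+1} + 0u_{k+2} + \cdots + 0u_n = 0,$$
which is a nontrivial dependence relation since the coefficient of $u_{k+1}$ is $-1 \neq 0$, so $S$ is linearly dependent.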
Let $M$ be a square upper triangular matrix with nonzero diagonal entries. Prove that the columns of $M$ are linearly independent.
I think: if you regard every column as a vector, each one points in a different direction. But again, I guess this doesn't really count as a mathematical proof. How do I express it, then?
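The only concrete argument I can come up with works through the rows from the bottom up (again, not sure I'm phrasing it properly): suppose $c_1m_1 + \cdots + c_nm_n = 0$, where $m_j$ denotes the $j$-th column of $M$. Since $M$ is upper triangular, the last row of this equation reads
$$c_nM_{nn} = 0,$$
and $M_{nn} \neq 0$ forces $c_n = 0$. With $c_n = 0$, row $n-1$ reads $c_{n-1}M_{n-1,n-1} = 0$, so $c_{n-1} = 0$, and continuing upward gives $c_j = 0$ for all $j$.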
Let $V$ be a vector space over a field of characteristic not equal to two. a) Let $u$ and $v$ be distinct vectors in $V$. Prove that $\{u,v\}$ is linearly independent if and only if $\{u+v,u-v\}$ is linearly independent. b) The same with $\{u,v,w\}$ and $\{u+v,u+w,v+w\}$.
My proof for a):
Assume $\{u,v\}$ is linearly independent, i.e. $a_1u + a_2v = 0$ implies $a_1=a_2=0$.
If $b_1(u+v) + b_2(u-v) = (b_1+b_2)u+(b_1-b_2)v = 0$, then $b_1+b_2 = b_1-b_2 = 0$; adding these gives $2b_1 = 0$, and since the characteristic is not two, $b_1 = 0$ and thus $b_2 = 0$.
Now, do I need to write additional steps to prove the other direction? Something like:
Since $b_1(u+v) + b_2(u-v) = (b_1+b_2)u+(b_1-b_2)v = 0$ only if $b_1 = b_2 = 0$, I cannot choose $a_1,a_2$ not both zero with $a_1u + a_2v = 0$, for I could always split such a relation into some $b_1$ and $b_2$ which were not both zero, and therefore the equation couldn't be zero either.
Or would the first part already be enough?
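For what it's worth, my guess for the other direction (if I'm allowed to use the characteristic assumption again) would be to write $u$ and $v$ in terms of the new vectors: since the characteristic is not two, $u = \frac{1}{2}\big((u+v)+(u-v)\big)$ and $v = \frac{1}{2}\big((u+v)-(u-v)\big)$, so $a_1u + a_2v = 0$ becomes
$$\frac{a_1+a_2}{2}(u+v) + \frac{a_1-a_2}{2}(u-v) = 0,$$
and if $\{u+v,u-v\}$ is linearly independent this forces $a_1+a_2 = a_1-a_2 = 0$, hence $2a_1 = 0$ and so $a_1 = a_2 = 0$.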
Thanks a lot! I have a lot to get used to :P