3

The following questions are not T/F questions.

I'm trying to understand the linear dependence or independence of these subsets of matrices.

Let $A$ be a $2\times 2$ matrix over the real numbers.

  1. The subset $\{A^2, A^5, A^{11}\}$ is always linearly dependent.
  2. The subset $\{I, A, A^2\}$ is always linearly dependent.
  3. It is possible that the subset $\{A^2, A^5, A^{11}\}$ is linearly independent.
  4. It is possible that the subset $\{I, A^2, A^5, A^{11}\}$ is linearly independent.
  5. It is possible that the subset $\{I, A, A^2\}$ is linearly independent.

$I$ refers to the identity matrix.

Examples would be appreciated.

Thanks in advance!

  • 2
    I suppose you meant "linearly (in)dependent in the vector space of all $\,2\times 2\,$ real matrices", right? What have you done? Do you know something about a matrix characteristic polynomial?2012-10-11
  • 0
    What do you think is the dimension of the space of $2 \times 2$ matrices?2012-10-11
  • 1
    @wj32 That is not really apposite, because the [Cayley-Hamilton theorem](http://en.wikipedia.org/wiki/Cayley%E2%80%93Hamilton_theorem) says that powers of an $n\times n$ matrix $A$ generate an $n$-dimensional subspace of the space of all $n\times n$ matrices.2012-10-11
  • 0
    @MJD: Oops, my bad!2012-10-11

1 Answer

2

Start with the definition of linear dependence. A set of elements $\{v_1,\ldots,v_n\}$ of a vector space $V$ over a field $K$ is linearly dependent if there are scalars $\lambda_1,\ldots,\lambda_n \in K$, *not all zero*, such that $\lambda_1v_1 + \ldots + \lambda_nv_n = 0$. In your case, the vector space is the space $V$ of all linear maps $\mathbb{R}^2 \rightarrow \mathbb{R}^2$, and the field is $\mathbb{R}$. Note that this space is isomorphic to $\mathbb{R}^4$, so $\text{dim}(V) = 4$, and it follows that every set with more than 4 elements must be linearly dependent.
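As a quick numerical illustration of the dimension argument (a small sketch using NumPy; the matrices are arbitrary random examples, not from the question): flattening each $2\times 2$ matrix into a vector of $\mathbb{R}^4$ shows that five such vectors can never be linearly independent.

```python
# Illustration: R^{2x2} is isomorphic to R^4, so any five 2x2 matrices
# are linearly dependent.  Flattening each matrix into a vector of R^4
# and stacking them gives a 5x4 matrix whose rank is at most 4.
import numpy as np

rng = np.random.default_rng(0)
mats = [rng.standard_normal((2, 2)) for _ in range(5)]  # five random 2x2 matrices
M = np.stack([m.flatten() for m in mats])               # shape (5, 4)

# Rank <= 4 < 5, so the five flattened matrices are linearly dependent.
print(M.shape, np.linalg.matrix_rank(M) <= 4)  # (5, 4) True
```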

Now, every matrix $A \in \mathbb{R}^{n\times n}$ has an associated characteristic polynomial $p_A \in \mathbb{R}[x]$ with $\text{deg }p_A = n$, i.e. $p_A(x) = x^n + a_{n-1}x^{n-1} + \ldots + a_1x + a_0$. And, very importantly, you always have $p_A(A) = 0$! (One says that the characteristic polynomial *annihilates* the matrix; this is the Cayley-Hamilton theorem.) Thus, for every matrix $A \in \mathbb{R}^{2\times 2}$ you can find $\lambda_0,\lambda_1,\lambda_2$, not all zero (in fact $\lambda_2 = 1$, since $p_A$ is monic), such that $\lambda_0I + \lambda_1A + \lambda_2A^2 = 0$. This answers questions (2) and (5): $\{I, A, A^2\}$ is always linearly dependent.
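This can be checked numerically; here is a small sketch (the matrix is an arbitrary example) verifying $p_A(A) = 0$ for a concrete $2\times 2$ matrix, using $p_A(x) = x^2 - \operatorname{tr}(A)\,x + \det(A)$:

```python
# Verify the Cayley-Hamilton theorem for a concrete 2x2 matrix.
# For A in R^{2x2}: p_A(x) = x^2 - tr(A)*x + det(A), and p_A(A) = 0.
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])      # arbitrary example matrix

tr = np.trace(A)                # tr(A) = 5
det = np.linalg.det(A)          # det(A) = -2

# p_A(A) = A^2 - tr(A)*A + det(A)*I should be the zero matrix
p_of_A = A @ A - tr * A + det * np.eye(2)
print(np.allclose(p_of_A, np.zeros((2, 2))))  # True
```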

For the other questions, observe that you can also read the above as a way to write $A^2$ as a linear combination of $I$ and $A$, more precisely as $A^2 = -\frac{1}{\lambda_2}(\lambda_0 I + \lambda_1 A)$. You can extend that to any higher power too - for $A^3$ you get $$ \begin{eqnarray} A^3&=&AA^2=-\frac{1}{\lambda_2}(\lambda_0 A + \lambda_1 A^2) =-\frac{1}{\lambda_2}\left(\lambda_0 A - \lambda_1 \frac{1}{\lambda_2}(\lambda_0 I + \lambda_1 A)\right) \\ &=&\left(\frac{\lambda_1^2}{\lambda_2^2}-\frac{\lambda_0}{\lambda_2}\right)A + \frac{\lambda_0\lambda_1}{\lambda_2^2}I \end{eqnarray} $$ The exact coefficients are not important - the important fact is that for every $n$ you can find coefficients $\mu_1,\mu_2 \in \mathbb{R}$ such that $A^n = \mu_1I + \mu_2A$. Thus, the subspace spanned by any set of powers of $A$ has dimension at most 2. It follows that every such set with 3 or more members is linearly dependent, which settles questions (1), (3), and (4).
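To see this in action, here is a small numerical sketch (the matrix $A$ is an arbitrary example): flatten $I, A, A^2, A^5, A^{11}$ into vectors of $\mathbb{R}^4$ and check that together they span a space of dimension at most 2.

```python
# Every power of A lies in span{I, A}, so stacking the flattened
# powers I, A, A^2, A^5, A^11 gives a matrix of rank at most 2;
# in particular any three of them are linearly dependent.
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])     # arbitrary example matrix

powers = [np.linalg.matrix_power(A, k) for k in (0, 1, 2, 5, 11)]
M = np.stack([P.flatten() for P in powers])   # 5 vectors in R^4

print(np.linalg.matrix_rank(M))   # 2 -- everything lies in span{I, A}
```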

  • 0
    I think towards the end, you are working too hard. All you need to know is that every power of $A$ can be written as $cA+d$ for some numbers $c$ and $d$.2012-10-11
  • 0
    @GerryMyerson Isn't that pretty much the same thing? Note that $I$ and $A^1$ aren't always in the set, so even with your approach the answer isn't *totally* obvious...2012-10-11
  • 0
    I don't care whether $I$ and/or $A$ is/are in the set; everything in the set can be written as $cA+d$, which is a 2-dimensional vector space, so, if there are three or more elements in the set, end of story. I certainly don't have to go looking for a polynomial $q$ so that some product is some special polynomial of degree 11.2012-10-11
  • 0
    @GerryMyerson Uh, you're right of course. Shall I fix my answer, or will you post a separate one?2012-10-11
  • 0
    @Guy I suggest reading up on matrices and their characteristic polynomials, and especially the Cayley-Hamilton theorem. Start here: http://en.wikipedia.org/wiki/Cayley%E2%80%93Hamilton_theorem2012-10-11
  • 0
    First of all, thank you all. fgp and Gerry Myerson, I'm trying to understand your answers. These $\lambda$ of the polynomial, in the space which is isomorphic to $\mathbb{R}^4$ - what do I know about them? You mean I can be sure I will have a set of vectors with scalars which are not all zero, and I will get $p(A) = 0$? I'm sorry for the inconvenience, but I'm confused... Also, in your definition of linearly independent vectors all the scalars are zero, right? Is there a chance I would have such a $p(A)$ in which all the scalars are zero? Then it might be linearly independent. "if there are three or more elements in the set, end of story" - why?2012-10-11
  • 0
    Also, fgp and @Gerry Myerson, about "All you need to know is that every power of $A$ can be written as $cA+d$ for some numbers $c$ and $d$." If I take a simple $2\times 2$ matrix $\{(1,2),(3,4)\}$, where $(1,2)$ is the first row and $(3,4)$ is the second row, then its square is $\{(7,10),(15,22)\}$, and I can't find a $cA+d$ which creates it.2012-10-11
  • 0
    @fgp let me know if I understand correctly. "In your case, the vector space is the space $V$ of all linear mappings $\mathbb{R}^2 \to \mathbb{R}^2$, and the field is $\mathbb{R}$. Note that this space is isomorphic to $\mathbb{R}^4$." That means that if I create a transformation $\mathbb{R}^2 \to \mathbb{R}^2$ then the dimension of the image of the transformation must be less than 3 (because the range is $\mathbb{R}^2$), so I can't have more than 2 powers of $A$ in the same subset and have them linearly independent?2012-10-11
  • 0
    @Guy Not quite. The dimension of the *image* will always be at most two because the image is a subset of $\mathbb{R}^2$, but that doesn't help here. The point is that $\{I,A\}$ spans every subspace spanned by powers of $A$, if $A \in \mathbb{R}^{2\times 2}$. Thus, any such subspace has dimension $\leq 2$, and any set of powers with 3 or more elements is therefore linearly dependent. I've updated the answer to include this argument.2012-10-11
  • 0
    @fgp - I want to thank you very much for your help. Your explanation seems very good, but this part is still a bit complicated for me: the way you express $A^2$ in terms of $I$ and $A$ via the characteristic polynomial. I will try to read it again in the morning and see how it goes. Thank you again!2012-10-11
  • 0
    @fgp - one more thing. This means (1) is true too, right? Any three powers of $A$ in the same subset will always be linearly dependent, because I can express every $A^n$ in terms of $I$ and $A$.2012-10-11
  • 0
    @Guy Yes. Regarding your previous comment - I derived the expression for $A^2$ from the characteristic polynomial: simply bring $A^2$ to the other side and divide by $\lambda_2$. Note that this requires $\lambda_2 \neq 0$, which always holds here, since the characteristic polynomial of a $2\times 2$ matrix is monic of degree $2$, i.e. $\lambda_2 = 1$.2012-10-11
  • 0
    Guy, in your example, you want $c,d$ such that $$\pmatrix{7&10\cr15&22\cr}=c\pmatrix{1&2\cr3&4\cr}+d\pmatrix{1&0\cr0&1\cr}$$ I'm sure you can find them.2012-10-11
  • 0
    Hi fgp and @GerryMyerson. I read about Cayley-Hamilton and it helped me a lot. I learnt this material in class but forgot some basics. I have some final questions after clarifying everything. 1. "In your case, the vector space is the space V of all linear mappings ..." - You mean a $2\times 2$ matrix is actually an $\mathbb{R}^2 \rightarrow \mathbb{R}^2$ transformation? So the vector space of $2\times 2$ matrices is isomorphic to $\mathbb{R}^4$. How does that help me? I don't understand this preface. Also, about the "$\lambda_1,\ldots,\lambda_n \in K$" you noted.2012-10-12
  • 0
    In case all of them are zero, it means the matrix $A$ we picked is the zero matrix; otherwise at least one of them must be a nonzero constant --- and in the special case you noted in your last post, $A^n$ is just a multiple of $I$. Right? 2. The simple formula $cA + d$ is actually an end result of @fgp's formula above, which concludes "$A^n = \mu_1I + \mu_2A$". I just want to make sure we are on the same line.2012-10-12
  • 0
    3. Finally, the formula "$A^n = \mu_1I + \mu_2A$" asserts that $\{I,A\}$ spans the subspace containing all powers of $A$, and then if we have three powers in the same subset they must be linearly dependent (the set has more elements than the dimension of their span). Other than that I understand everything you wrote here. Amazing how fast you provided answers. Thank you very much.2012-10-12
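For completeness, the concrete example from the comments can be checked numerically (a small sketch, not part of the original discussion): with $A = \{(1,2),(3,4)\}$, solving $A^2 = cA + dI$ gives $c = 5$ and $d = 2$, which are exactly $\operatorname{tr}(A)$ and $-\det(A)$, as Cayley-Hamilton predicts.

```python
# Find c, d with A^2 = c*A + d*I for the matrix from the comments.
# Cayley-Hamilton predicts c = tr(A) = 5 and d = -det(A) = 2.
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
A2 = A @ A                       # [[7, 10], [15, 22]]

# Solve the linear system [vec(A) vec(I)] * (c, d)^T = vec(A^2)
basis = np.column_stack([A.flatten(), np.eye(2).flatten()])
(c, d), *_ = np.linalg.lstsq(basis, A2.flatten(), rcond=None)

print(np.allclose([c, d], [5.0, 2.0]))         # True
print(np.allclose(A2, c * A + d * np.eye(2)))  # True
```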