If two matrices $A$ and $B$ commute, do all the rules for real numbers $a$ and $b$ apply to the matrices?
For example, if $AB=BA$ then:
$(A+B)^2=A^2 + 2AB + B^2$
$A^3 - B^3 = (A-B)(A^2+AB+B^2)$
and so on... If the matrix $A$ is invertible, then is $A^m A^n = A^{m+n}$, where $m,n$ are integers?
Commutative matrix
-
(1) The evaluation map $\Bbb C[x,y]\to \Bbb C[A,B]$ will be a ring homomorphism if $A,B$ commute. (I think this is the correct algebraic language for what you speak of; see the sketch after these comments.) (2) This holds very generally, not just for $A$ invertible (i.e. in arbitrary monoids). – 2012-08-12
-
@anon: note that $m$ and/or $n$ might be negative. – 2012-08-12
-
[Oh, right; then in arbitrary groups.] – 2012-08-12
-
Yes, you can show it by direct computation for the first identity. – 2012-08-12
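The first comment's point can be illustrated concretely: substituting commuting matrices into a polynomial factorization such as $x^2 - y^2 = (x+y)(x-y)$ preserves it, precisely because evaluation at a commuting pair is a ring homomorphism. A minimal sketch in Python (the diagonal matrices are arbitrary commuting choices):

```python
import numpy as np

# Diagonal matrices always commute (illustrative choice).
A = np.diag([2, 3])
B = np.diag([5, 7])
assert np.array_equal(A @ B, B @ A)

# The factorization x^2 - y^2 = (x + y)(x - y) survives the
# substitution x -> A, y -> B because AB = BA.
assert np.array_equal(A @ A - B @ B, (A + B) @ (A - B))
```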
2 Answers
"Do all rules for real numbers apply to the matrix?"
If by all rules for real numbers, you mean finite factorization laws like in your two examples, then yes. How might we prove such a thing? Let's consider $(A + B)^2$ and $(A+B)^3$.
$(A + B)^2 = A^2 + AB + BA + B^2$, and as $AB = BA$ we can write this as $A^2 + 2AB + B^2$. Similarly, once we write out $(A^2 + 2AB + B^2)(A + B)$, we can simply commute the matrices to get that $(A+B)^3 = A^3 + 3A^2 B + 3AB^2 + B^3$, and so on.
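Here is a quick numerical sanity check of that expansion, a sketch with arbitrarily chosen matrices (one commuting pair, one non-commuting pair):

```python
import numpy as np

# A commuting pair: both are I plus a multiple of the same nilpotent matrix.
A = np.array([[1, 2], [0, 1]])
B = np.array([[1, 3], [0, 1]])
assert np.array_equal(A @ B, B @ A)
assert np.array_equal((A + B) @ (A + B), A @ A + 2 * (A @ B) + B @ B)

# A non-commuting pair breaks the identity: AB and BA no longer
# merge into the single cross term 2AB.
C = np.array([[0, 1], [0, 0]])
D = np.array([[0, 0], [1, 0]])
assert not np.array_equal(C @ D, D @ C)
assert not np.array_equal((C + D) @ (C + D), C @ C + 2 * (C @ D) + D @ D)
```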
If by all rules for real numbers you actually mean all rules for real numbers, then the answer is no. For example, it's not true that a matrix $A$ satisfies the trichotomy $A > 0$, $A = 0$, or $A < 0$.
"If matrix $A$ is invertible, then is $A^m A^n = A^{m+n}$ for $m,n \in \mathbb{Z}$?"
Let's look at a case. Suppose $m = 2, n = -3$. Then $A^2 A^{-3}$ makes sense. And $A^2A^{-3} = A(AA^{-1})A^{-2} = (AA^{-1})A^{-1} = A^{-1} = A^{2 + (-3)}$. Do you see how this proof might be expanded? In fact, for a general matrix $B$, $B^m B^n = B^{m+n}$ if $m,n > 0$, so the important detail here is whether or not $A^{-1}$ makes sense to write down.
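That cancellation can be spot-checked numerically. The sketch below uses an arbitrary invertible matrix and NumPy's `matrix_power`, which accepts negative exponents for invertible inputs:

```python
import numpy as np
from numpy.linalg import matrix_power

# An invertible matrix (determinant 1), chosen for illustration.
A = np.array([[2.0, 1.0], [1.0, 1.0]])

# matrix_power inverts first for negative exponents, so A^m A^n
# and A^(m+n) can be compared directly.
m, n = 2, -3
lhs = matrix_power(A, m) @ matrix_power(A, n)
rhs = matrix_power(A, m + n)  # here A^(-1)
assert np.allclose(lhs, rhs)
```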
-
I suspect that "all rules" means "all identities expressed using the operations +,-,*,/". – 2012-08-12
-
... in which case $(a^2+1)/(a^2+1) = 1$ is one that isn't, i.e. for real numbers $a^2+1$ is always invertible, but for matrices (where $1$ is interpreted as the identity matrix $I$) it isn't. – 2012-08-12
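For a concrete instance of that counterexample, take $A$ to be the 90-degree rotation matrix, so that $A^2 = -I$ and hence $A^2 + I = 0$, which is certainly not invertible:

```python
import numpy as np

# Rotation by 90 degrees: A^2 = -I, so A^2 + I is the zero matrix.
# For a real number a, a^2 + 1 is never zero; this matrix analogue is.
A = np.array([[0, -1], [1, 0]])
print(A @ A + np.eye(2, dtype=int))  # [[0 0], [0 0]] -- singular
```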
I presume that you have not already studied abstract algebra (if you have, then please pause now and consider what you've learned about polynomial and matrix rings, groups, etc.).
That said, consider what laws you use when proving these identities for real numbers, e.g. the associative and commutative laws for addition, multiplication, etc. Any proof of an identity for reals will also hold true for matrices as long as it only uses laws that hold true for both. For an analogous, less well-known, but enlightening example, see my post on the Freshman's Dream for gcds and ideals, i.e. $\rm\:gcd(A^n,B^n) = gcd(A,B)^n,\:$ or $\rm\:A^n + B^n = (A + B)^n\:$ in additive notation. The proof uses the same arithmetic laws as for integers, combined with a couple laws peculiar to gcds.
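For integers, the gcd form of the Freshman's Dream can be spot-checked directly; a minimal sketch (the test triples are arbitrary):

```python
from math import gcd

# Check gcd(a^n, b^n) == gcd(a, b)^n on a few integer triples.
for a, b, n in [(12, 18, 2), (8, 20, 3), (7, 13, 5)]:
    assert gcd(a**n, b**n) == gcd(a, b)**n
```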
If you study abstract algebra you will encounter these common laws abstracted into the ubiquitous theories of algebraic structures (groups, rings, domains, fields, etc.).