Let $R$ be a unital commutative ring, and let $J$ be an ideal of $M_2(R)$. Then there exists an ideal $I$ in $R$ such that $J=M_2(I)$.

Proof:

Let $I:=\{a\in R: a\text{ appears as the }(1,1)\text{ entry of some matrix in }J \}$. Then $I$ is not empty, since $J$ is not empty. Let $A=\begin{bmatrix} a_{1,1} & b\\ c & d \end{bmatrix}\in J$, with $b,c,d\in R$, and let $B=\begin{bmatrix} r_1 & r_2\\ r_3 & r_4 \end{bmatrix}$, with $r_k\in R$ for $1\le k \le 4$. Then $AB=\begin{bmatrix} a_{1,1}r_1+br_3 & a_{1,1}r_2 + br_4\\ cr_1+dr_3 & cr_2+dr_4 \end{bmatrix}\in J$. Similarly, $BA=\begin{bmatrix} r_1a_{1,1}+r_2c & r_1b+r_2d\\ r_3a_{1,1}+r_4c & r_3b+r_4d \end{bmatrix}\in J$. Thus $a_{1,1}r_1+br_3$ and $r_1a_{1,1}+r_2c$ are in $S$, where $S$ is a subring of $R$ such that $J = M_2(S)$ (is it true that such a subring $S$ must exist if $J$ is an ideal of $M_2(R)$?). But this implies that $a_{1,1}r_1$ and $r_1a_{1,1}$ are in $S$, and that $r_2c$ and $br_3$ are in $S$ (since $S$ is a ring). But then $S$ is an ideal of $R$, since $a_{1,1}\in I$ and $r_k\in R$ is arbitrary. Since $a_{1,1}\in I$, we get $S=I$. All other entries of matrices in $J$ must also be in $S=I$; hence $J=M_2(I)$.

But if my proof is correct, then why do we need $R$ to be unital and commutative?

Update: I see that $I$ may be a subring of $S$ in my proof, so the proof is incorrect.

  • Just for the record, the result you want to prove is a particular case of Theorem 2.24 of McCoy's "The Theory of Rings". As for your question, $R$ doesn't have to be commutative, but it does have to be unital; all of this is proved in the aforementioned theorem. (2017-02-19)
  • @Xam Why is my proof incorrect? (2017-02-19)
  • @sequence I have spent many comments trying to tell you that you have not proven $J=M_2(I)$. (2017-02-19)
  • @Xam Unfortunately, I don't have that book. Googling didn't reveal much. (2017-02-20)

1 Answer


why do we need $R$ to be unital and commutative?

Firstly, we do not need $R$ to be commutative. This is perfectly true for noncommutative rings.

However, what you have written is far from a clear proof. No doubt you intended $A$ to be chosen from $J$ rather than to have arbitrary entries $b,c,d\in R$; notice that you lose control of what those entries are right at that moment. You only have control over $r_1, r_2, r_3, r_4$. This does indeed allow you to prove that $I$ is an ideal. Then you say

Also, $a_{1,1}\in I$, so $S=I$.

I'm not sure what you meant by $S$, since it does not really serve a purpose. It certainly looks like (and would make sense if) you are actually thinking of $I$ again. Then

All other entries of $R$ must also be in $I$, hence, $J=M_2(I)$.

No idea what you mean. Obviously not all elements of $R$ must be in $I$. Do you mean entries of the matrix $A$ are in $I$? This is far from obvious. Why should that be?

If the ring had an identity element then I would use permutation matrices on the left and right of $A$ to move entries of $A$ to the upper left hand corner, so that those entries would fall in $I$. Then we could be sure that $J\subseteq M_2(I)$.
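Concretely, here is a sketch of that step, assuming $1\in R$ and writing $P$ for the transposition permutation matrix: for $A=\begin{bmatrix}a&b\\c&d\end{bmatrix}\in J$,

$$P = \begin{bmatrix}0&1\\1&0\end{bmatrix}, \qquad
PA = \begin{bmatrix}c&d\\a&b\end{bmatrix}, \qquad
AP = \begin{bmatrix}b&a\\d&c\end{bmatrix}, \qquad
PAP = \begin{bmatrix}d&c\\b&a\end{bmatrix}.$$

Since $J$ is a two-sided ideal, all three products lie in $J$, and their $(1,1)$ entries are $c$, $b$, and $d$ respectively, so every entry of every matrix in $J$ lies in $I$.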

Then finally, how do we know $M_2(I)\subseteq J$? If the ring had an identity, then I would take a matrix in $J$ whose upper left-hand entry is a given $a_{11}\in I$ and multiply it on the left and right by matrices that zero out all the other entries. Then I could further show that this one nonzero entry can be moved into every other position. All of these matrices are contained in $J$. It would then be clear that $M_2(I)\subseteq J$, and we would have equality.
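For instance, again assuming an identity and writing $E_{ij}$ for the matrix units: if $A=\begin{bmatrix}a_{11}&b\\c&d\end{bmatrix}\in J$, then

$$E_{11} A E_{11}
= \begin{bmatrix}1&0\\0&0\end{bmatrix}
  \begin{bmatrix}a_{11}&b\\c&d\end{bmatrix}
  \begin{bmatrix}1&0\\0&0\end{bmatrix}
= \begin{bmatrix}a_{11}&0\\0&0\end{bmatrix}\in J,
\qquad
E_{21}\begin{bmatrix}a_{11}&0\\0&0\end{bmatrix}E_{12}
= \begin{bmatrix}0&0\\0&a_{11}\end{bmatrix}\in J.$$

Multiplying by $E_{21}$ on the left alone (or by $E_{12}$ on the right alone) puts $a_{11}$ in either off-diagonal position, and sums of these four types of matrices give every element of $M_2(I)$.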


I should have thought of this simple example a long time ago. Let $R$ be a ring with trivial multiplication (all products zero) that has two unequal additive subgroups $A$ and $A'$. Then $\begin{bmatrix}A&A\\A&A'\end{bmatrix}$ is obviously an ideal of $M_2(R)$ (because the multiplication becomes trivial in the matrix ring too). By construction, this ideal is not of the form $M_2(I)$ for any ideal $I$: the set of $(1,1)$ entries is $A$ while the set of $(2,2)$ entries is $A'$, and these differ. So you can see that the theorem does not hold for all rings without identity.
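A quick computational sanity check of this counterexample on the smallest instance (a sketch; the helper names `mat_add` and `mat_mul` are mine): take $R=\mathbb{Z}/2$ as an additive group but redefine all products to be zero, with $A=\{0\}$ and $A'=\{0,1\}$.

```python
from itertools import product

# R = Z/2 as an additive group, with *trivial* multiplication (a*b = 0).
A, Ap = {0}, {0, 1}          # unequal additive subgroups of R

# A 2x2 matrix is encoded as a flat tuple (a11, a12, a21, a22).
J = {(a, b, c, d) for a in A for b in A for c in A for d in Ap}
M2R = set(product({0, 1}, repeat=4))

def mat_add(M, N):
    return tuple((x + y) % 2 for x, y in zip(M, N))

def mat_mul(M, N):
    # Every entry of a product is a sum of products in R, all of which are 0.
    return (0, 0, 0, 0)

# J is an additive subgroup, closed under multiplication by M2(R) on both sides:
assert all(mat_add(M, N) in J for M in J for N in J)
assert all(mat_mul(M, N) in J and mat_mul(N, M) in J for M in J for N in M2R)

# The only ideals of R are {0} and {0,1}, and J is neither M2({0}) nor M2({0,1}):
assert J != {(0, 0, 0, 0)} and J != M2R
print("counterexample verified")  # prints: counterexample verified
```

Here $J$ has exactly two elements, strictly between $M_2(\{0\})$ and $M_2(R)$, which is exactly why no $M_2(I)$ can match it.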

That's not to say the theorem fails in all rings without identity. I think everything works in rings which have local identities, for example. (This means that for every $a\in R$, there exists an idempotent $e\in R$ such that $ea=a$. One such ring is $\oplus_{i=1}^\infty F$ for a field $F$.)
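For concreteness, the local-identity condition in $\oplus_{i=1}^\infty F$ can be sketched as follows (my own helper names; elements of the direct sum with $F=\mathbb{Z}/2$ are modelled as finitely supported dicts):

```python
def mul(a, b):
    # Coordinatewise product in ⊕F with F = Z/2; keep only nonzero coordinates.
    return {i: 1 for i in a.keys() & b.keys() if (a[i] * b[i]) % 2}

def local_identity(a):
    # The idempotent supported exactly where a is nonzero: e*e = e and e*a = a.
    return {i: 1 for i in a}

a = {0: 1, 3: 1, 7: 1}      # an element with finite support
e = local_identity(a)
print(mul(e, e) == e, mul(e, a) == a)  # prints: True True
```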

  • But how do we know that $J$ contains units of $M_2(R)$? (2017-02-19)
  • @sequence I'm not sure if you mean invertible elements of $M_2(R)$ or the matrices that have a $1$ in exactly one entry. Either way, $J$ doesn't have to contain things of that type. Why would that ever come into play? In what I'm talking about above, it is only important that you can choose $1$ when choosing $r_1, r_2, r_3, r_4$. (2017-02-19)
  • Can you please check my proof again? I've clarified it. It must be true that if $S\le R$ then $S=I$. So then we don't need unity at all. @rschwieb (2017-02-19)
  • @sequence If you are going to use $S$, could you at least define what it is? (2017-02-19)
  • $S$ is a subset of $R$ such that $S$ is a subring of $R$, and $S$ is a set satisfying $M_2(S) = J$. (2017-02-19)
  • @sequence I don't think it's reasonable to assume something like that exists in the course of proving there is an ideal $I$ that does the same thing. What if (conceivably) the four ideals of entries of elements, like the one you defined to be $I$ and the ones for the other positions, don't share some elements? For instance, how do you prove $\left[\begin{smallmatrix}0&0\\0&a_{11}\end{smallmatrix}\right]\in J$? (2017-02-19)
  • Why would we need to prove that this matrix is in $J$? In this proof we only care about matrices where $a_{11}$ is the first entry of a matrix in $J$. (2017-02-19)
  • @sequence We care because *that matrix is in $M_2(I)$*. As are the other two variants. (2017-02-19)
  • You're right. Well, once I've done the proof above, we know that $0\in I$ and $a_{11}\in I$, so the matrix is in $M_2(I)$. @rschwieb (2017-02-19)
  • @sequence Well, let me know when you add it. (Because I don't see anything like that proven now.) (2017-02-19)
  • Perhaps I'm lacking some concepts from ring theory, which I hope to clear up soon. But from what I see, if my proof is correct, then it follows immediately that your matrix is in $J=M_2(I)$, since $0\in I$ and $a_{11}\in I$. @rschwieb (2017-02-19)
  • @sequence `(-‸ლ)` Look. If $J=M_n(I)$, then $\left[\begin{smallmatrix}0&0\\0&i\end{smallmatrix}\right]\in M_n(I)$ for any $i\in I$. How does that follow from what you've written? You only know that $i$ appears in the upper left-hand corner of some matrix in $J$. If you've succeeded in proving the theorem, then this should just be a special case. (2017-02-19)
  • We know that $I$ is the set which contains all of the upper left-hand entries of the matrices. And that entry can be $0$ as well. In fact, $0$ must be included because $0\in I$. (2017-02-19)
  • @sequence I completely agree with that marginally relevant statement, which does not resolve the problem I posed. I renew my query contained in the first three sentences of that comment. (2017-02-19)
  • If you mean your comment containing "I don't think it's reasonable to assume something like that exists in the course of proving there is an ideal $I$ that does the same thing. What if (conceivably) the four ideals of entries of elements, like the one you defined to be $I$ and the ones for the other positions, don't share some elements?", then, unfortunately, I do not understand what it says. (2017-02-19)
  • @sequence No, I mean the comment before my last comment. (Although the objection you are referring to is also not resolved.) (2017-02-19)
  • I don't see how it doesn't follow from what I've written that $\begin{bmatrix} 0 &0 \\ 0&i \end{bmatrix}\in M_n(I)$. Can you please be more specific? (2017-02-20)
  • @sequence Explain, step-by-step, why you think *this particular element* is there. It should just be a specialization of your argument. In the meantime, I inserted a counterexample into the post. (2017-02-20)
  • "Then $\begin{bmatrix}A&A\\A&A'\end{bmatrix}$ is obviously an ideal of $M_2(R)$" Why? $\begin{bmatrix}A&A\\A&A'\end{bmatrix}\begin{bmatrix}R&R\\R&R\end{bmatrix}$ contains entries of the form $AR+A'R$. I see no clue why that must be an ideal. It's also still not clear what exactly you mean in your comment w.r.t. $\begin{bmatrix}0&0\\0&i\end{bmatrix}$. (2017-02-20)
  • @sequence $\begin{bmatrix}A&A\\A&A'\end{bmatrix}$ is obviously an abelian group, and the product of an element from this set and $M_2(R)$ on either side is zero. (Don't forget we assumed the multiplication in $R$ is trivial, i.e. all products are zero.) So it is an ideal. (2017-02-20)
  • @sequence What I mean about $\begin{bmatrix}0&0\\0&i\end{bmatrix}$ is that no reason has been given for it to be in $J$. You keep saying you've proved it, but if you really have, you'll be able to provide details in this one case. (2017-02-20)
  • Do you mean that for any elements $a, b\in R$, $a\cdot b =0$? (2017-02-20)
  • Well, the set $I$ defined in the proof contains all elements $a_{11}$, and it also contains $0$. So $\begin{bmatrix} 0 &0 \\ 0&i \end{bmatrix}\in M_2(I)$. (2017-02-20)
  • You say we can permute a matrix $A$ in $J$ such that $i\in I$ is moved to the first entry. But what if $A$ is not invertible? @rschwieb (2017-02-20)
  • @sequence $A$'s invertibility is irrelevant. What I mean is this: $\begin{bmatrix}0&1\\1&0\end{bmatrix}\begin{bmatrix}0&0\\0&i\end{bmatrix}\begin{bmatrix}0&1\\1&0\end{bmatrix}=\begin{bmatrix}i&0\\0&0\end{bmatrix}$. This is possible when $R$ has an identity. (2017-02-20)
  • @sequence The set $I$ you defined contains all the upper left-hand entries, *but you did not show that all entries of matrices in $J$ are in this set.* The counterexample shows how this can fail. Using matrices with $1$'s in them, you can prove that everything that appears in any position must also appear in the upper left-hand corner. (2017-02-20)
  • Do you mean that $J=M_2(I)$ may also mean that as long as the upper left-hand entry of an element is in $I$, $J$ is defined over the ideal $I$ of $R$, and all the other entries of the element do not have to be in $I$? (I actually thought that if a matrix is defined over an ideal (a ring), then all entries must be in that ring.) And then, if the entry which is in $I$ is not the upper left entry, then we can move it there. But what does moving it there do if it is initially not there anyway, in the original element? I've never been this confused, but I've solved more complicated problems. (2017-02-20)
  • @sequence I can see you're confused. I don't understand what your last comment is asking. What I'm trying to get you to see is that it is possible for an ideal in $M_2(R)$ to not be of the form $M_2(I)$ when $R$ doesn't have an identity. Look: take my example above with $A=\{0\}$ and $A'\neq\{0\}$. Can you see how that is not of the form $M_2(I)$? (2017-02-20)
  • No, I can't see that, because I see that all entries are in $I$. (2017-02-20)
  • I think I've done the proof correctly now, ignoring the hint given by the instructor. Will post it later. (2017-02-20)
  • @sequence Then you do not understand what the set $M_2(I)$ is. (2017-02-21)
  • $M_2(I)$ is the set of matrices all four of whose entries are in $I$. (2017-02-21)
  • @sequence Yes, *all* such matrices. So if $\begin{bmatrix}0&0\\0&x\end{bmatrix}\in M_2(I)$, then $x\in I$, and $\begin{bmatrix}x&0\\0&0\end{bmatrix}\in M_2(I)$ *also*. This is not possible in the example $J=\begin{bmatrix}\{0\}&\{0\}\\\{0\}&A'\end{bmatrix}$. (2017-02-21)
  • I seem to be having trouble understanding some of your notation, such as the last matrix. (2017-02-23)
  • Let us [continue this discussion in chat](http://chat.stackexchange.com/rooms/54178/discussion-between-rschwieb-and-sequence). (2017-02-24)