It is a standard theorem (given in Rudin's Principles of Mathematical Analysis, page 175, and many other places) that if $\{a_{ij}\}$ is a doubly indexed sequence, $\sum_{j=1}^\infty |a_{ij}| = b_i$, and $\sum b_i$ converges, then $\sum_{j=1}^\infty\sum_{i=1}^\infty a_{ij} = \sum_{i=1}^\infty\sum_{j=1}^\infty a_{ij}$.
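Before the proof, here is a quick numerical sanity check of the statement, using a hypothetical example array of my own (not from Rudin): $a_{ij} = (-1)^j/(2^i 3^j)$, for which $b_i = \frac{1}{2}\cdot 2^{-i}$ and $\sum b_i = \frac{1}{2}$ converges.

```python
# Hypothetical example array (my choice, not Rudin's): a_ij = (-1)**j / (2**i * 3**j).
# Then sum_j |a_ij| = (1/2**i) * (1/2) = b_i, and sum_i b_i = 1/2 converges.

def a(i, j):
    return (-1) ** j / (2 ** i * 3 ** j)

N = 60  # truncation level; the discarded tails are geometrically small

# Sum columns first, then rows, and vice versa, over the same N x N block.
sum_i_then_j = sum(sum(a(i, j) for i in range(1, N + 1)) for j in range(1, N + 1))
sum_j_then_i = sum(sum(a(i, j) for j in range(1, N + 1)) for i in range(1, N + 1))

# Both should be close to the exact value (sum_i 2**-i) * (sum_j (-1/3)**j) = -1/4.
print(sum_i_then_j, sum_j_then_i)
```

Of course this only checks a finite truncation, but the tails are geometrically small, so the two orders of summation agree to machine precision.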
Rudin gives a proof relying on standard facts about uniform convergence. However, he also notes that the theorem can be proved directly. Below, I give a "proof" that I have some reservations about.
Proof:
First, we want to prove that for any integer $M$, $\sum_{j=1}^\infty\sum_{i=1}^Ma_{ij} = \sum_{i=1}^M \sum_{j=1}^\infty a_{ij}.$
To prove this, we use the addition rule for series: if $\sum x_i = X$ and $\sum y_i = Y$, then $\sum (x_i+y_i) = X + Y$. (I write $x_i, y_i$ to avoid a clash with the $b_i$ above.) On the right hand side of the equation, we have $M$ sums of the form
$\sum_{j=1}^\infty a_{ij} = c_i$
Using the addition rule $M$ times yields
$\sum_{i=1}^M \sum_{j=1}^\infty a_{ij}= \sum_{i=1}^M c_i =\sum_{j=1}^\infty (a_{1j}+ a_{2j}+\cdots+a_{Mj})=\sum_{j=1}^\infty\sum_{i=1}^Ma_{ij} $
Now, by taking the limit as $M$ goes to infinity above, we have reduced the problem to proving $\lim_{M \to \infty} \sum_{j=1}^\infty \sum_{i=1}^M a_{ij} = \sum_{j=1}^\infty \sum_{i=1}^\infty a_{ij}.$
Consider the series under the limit on the left hand side, for a fixed $M$. It is
$(a_{1,1}+a_{2,1}+\cdots+a_{M,1})+(a_{1,2}+a_{2,2}+\cdots+a_{M,2})+(a_{1,3}+a_{2,3}+\cdots+a_{M,3})+\cdots$.
It is not hard to see, by the absolute convergence condition, that this is equal to $a_{1,1}+a_{2,1}+\cdots+a_{M,1}+a_{1,2}+a_{2,2}+\cdots+a_{M,2}+a_{1,3}+a_{2,3}+\cdots+a_{M,3}+\cdots$ (because the partial sums of the grouped series form a subsequence of the partial sums of this ungrouped series, which converges), and that this new series is absolutely convergent, so we may rearrange the terms at will. Call this series $S_M$.
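The subsequence and rearrangement claims can be illustrated on a finite truncation (a hypothetical array $a_{ij} = 1/(2^i 3^j)$, chosen purely for illustration):

```python
import itertools
import random

# M rows of a hypothetical array a_ij = 1/(2**i * 3**j), truncated to T columns
# so we can work with finite lists.
M, T = 3, 12
rows = [[1.0 / (2 ** i * 3 ** j) for j in range(1, T + 1)] for i in range(1, M + 1)]

# Ungrouped, column by column: a_11, a_21, a_31, a_12, a_22, a_32, ...
flat = [rows[i][j] for j in range(T) for i in range(M)]

# Partial sums of the ungrouped series:
partials = list(itertools.accumulate(flat))

# Partial sums of the grouped series (one group = one column of M terms).
# They appear as every M-th entry of `partials`, i.e. a subsequence.
grouped = list(itertools.accumulate(sum(rows[i][j] for i in range(M)) for j in range(T)))
assert all(abs(grouped[j] - partials[(j + 1) * M - 1]) < 1e-12 for j in range(T))

# Absolute convergence lets us rearrange at will: a shuffled order gives the same sum.
shuffled = flat[:]
random.shuffle(shuffled)
print(abs(sum(shuffled) - sum(flat)) < 1e-12)
```

For a finite truncation the rearrangement invariance is trivial, but the subsequence relation between the two lists of partial sums is exactly the one the proof uses.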
The right hand side of the equation is: $ (a_{1,1}+a_{1,2}+a_{1,3}+a_{1,4}+\cdots)+ (a_{2,1}+a_{2,2}+a_{2,3}+a_{2,4}+\cdots)+ (a_{3,1}+a_{3,2}+$ $a_{3,3}+a_{3,4}+\cdots)+\cdots$ ($=\sum b_i$).
Fix $\epsilon>0$. We can find $N$ such that $\sum\limits_{i=N+1}^{\infty} b_i < \epsilon$. We can also find $J$ such that $\sum\limits_{j=J+1}^{\infty} |a_{ij}| < \epsilon/N$ for every $i\le N$. For any $K$ and any $M \ge N$, we can rearrange $S_M$ to be of the form $(a_{1,1}+a_{1,2}+a_{1,3}+\cdots+a_{1,K})+ (a_{2,1}+a_{2,2}+a_{2,3}+\cdots+a_{2,K})+ \cdots$ $+(a_{N,1}+a_{N,2}+a_{N,3}+\cdots+a_{N,K})+$ [a convergent tail].
Rearrange $S_M$ to be of this form, with $K>J$ and $K$ large enough that the convergent tail has absolute sum less than $\epsilon$. Subtracting this rearrangement from the right hand side gives the result, because the difference can be estimated in terms of $\epsilon$, which we can make as small as we desire. The first $N$ rows each have their first $K$ terms taken off, and the remaining sum in each such row is less than $\epsilon/N$, so together the $N$ row tails have sum less than $\epsilon$. By the way $N$ was chosen, the remaining rows constitute a tail of the $b_i$ with sum less than $\epsilon$. Finally, the convergent tail that is being subtracted has absolute sum less than $\epsilon$. Using the triangle inequality shows the difference is less than $3\epsilon$, so the partial sums of the left hand side converge to the right hand side, and we are done.
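The $\epsilon$, $N$, $J$ bookkeeping in this estimate can be traced concretely (hypothetical array $a_{ij} = 1/(2^i 3^j)$ again, so $b_i = \frac{1}{2}\cdot 2^{-i}$ and the full double sum is exactly $1/2$):

```python
def a(i, j):
    # hypothetical array with positive terms, for simplicity
    return 1.0 / (2 ** i * 3 ** j)

eps = 1e-3

# b_i = sum_j a_ij = (1/2**i) * (1/2), so the tail past row N is 0.5 * 2**-N;
# choose N with that tail < eps.
N = 1
while 0.5 * 2 ** -N >= eps:
    N += 1

# Row-i tail past column J is (1/2**i) * 0.5 * 3**-J; the worst case is i = 1,
# so choose J with 0.25 * 3**-J < eps / N.
J = 1
while 0.25 * 3 ** -J >= eps / N:
    J += 1

# The N x J block of terms kept by the rearrangement:
block = sum(a(i, j) for i in range(1, N + 1) for j in range(1, J + 1))

# Exact double sum: (sum_i 1/2**i) * (sum_j 1/3**j) = 1 * 1/2.
total = 0.5
print(N, J, abs(total - block) < 3 * eps)
```

The discarded mass splits just as in the argument: the $N$ row tails contribute less than $\epsilon$ in total, and the rows beyond $N$ contribute less than $\epsilon$, so the block is within $3\epsilon$ of the full sum.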
Any comments or suggestions are welcome - I'd like to eventually produce the direct proof Rudin alludes to.