There is no approximation argument (that I know of) that allows one to deduce the general case from the $L^2$ case. Rather, one should go back to the very definition of conditional expectation and note that $A\leqslant B$ almost surely if and only if the events
$$C_x=[A>x\geqslant B]$$
have probability zero for every real number $x$, or merely for sufficiently many values of $x$. Here is a proof along these lines.
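To see that sufficiently many values of $x$ indeed suffice, note that the rational values already do: whenever $A(\omega)>B(\omega)$, there is a rational number $x$ with $B(\omega)\leqslant x<A(\omega)$, hence
$$[A>B]=\bigcup_{x\in\mathbb Q}[A>x\geqslant B]=\bigcup_{x\in\mathbb Q}C_x,$$
and a countable union of negligible events is negligible. This is the decomposition used at the end of the proof.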
First recall that $\mathrm E(A|B)=B$ means that $\mathrm E(Au(B))=\mathrm E(Bu(B))$ for (at least) every bounded measurable function $u$. Similarly, $\mathrm E(B|A)=A$ means that $\mathrm E(Bv(A))=\mathrm E(Av(A))$ for (at least) every bounded measurable function $v$. In particular, choosing $u=\mathbf 1_{(-\infty,x]}$ and $v=\mathbf 1_{(x,+\infty)}$, one gets, for every real number $x$,
$$\mathrm E(A;B\leqslant x)=\mathrm E(B;B\leqslant x),\qquad \mathrm E(B;A>x)=\mathrm E(A;A>x).$$
Next, introduce the events
$$D_x=[A>x,B>x],\qquad F_x=[A\leqslant x,B\leqslant x].$$
Then $[B\leqslant x]=C_x\cup F_x$ and $[A>x]=C_x\cup D_x$, and both unions are disjoint, hence
$$\mathrm E(A-B;C_x)+\mathrm E(A-B;F_x)=0=\mathrm E(A-B;C_x)+\mathrm E(A-B;D_x).$$
Summing these two equalities and using the fact that $\mathrm E(A-B;C_x)\geqslant0$ because $A-B>0$ on $C_x$, one gets
$$\mathrm E(A-B;D_x\cup F_x)\leqslant0.$$
The hypothesis we started from is symmetric in $(A,B)$, hence the inequality above also holds with $(A,B)$ replaced by $(B,A)$. Then $A-B$ becomes $B-A$ while $D_x$ and $F_x$ are unchanged. This proves that $\mathrm E(B-A;D_x\cup F_x)\leqslant0$, hence $\mathrm E(A-B;D_x\cup F_x)=0$, which, plugged back into the sum of the two equalities above, implies
$$\mathrm E(A-B;C_x)=0.$$
But $A-B>0$ everywhere on $C_x$, hence $(A-B)\mathbf{1}_{C_x}$ is a nonnegative random variable with zero expectation, and this forces the event $C_x$ to have probability zero.
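For completeness, here is the last step spelled out. Set $X=(A-B)\mathbf 1_{C_x}$; then $X\geqslant0$, $\mathrm E(X)=0$, and $X>0$ exactly on $C_x$. By Markov's inequality, for every $\varepsilon>0$,
$$\mathrm P(C_x\cap[A-B\geqslant\varepsilon])\leqslant\mathrm P(X\geqslant\varepsilon)\leqslant\frac{\mathrm E(X)}{\varepsilon}=0,$$
and letting $\varepsilon\to0$ along the sequence $\varepsilon=1/n$ yields $\mathrm P(C_x)=0$.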
Now, the event $[A>B]$ is the countable union of the events $C_x$ over the rational numbers $x$, hence $[A>B]$ has probability zero. By symmetry, the event $[A<B]$ has probability zero as well, hence $A=B$ almost surely.
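For the final symmetry step, exchanging the roles of $A$ and $B$ in the whole argument replaces $C_x$ by $C'_x=[B>x\geqslant A]$ and shows that $\mathrm P(C'_x)=0$ for every rational $x$, hence
$$\mathrm P(A<B)=\mathrm P\Big(\bigcup_{x\in\mathbb Q}C'_x\Big)=0.$$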