The trace of a square matrix equals the sum of its eigenvalues (in the algebraic closure of the ground field, if necessary; i.e., the sum of the roots of the characteristic polynomial), and the determinant equals the product of the eigenvalues. This often gives a nice way of finding at least some of the eigenvalues, and in the case of a $2\times 2$ matrix it gives all the information required to find all the eigenvalues (since knowing $a+b$ and $ab$ determines $a$ and $b$).
Here you have a $2\times 2$ matrix, so it has two eigenvalues; their sum is $2$ and their product is $-3$. Thus, they are $3$ and $-1$.
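As a quick numerical sanity check of this trace/determinant argument (a sketch, not part of the original solution): the eigenvalues are the roots of $t^2 - (\operatorname{tr})t + \det = t^2 - 2t - 3$.

```python
import numpy as np

# Characteristic polynomial of a 2x2 matrix with trace 2 and
# determinant -3:  t^2 - 2t - 3.  Its roots are the eigenvalues.
eigenvalues = np.roots([1.0, -2.0, -3.0])  # coefficients of t^2 - 2t - 3
print(sorted(eigenvalues))  # the roots are -1 and 3
```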
In particular, $A$ is invertible, since no eigenvalue is equal to $0$.
Now let's consider $T$. Note that $T$ is one-to-one, because if $T(B)=0$, then $AB=0$. But since $A$ is invertible, this means that $B=A^{-1}AB=A^{-1}0= 0$.
Thus, $T$ is one-to-one on a finite dimensional vector space, so $T$ is invertible. This proves that (3) is true.
Now, notice that if $AB=\lambda B$, then $A\mathbf{b}_i=\lambda\mathbf{b}_i$, where $\mathbf{b}_i$ is the $i$th column of $B$, since $AB = A(\mathbf{b}_1\;\mathbf{b}_2) = (A\mathbf{b}_1\;A\mathbf{b}_2).$ In particular, if $B$ is an eigenvector of $T$, then $B\neq 0$, so either $\mathbf{b}_1$ or $\mathbf{b}_2$ is nonzero, and then $A\mathbf{b}_1=\lambda\mathbf{b}_1$ or $A\mathbf{b}_2=\lambda\mathbf{b}_2$ shows that $\lambda$ is an eigenvalue of $A$. Conversely, if both $\mathbf{b}_1$ and $\mathbf{b}_2$ lie in the eigenspace of $\lambda$ for $A$, and they are not both zero, then $B=(\mathbf{b}_1\;\mathbf{b}_2)$ is an eigenvector of $T$ associated to $\lambda$. That means that the eigenvalues of $T$ are exactly the eigenvalues of $A$. This proves that both (2) and (4) are false, since neither $2$ nor $1$ is an eigenvalue of $A$, so neither is an eigenvalue of $T$.
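One can also check this numerically. Under column-stacking vectorization, $T(B)=AB$ corresponds to the $4\times 4$ matrix $I_2\otimes A$ acting on $\operatorname{vec}(B)$. The specific $A$ below is a hypothetical choice with trace $2$ and determinant $-3$ (any such matrix works):

```python
import numpy as np

# Hypothetical example matrix with trace 2 and determinant -3,
# hence eigenvalues 3 and -1.
A = np.array([[2.0, 3.0],
              [1.0, 0.0]])

# Matrix of T(B) = AB under column-stacking vectorization: I_2 kron A.
T = np.kron(np.eye(2), A)

print(sorted(np.linalg.eigvals(A)))  # eigenvalues of A: -1 and 3
print(sorted(np.linalg.eigvals(T)))  # eigenvalues of T: -1, -1, 3, 3
print(np.linalg.det(T))              # det(T) = det(A)^2 = 9, nonzero, so T is invertible
```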
The only thing left is whether $T$ is diagonalizable.
As noted above, a matrix $B$ is an eigenvector of $T$ associated to $\lambda$ if and only if $B\neq 0$ and both columns of $B$ lie in the eigenspace of $A$ associated to $\lambda$. Since the eigenspaces of $A$ are one-dimensional, we can select eigenvectors $\mathbf{v}_1$ and $\mathbf{v}_2$ of $A$ associated to $3$ and $-1$, respectively. Then $B$ is an eigenvector of $T$ associated to $3$ if and only if $B=(\alpha\mathbf{v}_1\;\beta\mathbf{v}_1)$ with $\alpha$ and $\beta$ not both zero. Thus, we have two degrees of freedom, so the eigenspace of $T$ associated to $3$ has dimension $2$. Similarly, the eigenvectors of $T$ associated to $-1$ are of the form $B=(\rho\mathbf{v}_2\;\sigma\mathbf{v}_2)$ with $\rho$ and $\sigma$ arbitrary but not both zero; again, the dimension is $2$.
Since the sum of the dimensions of the eigenspaces of $T$ is $2+2=4$, which is the dimension of the vector space $M_2(\mathbb{R})$, this proves that $T$ is diagonalizable. So (1) is true.
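The two-dimensionality of each eigenspace can be verified by computing nullities, again using a hypothetical $A$ with trace $2$ and determinant $-3$ and representing $T$ as $I_2\otimes A$:

```python
import numpy as np

# Hypothetical example matrix with trace 2 and determinant -3
# (eigenvalues 3 and -1).
A = np.array([[2.0, 3.0],
              [1.0, 0.0]])
T = np.kron(np.eye(2), A)  # matrix of B -> AB under column-stacking

# Geometric multiplicity of lam is dim ker(T - lam*I) = 4 - rank(T - lam*I).
for lam in (3.0, -1.0):
    nullity = 4 - np.linalg.matrix_rank(T - lam * np.eye(4))
    print(lam, nullity)  # each eigenspace of T is 2-dimensional

# 2 + 2 = 4 = dim M_2(R), so T is diagonalizable.
```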
In summary, (1) and (3) are true, (2) and (4) are false.