Here is a useful observation. For any $k$, let $I_k$ denote the $k \times k$ identity matrix.
Fixing a positive integer $n$, you can check that the $(2n) \times (2n)$ matrix $ C = \frac{1}{\sqrt{2}} \begin{pmatrix} I_n & -I_n \\ I_n & I_n \end{pmatrix} $ satisfies $C^T C = CC^T = I_{2n}$ (here $C^T$ denotes the transpose of $C$). In other words, $C^T = C^{-1}$; that is, $C$ is an orthogonal (in particular, unitary) matrix.
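If you would rather see this numerically than verify it by hand, here is a minimal NumPy sketch (the choice $n = 3$ and the variable names are just for illustration):

```python
import numpy as np

# Build C for a sample n and check C^T C = C C^T = I_{2n} numerically.
n = 3
I = np.eye(n)
C = np.block([[I, -I], [I, I]]) / np.sqrt(2)

assert np.allclose(C.T @ C, np.eye(2 * n))
assert np.allclose(C @ C.T, np.eye(2 * n))
```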
Moreover, you can check that for any $n \times n$ matrices $A$ and $B$, you have $ C^T \begin{pmatrix} A & B \\ B & A \end{pmatrix} C = \begin{pmatrix} A + B & 0 \\ 0 & A - B \end{pmatrix} $ (where the $0$s on the right-hand side denote the $n \times n$ zero matrices).
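Again, a quick numerical sanity check, offered only as a sketch (the random $A$ and $B$ and the seed are arbitrary choices):

```python
import numpy as np

# Check C^T [[A, B], [B, A]] C = diag(A + B, A - B) for random A, B.
rng = np.random.default_rng(0)
n = 3
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))

I = np.eye(n)
C = np.block([[I, -I], [I, I]]) / np.sqrt(2)
M = np.block([[A, B], [B, A]])
Z = np.zeros((n, n))

assert np.allclose(C.T @ M @ C, np.block([[A + B, Z], [Z, A - B]]))
```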
This shows that the block matrices $\begin{pmatrix} A & B \\ B & A \end{pmatrix}$ and $\begin{pmatrix} A + B & 0 \\ 0 & A - B \end{pmatrix}$ are similar. Similar matrices have the same characteristic polynomials, and in particular, the same eigenvalues, with the same algebraic multiplicities. (The matrix $C$ implementing the similarity even allows you to pass back and forth between eigenvectors of one matrix and eigenvectors of the other, so even the geometric multiplicities of the eigenvalues coincide.)
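To make the parenthetical remark concrete: writing $M$ for the block matrix and $D$ for its block-diagonal form, we have $M = CDC^T$, so if $Dw = \lambda w$ then $M(Cw) = \lambda(Cw)$. A small sketch of that, with arbitrary random data:

```python
import numpy as np

# If D w = lam * w and M = C D C^T with C orthogonal, then M (C w) = lam * (C w).
rng = np.random.default_rng(1)
n = 3
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))

I = np.eye(n)
C = np.block([[I, -I], [I, I]]) / np.sqrt(2)
M = np.block([[A, B], [B, A]])
D = C.T @ M @ C  # numerically block diag(A + B, A - B)

lam_all, w_all = np.linalg.eig(D)
lam, w = lam_all[0], w_all[:, 0]
assert np.allclose(M @ (C @ w), lam * (C @ w))
```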
It should be clear that the eigenvalue list of the block matrix $\begin{pmatrix} A + B & 0 \\ 0 & A - B \end{pmatrix}$ consists of the eigenvalue list of $A + B$, together with the eigenvalue list of $A - B$. This explains why in the special case $B = I$ the eigenvalues of your matrix are the eigenvalues of $A \pm I$, which are, of course, the eigenvalues of $A$, plus or minus $1$.
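In code, the $B = I$ special case looks like this (again just a sketch with an arbitrary random $A$; the comparison is up to ordering and rounding):

```python
import numpy as np

# For B = I, the eigenvalues of [[A, I], [I, A]] are eig(A) + 1 and eig(A) - 1.
rng = np.random.default_rng(2)
n = 3
A = rng.standard_normal((n, n))
M = np.block([[A, np.eye(n)], [np.eye(n), A]])

eig_A = np.linalg.eigvals(A)
eig_M = np.linalg.eigvals(M)
expected = np.concatenate([eig_A + 1, eig_A - 1])

assert np.allclose(np.sort_complex(eig_M), np.sort_complex(expected))
```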
For more general $A$ and $B$, of course, the eigenvalues of $A \pm B$ are not obtained simply by adding (or subtracting) eigenvalues of $B$ to (or from) the eigenvalues of $A$. (As an example, if $A = \begin{pmatrix} 1 & 1 \\ 1 & 1 \end{pmatrix}$ and $B = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}$, then the eigenvalues of $A$ are $0,2$, and $0$ is the only eigenvalue of $B$, but the eigenvalues of $A+B$ are $1 \pm \sqrt{2}$, and $1$ is the only eigenvalue of $A - B$.) But the computation above still shows that the problem of determining the eigenstuff of any $(2n) \times (2n)$ matrix of the form $\begin{pmatrix} A & B \\ B & A \end{pmatrix}$ reduces to determining the eigenstuff of the two $n \times n$ matrices $A + B$ and $A-B$.
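Here is the same counterexample checked numerically, together with the reduction applied to the corresponding $4 \times 4$ block matrix (printed values are approximate):

```python
import numpy as np

A = np.array([[1.0, 1.0], [1.0, 1.0]])
B = np.array([[0.0, 1.0], [0.0, 0.0]])

print(np.linalg.eigvals(A))      # {0, 2}
print(np.linalg.eigvals(B))      # {0, 0}
print(np.linalg.eigvals(A + B))  # approximately 1 +/- sqrt(2)
print(np.linalg.eigvals(A - B))  # {1, 1}

# The eigenvalues of the 4 x 4 block matrix are those of A + B and A - B together.
M = np.block([[A, B], [B, A]])
print(np.sort_complex(np.linalg.eigvals(M)))
```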