(This is an improved version of the question *What about other symmetric functions of the eigenvalues?*)
Let $A$ be an $n \times n$ matrix with eigenvalues $\lambda_1, \dots, \lambda_n$. Then $\det(A) = \lambda_1 \cdots \lambda_n$ and $\operatorname{tr}(A) = \lambda_1 + \dots + \lambda_n$. Now let $i_k(A) = e_k(\lambda_1, \dots, \lambda_n)$, where $e_k$ is the $k$th elementary symmetric function, so that $\det = i_n$ and $\operatorname{tr} = i_1$.
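As a quick numerical sanity check (a sketch, not part of the question proper): the $i_k(A)$ can be read off from the characteristic polynomial, since `np.poly(A)` returns the coefficients of $\det(tI - A) = t^n - e_1 t^{n-1} + e_2 t^{n-2} - \dots$, so $i_k$ is $(-1)^k$ times the $k$th coefficient. The helper name `i_k` below is my own.

```python
import numpy as np

def i_k(A, k):
    # np.poly(A) gives [1, -e_1, e_2, -e_3, ...], the coefficients
    # of det(tI - A); flip the sign pattern to recover e_k.
    coeffs = np.poly(A)
    return (-1) ** k * coeffs[k]

A = np.diag([2.0, 3.0, 5.0])
print(i_k(A, 1))  # tr(A)  = 10
print(i_k(A, 3))  # det(A) = 30
```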
It follows that $\det(I + tA) = \sum_{k=0}^n i_k(A)\,t^k$. Using this and the identity $(I+tA)(I+tB) = I+t(A+B+tAB)$, one can slowly grind out identities like $i_2(A+B) = i_2(A) + i_2(B) + i_1(A)i_1(B) - i_1(AB)$. The next simplest identity expresses $i_3(A+B)$ in terms of the $i_1$ and $i_2$ of $A$, $B$, $ABA$, and $BAB$.
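The $i_2$ identity is easy to verify numerically, using Newton's identity $i_2(M) = \bigl(\operatorname{tr}(M)^2 - \operatorname{tr}(M^2)\bigr)/2$; the following sketch checks it on random matrices:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))

def i1(M):
    return np.trace(M)

def i2(M):
    # Newton's identity for k = 2: e_2 = (tr(M)^2 - tr(M^2)) / 2
    return (np.trace(M) ** 2 - np.trace(M @ M)) / 2

lhs = i2(A + B)
rhs = i2(A) + i2(B) + i1(A) * i1(B) - i1(A @ B)
print(np.isclose(lhs, rhs))  # True
```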
Just the fact that these identities exist implies that knowing the eigenvalues of $A, B, ABA, BAB, \dots$ (meaning the matrices that occur in the formulas for the higher $i_k$) determines the eigenvalues of $A+B$.
My question is two-fold:
(i) Is there a 'nice' description of the analogous formula for $i_k(A+B)$?
(ii) Is there a more exact statement along the lines of "knowing the eigenvalues of this set of matrices is equivalent to knowing the eigenvalues of that set of matrices"?