I explain a combinatorial proof of these identities in this blog post and this one; I'll sketch the second proof. First, these are really identities of symmetric functions in the eigenvalues: the first says that
$\sum_{n \ge 0} h_n t^n = \exp \left( \sum_{n \ge 1} \frac{p_n}{n} t^n \right)$
where $h_n$ are the complete homogeneous symmetric polynomials and $p_n$ the power sum symmetric functions in the eigenvalues of $U$. The second says that
$\sum_{n \ge 0} e_n t^n = \exp \left( \sum_{n \ge 1} (-1)^{n+1} \frac{p_n}{n} t^n \right)$
where $e_n$ are the elementary symmetric functions in the eigenvalues. It is not hard to see that $\left( \sum h_n t^n \right) \left( \sum (-1)^n e_n t^n \right) = 1$, so (after substituting $-t$ for $t$ in the second identity) these identities are equivalent; we will thus concentrate on the first one, which has no signs.
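As a quick sanity check (not part of the argument), both identities can be verified symbolically to low order. Here is a sketch with sympy in three variables, truncated at $t^4$; the number of variables and the truncation order are arbitrary choices.

```python
# Symbolic check of both identities in three variables, truncated at t^4;
# x1, x2, x3 stand for the eigenvalues of U.
from itertools import combinations, combinations_with_replacement

import sympy as sp

t = sp.symbols('t')
xs = sp.symbols('x1 x2 x3')
N = 4  # truncation order, chosen arbitrarily

# h_n: sum of all degree-n monomials; e_n: sum of squarefree degree-n monomials;
# p_n: sum of n-th powers
h = [sum(sp.Mul(*c) for c in combinations_with_replacement(xs, n)) for n in range(N + 1)]
e = [sum(sp.Mul(*c) for c in combinations(xs, n)) for n in range(N + 1)]
p = [sum(x**n for x in xs) for n in range(N + 1)]

rhs_h = sp.exp(sum(p[n] * t**n / n for n in range(1, N + 1)))
rhs_e = sp.exp(sum((-1)**(n + 1) * p[n] * t**n / n for n in range(1, N + 1)))

exp_h = sp.expand(sp.series(rhs_h, t, 0, N + 1).removeO())
exp_e = sp.expand(sp.series(rhs_e, t, 0, N + 1).removeO())

for n in range(N + 1):
    assert sp.expand(exp_h.coeff(t, n) - h[n]) == 0
    assert sp.expand(exp_e.coeff(t, n) - e[n]) == 0
print("both identities hold up to order", N)
```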
Here is the standard proof, which I find unsatisfactory (analytic details are omitted, since this is an equality of formal power series anyway). Consider the operator $\frac{1}{1 - Ut} = \sum_{n \ge 0} U^n t^n.$ A standard matrix identity asserts that $\det \exp M = \exp \text{tr } M$ for any matrix $M$. Applying this identity to the logarithm of the above, we conclude that
$\det \frac{1}{1 - Ut} = \exp \text{tr } \log \frac{1}{1 - Ut} = \exp \text{tr} \left( \sum_{n \ge 1} \frac{U^n}{n} t^n \right) = \exp \left( \sum_{n \ge 1} \frac{p_n}{n} t^n \right),$
using the expansion $\log \frac{1}{1 - Ut} = \sum_{n \ge 1} \frac{U^n}{n} t^n$ and the fact that $\text{tr } U^n = p_n$.
On the other hand, $\det \frac{1}{1 - Ut} = \frac{1}{\prod (1 - \lambda_i t)}$ where the $\lambda_i$ are the eigenvalues of $U$, and expanding each factor as a geometric series shows that this is precisely $\sum h_n t^n$.
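Both steps of this computation are easy to test numerically. Here is a minimal sketch using numpy and scipy, checking $\det \exp M = \exp \text{tr } M$ on a random matrix and then the resulting series identity at a small value of $t$; the matrix, the value of $t$, and the truncation order are arbitrary choices.

```python
# Numerical sanity checks for the standard proof (not part of the argument).
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)

# 1. det(exp(M)) = exp(tr(M)) for a random matrix M
M = rng.standard_normal((5, 5))
print(np.isclose(np.linalg.det(expm(M)), np.exp(np.trace(M))))  # True

# 2. det(1/(1 - Ut)) = exp(sum_{n >= 1} tr(U^n) t^n / n), tested at a small t
U = rng.standard_normal((5, 5))
t = 0.05   # small enough that the logarithm series converges quickly
N = 200    # truncation order for the sum over n
lhs = 1.0 / np.linalg.det(np.eye(5) - t * U)
rhs = np.exp(sum(np.trace(np.linalg.matrix_power(U, n)) * t**n / n
                 for n in range(1, N + 1)))
print(lhs, rhs, np.isclose(lhs, rhs))  # agree up to floating-point error
```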
Now here is a sketch of the combinatorial proof. First we assume that $U$ is the adjacency matrix of a finite graph $G$. Then $\text{tr } U^n$ is the number of closed walks of length $n$ on $G$. On the other hand, $\text{tr } \text{Sym}^n(U)$ describes something more complicated: the graph $G$ has a symmetric product $\text{Sym}^n(G)$ whose vertices are the unordered $n$-tuples of vertices of $G$ and whose edges are chosen so that $\text{Sym}^n(U)$ is its adjacency matrix.
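Before going further, here is a brute-force check of the first claim, that $\text{tr } U^n$ counts closed walks of length $n$; the graph (a 4-cycle) is an arbitrary small example.

```python
# Brute-force check that tr(A^n) counts closed walks of length n, on a small
# example graph (the 4-cycle on vertices 0, 1, 2, 3; any small graph works).
from itertools import product

import numpy as np

A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]])
V = range(len(A))

def closed_walks(n):
    """Count sequences v_0 -> v_1 -> ... -> v_{n-1} -> v_0 in which every step is an edge."""
    count = 0
    for walk in product(V, repeat=n):                 # walk = (v_0, ..., v_{n-1})
        steps = zip(walk, walk[1:] + (walk[0],))      # wrap around to close the walk
        count += all(A[i, j] for i, j in steps)
    return count

for n in range(1, 7):
    assert closed_walks(n) == np.trace(np.linalg.matrix_power(A, n))
print("tr(A^n) equals the closed-walk count for n = 1, ..., 6")
```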
Now $\text{tr } \text{Sym}^n(U)$ is the number of closed walks of length $1$ on the symmetric product; that is, the number of loops. A loop on $\text{Sym}^n(G)$ is (roughly speaking) an unordered collection of closed walks on $G$ with total length $n$. This turns out to imply, via the well-known exponential formula in combinatorics, the desired identity.
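To see the exponential formula's bookkeeping explicitly, one can expand the right-hand side with the $p_k$ treated as formal variables: each monomial in the coefficient of $t^n$ corresponds to a partition of $n$, i.e. to the multiset of lengths of an unordered collection of closed walks of total length $n$. Here is a small sympy sketch (the truncation order is an arbitrary choice).

```python
# Expand exp(sum_k p_k t^k / k) with the p_k as formal variables; the
# coefficient of t^n is h_n written in the power sums, one monomial per
# partition of n (per multiset of walk lengths).
import sympy as sp

N = 4  # expand up to t^4, an arbitrary choice
t = sp.symbols('t')
p = sp.symbols('p1:%d' % (N + 1))  # p1, ..., p4; p[k-1] stands for tr(U^k)

expo = sum(p[k - 1] * t**k / k for k in range(1, N + 1))
expansion = sp.expand(sp.series(sp.exp(expo), t, 0, N + 1).removeO())

for n in range(1, N + 1):
    print(n, expansion.coeff(t, n))
# e.g. the coefficient of t^2 is p1**2/2 + p2/2, and the coefficient of t^3 is
# p1**3/6 + p1*p2/2 + p3/3: the three terms match the three partitions of 3,
# i.e. the three ways to split a total length of 3 among closed walks.
```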
Now, the coefficient of each $t^n$ on both sides is a polynomial in the entries of $U$, and the adjacency matrices of finite graphs (allowing loops, multiple edges, and directed edges, i.e. all matrices with non-negative integer entries) are Zariski dense in the space of all $n \times n$ matrices, so the result follows for all matrices.
This identity has hidden depths. For a finite graph $G$ with adjacency matrix $A$, the function $\frac{1}{\det(I - At)}$ is a kind of zeta function for $G$, and the argument above is closely related to the Euler product of this zeta function.
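One way to make this concrete: $\frac{1}{\det(I - At)}$ is the zeta function, in the Artin-Mazur / Bowen-Lanford sense, of the shift defined by $A$, and its Euler product runs over primitive closed walks taken up to rotation. Here is a sketch of a term-by-term check with sympy; the graph (a triangle) and the truncation order are arbitrary choices.

```python
# Check of the Euler product for zeta_G(t) = 1/det(I - A t): the product runs
# over primitive closed walks taken up to rotation ("prime" walks), with one
# factor 1/(1 - t^length) per such walk.
import sympy as sp

A = sp.Matrix([[0, 1, 1],
               [1, 0, 1],
               [1, 1, 0]])  # the triangle graph; any finite graph works
t = sp.symbols('t')
N = 8  # truncation order, chosen arbitrarily

def mobius(n):
    """Moebius function, computed from the prime factorization."""
    factors = sp.factorint(n)
    return 0 if any(e > 1 for e in factors.values()) else (-1) ** len(factors)

# p[n] = tr(A^n) = number of closed walks of length n
p = {n: (A**n).trace() for n in range(1, N + 1)}

# Moebius inversion gives the number of *primitive* closed walks of length n
# (those that are not a repetition of a shorter walk); dividing by n counts
# them up to rotation.
orbits = {n: sum(mobius(n // d) * p[d] for d in sp.divisors(n)) // n
          for n in range(1, N + 1)}

zeta = 1 / (sp.eye(3) - t * A).det()
euler = sp.Integer(1)
for n in range(1, N + 1):
    euler *= (1 - t**n) ** (-orbits[n])

print(sp.series(zeta - euler, t, 0, N + 1))  # O(t**9): they agree up to t^8
```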