First approach: if $\{\lambda_1,\dots,\lambda_m\}$ are the eigenvalues of $A$ and $\{\mu_1,\dots,\mu_n\}$ those of $B$, then the eigenvalues of $A\otimes B$ are the products $\lambda_j\mu_k$, $1\leq j\leq m$, $1\leq k\leq n$.
Assume $A$ is $m\times m$ and $B$ is $n\times n$. If $v$ is an eigenvector of $A$ for $\lambda_j$ and $w$ an eigenvector of $B$ for $\mu_k$, consider the vector $V=v\otimes w$ of size $mn$, defined by $V=(v_1w_1,\dots,v_1w_n,v_2w_1,\dots,v_2w_n,\dots,v_mw_1,\dots,v_mw_n).$ Then $(A\otimes B)V=(Av)\otimes(Bw)=\lambda_j\mu_k\,V$, so $V$ is an eigenvector of $A\otimes B$ for the eigenvalue $\lambda_j\mu_k$. Since $A$ and $B$ are diagonalizable, these products account for all $mn$ eigenvalues of $A\otimes B$ counted with multiplicity, so there are no other eigenvalues.
As $A$ and $B$ are positive definite, $\lambda_j\mu_k>0$ for all $j,k$; since $A\otimes B$ is also symmetric ($(A\otimes B)^t=A^t\otimes B^t=A\otimes B$), it is positive definite.
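As a quick numerical sanity check (not part of the argument), here is a small NumPy sketch comparing the eigenvalues of $A\otimes B$ with the pairwise products $\lambda_j\mu_k$ for randomly generated positive definite matrices; the helper `random_spd` is just an arbitrary way to build test matrices.

```python
import numpy as np

# Check numerically that the eigenvalues of A ⊗ B are exactly the
# pairwise products lambda_j * mu_k, and that they are all positive.
rng = np.random.default_rng(0)

def random_spd(n):
    # M^T M + I is symmetric positive definite for any real square M
    M = rng.standard_normal((n, n))
    return M.T @ M + np.eye(n)

A, B = random_spd(3), random_spd(4)
lam = np.linalg.eigvalsh(A)   # eigenvalues of A
mu = np.linalg.eigvalsh(B)    # eigenvalues of B

products = np.sort(np.outer(lam, mu).ravel())
kron_eigs = np.sort(np.linalg.eigvalsh(np.kron(A, B)))

assert np.allclose(products, kron_eigs)  # same spectrum, with multiplicity
assert np.all(kron_eigs > 0)             # A ⊗ B is positive definite
```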
Second approach: we use the mixed-product property, $(A_1A_2)\otimes (B_1B_2)=(A_1\otimes B_1)(A_2\otimes B_2).$ By the spectral theorem, write $A=P_1^tD_1P_1$ and $B=P_2^tD_2P_2$ with $P_i$ orthogonal and $D_i$ diagonal, so that $A\otimes B=(P_1^tD_1P_1)\otimes (P_2^tD_2P_2).$ Applying the mixed-product property twice, and using $P_1^t\otimes P_2^t=(P_1\otimes P_2)^t$, we get $A\otimes B=(P_1\otimes P_2)^t(D_1\otimes D_2)(P_1\otimes P_2),$ where $P_1\otimes P_2$ is again orthogonal. The problem thus reduces to the case where $A$ and $B$ are diagonal, which is easy: $D_1\otimes D_2$ is diagonal with positive entries, hence positive definite.
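One can likewise check the factorization $A\otimes B=(P_1\otimes P_2)^t(D_1\otimes D_2)(P_1\otimes P_2)$ numerically. In the sketch below, the orthogonal factors come from `numpy.linalg.eigh`, and the particular test matrices are an arbitrary choice for illustration.

```python
import numpy as np

# Illustrate the second approach: with A = P1^T D1 P1 and B = P2^T D2 P2
# (P_i orthogonal, D_i diagonal), the mixed-product property gives
# A ⊗ B = (P1 ⊗ P2)^T (D1 ⊗ D2) (P1 ⊗ P2).
rng = np.random.default_rng(1)
M1, M2 = rng.standard_normal((3, 3)), rng.standard_normal((4, 4))
A = M1.T @ M1 + np.eye(3)   # symmetric positive definite test matrices
B = M2.T @ M2 + np.eye(4)

w1, V1 = np.linalg.eigh(A)  # A = V1 @ diag(w1) @ V1.T, so take P1 = V1.T
w2, V2 = np.linalg.eigh(B)
P1, D1 = V1.T, np.diag(w1)
P2, D2 = V2.T, np.diag(w2)

P = np.kron(P1, P2)         # P1 ⊗ P2 is again orthogonal
D = np.kron(D1, D2)         # D1 ⊗ D2 is diagonal with positive entries

assert np.allclose(P.T @ P, np.eye(P.shape[0]))  # orthogonality of P1 ⊗ P2
assert np.allclose(P.T @ D @ P, np.kron(A, B))   # the claimed factorization
assert np.all(np.diag(D) > 0)                    # positive diagonal entries
```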