If $A$ and $B$ are two real $2\times 2$ matrices with $\det A = 0$, $\det B = 0$, and $\mathrm{tr}(B)$ nonzero, what is $\lim_{t\to0}\dfrac{\det(A+tI)}{\det(B+tI)}$? I used the characteristic polynomial $\lambda^2-\mathrm{tr}(A)\,\lambda+\det A = 0$, and from this I think the answer is $\dfrac{\mathrm{tr}(A)}{\mathrm{tr}(B)}$. Am I correct?
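Here is a sketch of my reasoning, assuming that $\det(A+tI)=t^2+\mathrm{tr}(A)\,t+\det A$ for a $2\times 2$ matrix $A$ (this is the expansion I ask about below):
$$\lim_{t\to 0}\frac{\det(A+tI)}{\det(B+tI)}=\lim_{t\to 0}\frac{t^2+\mathrm{tr}(A)\,t}{t^2+\mathrm{tr}(B)\,t}=\lim_{t\to 0}\frac{t+\mathrm{tr}(A)}{t+\mathrm{tr}(B)}=\frac{\mathrm{tr}(A)}{\mathrm{tr}(B)},$$
where I used $\det A=\det B=0$ and $\mathrm{tr}(B)\neq 0$.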
What would be the expansion of $\det(A+tI)$ for a $2\times 2$ matrix?
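My guess is that it can be computed directly from the entries: writing $A=\begin{pmatrix}a&b\\c&d\end{pmatrix}$,
$$\det(A+tI)=(a+t)(d+t)-bc=t^2+(a+d)\,t+(ad-bc)=t^2+\mathrm{tr}(A)\,t+\det A,$$
but I would like confirmation that this is the right expansion to use here.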