In Random Matrix Theory (RMT), only a few analytic results are known about the spectrum of a real random matrix $\mathbf{X}\in\mathbb{R}^{n\times n}$ with iid entries:
- The probability that all $n$ eigenvalues of a random matrix drawn from the real Ginibre ensemble (all entries iid standard normal) are real is (Edelman, 1997) $$P_\mathbb{R}(n)=2^{-n(n-1)/4}.$$
- The probability that a random matrix drawn from the uniform, symmetric, orthogonally invariant ensemble is Hurwitz-stable (all eigenvalues have negative real part) is (Calafiore and Dabbene, 2004) $$P_{H,US}(n)=2^{-n(n+1)/2}.$$
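Edelman's formula is easy to check numerically for small $n$. The sketch below (an illustrative Monte Carlo check, not from either cited paper; the function names are my own) estimates the probability that all eigenvalues of a real Ginibre matrix are real and compares it to the closed form:

```python
import numpy as np

def prob_all_real(n, trials=20000, seed=0):
    """Monte Carlo estimate of the probability that all eigenvalues of an
    n x n real Ginibre matrix (iid standard normal entries) are real."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(trials):
        eigs = np.linalg.eigvals(rng.standard_normal((n, n)))
        # complex eigenvalues of a real matrix come in conjugate pairs;
        # a small tolerance absorbs floating-point noise
        hits += np.all(np.abs(eigs.imag) < 1e-9)
    return hits / trials

def edelman(n):
    """Edelman's closed form P_R(n) = 2^{-n(n-1)/4}."""
    return 2.0 ** (-n * (n - 1) / 4)
```

For $n=2$ the closed form gives $2^{-1/2}\approx 0.707$, and the estimate should agree to within sampling error.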
There may very well be others. See this for a quick overview of other results of RMT.
Both of the listed results are of the form $e^{-f_2(n)},$ where $f_2(n)$ is a polynomial of degree $2$ with positive leading coefficient. I believe the same holds for the probability $P_H(n)$ that a real $n\times n$ matrix is Hurwitz-stable when it is drawn from any distribution symmetric about the origin (the set of real Hurwitz-stable matrices is a cone, so the question is scale-invariant). Simulations support this for normal, spherically uniform, and cubically uniform distributions. Intuitively, a degree-$2$ exponent makes sense, since all $n^2$ entries must coordinate to produce the desired spectrum.
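For reference, here is the kind of simulation I mean (a minimal sketch with Gaussian entries; the function name is mine, and any other origin-symmetric sampler could be substituted for `standard_normal`):

```python
import numpy as np

def prob_hurwitz(n, trials=20000, seed=0):
    """Monte Carlo estimate of P_H(n): the probability that an n x n
    matrix with iid standard normal entries is Hurwitz-stable,
    i.e. every eigenvalue has strictly negative real part."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(trials):
        eigs = np.linalg.eigvals(rng.standard_normal((n, n)))
        hits += np.all(eigs.real < 0)
    return hits / trials
```

For $n=1$ this is just the probability that a standard normal is negative, $1/2$; plotting $-\log_2 P_H(n)$ against $n$ for the first few $n$ is what suggests the quadratic growth. The cost per trial is an $O(n^3)$ eigenvalue computation, and since $P_H(n)$ decays rapidly, the number of trials needed for a reliable estimate grows quickly with $n$.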
Is there anything in the literature on this? Perhaps it can be shown by some sort of scaling argument? Or perhaps the method of maximum entropy or of minimum Fisher information could be useful? Any ideas on how to show this formally, or heuristic considerations for why it should or should not be true, would be greatly appreciated.