
Given an $n \times n$ real matrix $C$, we can try to maximize $\Phi(C, \pi) = \frac{1}{n} \sum_{i} C_{i,\pi(i)}$ over $\pi \in S_n$, the set of all permutations of $n$ objects. What can one say about this problem when the $C_{ij}$ are random variables? Numerical evidence indicates that for independent normally distributed entries $C_{ij} \sim N(0,1)$, the optimal value $\max_\pi \Phi(C, \pi)$ of the objective function has a mean of about $0.67 (\log n)^{0.82}$ and a standard deviation of approximately $0.84/\sqrt{n \log n}$, for $n \ge 10$ or so.

Is this known? Is there a theory that implies these observations and makes them more precise?
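For reference, the numerical evidence above can be reproduced with a simulation along these lines (a minimal sketch, assuming SciPy's `linear_sum_assignment` as the assignment solver; the sample sizes are only illustrative):

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def max_phi(n, rng):
    # Draw an n x n matrix with independent N(0,1) entries.
    C = rng.standard_normal((n, n))
    # Find the permutation maximizing sum_i C[i, pi(i)].
    rows, cols = linear_sum_assignment(C, maximize=True)
    # Phi(C, pi) = (1/n) * sum_i C[i, pi(i)]
    return C[rows, cols].mean()

rng = np.random.default_rng(0)
n, trials = 100, 200
samples = [max_phi(n, rng) for _ in range(trials)]
print("mean:", np.mean(samples), "std:", np.std(samples))
```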

1 Answer


You're asking about the optimal value of the random assignment problem, which has received quite a lot of study in the past couple of decades. Most of the work has been with uniform$[0,1]$ or exponential$(1)$ random variables, though, and with minima rather than maxima. Perhaps this is because the uniform and exponential cases have proved easiest to analyze. While I don't think there has been much explicit work on the case in which the costs are $N(0,1)$, we can piece together some of the results to say that the expected value you're looking for satisfies $E[\Phi_n] \sim A \sqrt{\log n}$ for some constant $A$.

Define $L_n = \min_{\pi \in S_n} \sum_i C_{i,\pi(i)}.$ Frenk, van Houweninge, and Rinnooy Kan ("Order statistics and the linear assignment problem," Computing 39 (2), 165-174, 1987) proved that, under some mild conditions on the cumulative distribution function $F$, there is a constant $c$ depending on $F$ such that $E[L_n] \sim c\, n F^{-1}(1/n).$ Since the normal density is symmetric, we can apply the corresponding asymptotic estimate for $F^{-1}(1-1/n)$ (i.e., work with the maximum rather than the minimum), so that when the costs are $N(0,1)$ we get $E[\Phi_n] \sim A \sqrt{\log n}$ for some constant $A$, which is not far from what your numerical evidence says. (If $c$ turns out to be $1$ in the normal case, then we would have $A = \sqrt{2}$.)
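To spell out the bookkeeping (assuming the Frenk-van Houweninge-Rinnooy Kan asymptotic carries over to the maximization problem by symmetry): for the standard normal,
$$F^{-1}(1 - 1/n) \sim \sqrt{2\log n}, \qquad E\Big[\max_{\pi} \sum_i C_{i,\pi(i)}\Big] \sim c\, n \sqrt{2\log n}, \qquad E[\Phi_n] \sim c\sqrt{2\log n},$$
where the last step just applies the $\frac{1}{n}$ normalization in $\Phi$.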


For the exponential$(1)$ distribution, Aldous ("The $\zeta(2)$ limit in the random assignment problem," Random Structures and Algorithms 18 (4), 381-418, 2001) proved that $\lim_{n \to \infty} E[L_n] = \zeta(2) = \frac{\pi^2}{6}.$ Aldous notes that his result holds for any cost distribution with the same density at $0$ (assuming this exists and is positive), so the same limiting value holds when the costs are uniform$[0,1]$. This was followed a few years later by independent work of Linusson and Wästlund ("A proof of Parisi's conjecture on the random assignment problem," Probability Theory and Related Fields 128 (3), 419-440, 2004) and Nair, Prabhakar, and Sharma ("Proofs of the Parisi and Coppersmith-Sorkin random assignment conjectures," Random Structures and Algorithms 27 (4), 413-444, 2005) proving the exact formula $E[L_n] = \sum_{i=1}^n \frac{1}{i^2}$ in the exponential$(1)$ case.
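As a quick sanity check, one can compare a Monte Carlo estimate against that exact formula (again assuming SciPy's `linear_sum_assignment`; the trial counts are only illustrative):

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(1)
n, trials = 50, 500

# Monte Carlo estimate of E[L_n] with exponential(1) costs.
costs = []
for _ in range(trials):
    C = rng.exponential(1.0, size=(n, n))
    rows, cols = linear_sum_assignment(C)  # minimization by default
    costs.append(C[rows, cols].sum())

parisi = sum(1.0 / i**2 for i in range(1, n + 1))  # E[L_n] = sum_{i=1}^n 1/i^2
print(np.mean(costs), parisi)  # both should be close to pi^2/6 ~ 1.6449
```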


For more, see the recent survey paper by Krokhmal and Pardalos ("Random assignment problems," European Journal of Operational Research 194, 1-17, 2009). There are some results given there on variances. For instance, in the exponential$(1)$ case, it appears that $\sigma^2(L_n) = \frac{4\zeta(2) - 4\zeta(3)}{n} + O(n^{-2}).$
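Plugging in numbers, just to put a scale on that variance estimate (my arithmetic, not from the survey):
$$4\zeta(2) - 4\zeta(3) \approx 4(1.6449 - 1.2021) \approx 1.77, \qquad \sigma(L_n) \approx \frac{1.33}{\sqrt{n}}.$$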

  • Thank you, that is exactly the kind of information I had hoped for. The result by Frenk et al. is perhaps more relevant to my question, since the minimization problem for distributions supported on $(0,\infty)$ does not have to deal with the effects of infinite tails. (2011-05-06)