Suppose there is a $2\times2$ matrix $O$ of observed values: $O = \begin{bmatrix}o_{11}&o_{12}\\o_{21}&o_{22}\end{bmatrix}$ and two matrices $E_1$ and $E_2$ of expected values: $E_1 = \begin{bmatrix}e'_{11}&e'_{12}\\e'_{21}&e'_{22}\end{bmatrix}\text{ and } E_2 = \begin{bmatrix}e''_{11}&e''_{12}\\e''_{21}&e''_{22}\end{bmatrix}.$
The total misallocation supposing $O$ is distributed according to $E_1$ is $L_1=\frac{1}{2}\left(\left|o_{11} - e'_{11}\right| + \left|o_{12} - e'_{12}\right| + \left|o_{21} - e'_{21}\right| + \left|o_{22} - e'_{22}\right|\right)$ and the total misallocation supposing $O$ is distributed according to $E_2$ is $L_2=\frac{1}{2}\left(\left|o_{11} - e''_{11}\right| + \left|o_{12} - e''_{12}\right| + \left|o_{21} - e''_{21}\right| + \left|o_{22} - e''_{22}\right|\right).$
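For concreteness, here is a minimal Python/NumPy sketch of this computation; the numbers in `O`, `E1`, and `E2` are made-up placeholder values, used only to illustrate the calculation:

```python
import numpy as np

def total_misallocation(observed, expected):
    """Half the sum of absolute cell-wise differences: L = (1/2) * sum |o_ij - e_ij|."""
    return 0.5 * np.abs(observed - expected).sum()

# Hypothetical example values, purely for illustration.
O  = np.array([[30.0, 20.0],
               [25.0, 25.0]])
E1 = np.array([[28.0, 22.0],
               [27.0, 23.0]])
E2 = np.array([[35.0, 15.0],
               [20.0, 30.0]])

L1 = total_misallocation(O, E1)  # misallocation assuming O follows E1
L2 = total_misallocation(O, E2)  # misallocation assuming O follows E2
print(L1, L2)  # by this measure, the smaller value is the closer fit
```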
My question is: how can I measure how much better $E_1$ or $E_2$ represents $O$? Initially I thought I could use a chi-squared or F-type test, but I don't know the distributions of $L_1$ and $L_2$.
Any help would be appreciated.