The graph we discuss here is a directed pseudograph (a pair of vertices may be joined by multiple edges) in which self-loops are allowed. My question is about a proof of equation 1 below.
The background
Given a set $V$ of $n$ vertices, suppose we know there are $m$ edges, and the edges are generated in the following fashion: for the $i$th edge, choose one ${\cal U}_i∈V$ uniformly at random and then choose one ${\cal V}_i∈V$ uniformly at random, where ${\cal U}_i,{\cal V}_i$ are random variables and $i=1,…,m$. Since the graph is a pseudograph with self-loops, each choice of vertex can be assumed to have no impact on future choices, i.e. all vertex choices are independent, so the edges are also generated independently.
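This generation process is easy to sketch in code. The sketch below is my own illustration (the name `generate_pseudograph` and the `Counter` representation are assumptions, not from the question): each of the $m$ edges is formed by two independent uniform draws from $V$.

```python
import random
from collections import Counter

def generate_pseudograph(n, m, rng=random):
    """Generate m directed edges over vertices 0..n-1.

    Each edge (U_i, V_i) is the result of two independent uniform
    draws, so self-loops and parallel edges are both possible.
    Returns a Counter E with E[(u, v)] = number of (u, v) edges.
    """
    edges = [(rng.randrange(n), rng.randrange(n)) for _ in range(m)]
    return Counter(edges)

E = generate_pseudograph(n=4, m=10)
assert sum(E.values()) == 10                      # exactly m edges in total
assert all(0 <= u < 4 and 0 <= v < 4 for u, v in E)
```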
The problem
Now randomly draw an edge $({\cal U},{\cal V})$ from the generated graph. We clearly have
$\mathbb{P}\left( {\mathcal{U} = u,\mathcal{V} = v} \right) = \frac{1}{{n^2}}$
because the edge $(u,v)$ is the result of two independent draws from vertices during graph generation.
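This marginal probability can be checked by Monte Carlo simulation (a rough sketch with parameters of my own choosing, not from the question): regenerate the graph many times, draw one edge uniformly from each, and compare the empirical frequency of each pair $(u,v)$ against $1/n^2$.

```python
import random
from collections import Counter

random.seed(0)
n, m, trials = 3, 5, 200_000

hits = Counter()
for _ in range(trials):
    # regenerate the graph, then draw one of its m edges uniformly
    edges = [(random.randrange(n), random.randrange(n)) for _ in range(m)]
    hits[random.choice(edges)] += 1

# every ordered pair (u, v) should appear with frequency close to 1/n^2
for u in range(n):
    for v in range(n):
        assert abs(hits[(u, v)] / trials - 1 / n**2) < 0.01
```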
The question is about the same probability given the observation of the edge set $E$. Since the graph is a pseudograph, it is better to view the "edge set" as a function $E:V^2 \to \Bbb N$, where $E(u,v)$ gives the number of edges between the pair of vertices $u,v$. It is very intuitive that
$\mathbb{P}\left( {\mathcal{U} = u,\mathcal{V} = v|E} \right) = \frac{{E\left( {u,v} \right)}}{{m}}$ ...... 1
I am trying to verify 1 mathematically. By Bayes' formula, we have
$\frac{{\mathbb{P}\left( {\mathcal{U} = u,\mathcal{V} = v,E} \right)}}{{\mathbb{P}\left( E \right)}} = \mathbb{P}\left( {\mathcal{U} = u,\mathcal{V} = v|E} \right)$
where it is easy to show
$\mathbb{P}\left( E \right) = \mathbb{P}\left( {{\mathcal{U}_1} = {u_1},{\mathcal{V}_1} = {v_1}, \ldots ,{\mathcal{U}_{m}} = {u_{m}},{\mathcal{V}_{m}} = {v_{m}}} \right) = \prod\limits_{i = 1}^m \mathbb{P} ({\mathcal{U}_i} = {u_i})\mathbb{P}({\mathcal{V}_i} = {v_i}) = {\left( {\frac{1}{{{n^2}}}} \right)^m} = \frac{1}{{{n^{2m}}}}$
If equation 1 actually holds, then we must have
$\mathbb{P}\left( {\mathcal{U} = u,\mathcal{V} = v,E} \right) = {\frac{{E(u,v)}}{{m{n^{2m}}}}}$
and (here ${E:|E| = m}$ means the sum ranges over all possible edge sets containing $m$ edges)
$\begin{gathered} \frac{1}{{{n^2}}} = \mathbb{P}\left( {\mathcal{U} = u,\mathcal{V} = v} \right) = \sum\limits_{E:|E| = m} {\mathbb{P}\left( {\mathcal{U} = u,\mathcal{V} = v,E} \right)} \hfill \\ = \sum\limits_{E:|E| = m} {\frac{{E(u,v)}}{{m{n^{2m}}}}} = \frac{1}{{m{n^{2m}}}}\sum\limits_{E:|E| = m} {E(u,v)} \hfill \\ \Rightarrow \sum\limits_{E:|E| = m} {E(u,v)} = m{n^{2m - 2}} \hfill \\ \end{gathered} $
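The last equation can be checked exhaustively for small $n,m$, at least under one reading of the sum: consistent with the computation $\mathbb{P}(E)=1/n^{2m}$ above, I interpret the sum over $E:|E|=m$ as a sum over all $n^{2m}$ ordered sequences of $m$ edges (this interpretation is my assumption; summing over distinct edge functions instead would give a different count). A brute-force sketch:

```python
from itertools import product

def check_sum_identity(n, m):
    """Sum E(u, v) over all ordered sequences of m edges.

    Each edge lives in V^2 with |V| = n, so there are (n^2)^m
    sequences in total, matching P(E) = 1/n^(2m) above. Verifies
    that the sum equals m * n^(2m - 2) for every pair (u, v).
    """
    pairs = list(product(range(n), repeat=2))   # all n^2 possible edges
    target = m * n ** (2 * m - 2)
    for (u, v) in pairs:
        total = sum(seq.count((u, v)) for seq in product(pairs, repeat=m))
        assert total == target, ((u, v), total, target)
    return target

print(check_sum_identity(n=2, m=2))   # 2 * 2^2 = 8
print(check_sum_identity(n=2, m=3))   # 3 * 2^4 = 48
print(check_sum_identity(n=3, m=2))   # 2 * 3^2 = 18
```

The identity is then immediate under this reading: each of the $m$ positions in a sequence equals $(u,v)$ in exactly $n^{2(m-1)}$ sequences, giving $m\,n^{2m-2}$ in total.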
So can anyone help verify that my reasoning above is sound, and help prove that the last equation $\sum\limits_{E:|E| = m} {E(u,v)} = m{n^{2m - 2}}$ is true? If so, then I think equation 1 is proved. Many thanks!