In their 1959 paper On Random Graphs I, Paul Erdős and Alfréd Rényi describe the number of edges in a random graph by the function
(1)  N_c = [ (1/2) n log n + c n ]
where n is the number of nodes in the graph, c is "an arbitrary fixed real number", and [x] denotes the integer part of x. They go on to study random graphs of the form G(n, N_c), where G(n, N_c) denotes a random graph with n nodes and N_c edges.
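For concreteness, here is how I read formula (1) in code. This is only a sketch of my understanding; I'm assuming networkx's gnm_random_graph is a reasonable stand-in for "a random graph with n nodes and N_c edges":

import math
import networkx as nx

def n_c(n: int, c: float) -> int:
    """Formula (1): N_c = [1/2 * n*log(n) + c*n], where [.] is the integer part."""
    return int(0.5 * n * math.log(n) + c * n)

n, c = 1000, 1.0
edges = n_c(n, c)                   # 4453 edges for n = 1000, c = 1
G = nx.gnm_random_graph(n, edges)   # random graph on n nodes with exactly N_c edges
print(edges, nx.is_connected(G))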
N_c appears to be an arbitrary (?) function that returns a number of edges based on some c and n, but I don't understand how c is selected or why this particular function is relevant. When the authors discuss graphs of the type G(n, N_c), is there any special property beyond the graph simply having some arbitrary number of edges? I.e., could I replace N_c with a function that returns a random number of edges between 0 and the maximum possible number of edges for the graph and get the same results?
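To make "the same results" concrete, this is the kind of comparison I have in mind: estimate how often G(n, N_c) is connected versus G(n, M) for an arbitrary edge count M. Again just a sketch, with networkx's gnm_random_graph as my assumed model of G(n, N) and made-up parameter values:

import math
import random
import networkx as nx

def frac_connected(n: int, m: int, trials: int = 200) -> float:
    """Fraction of sampled G(n, m) graphs that are connected."""
    return sum(nx.is_connected(nx.gnm_random_graph(n, m)) for _ in range(trials)) / trials

n, c = 500, 0.5
nc = int(0.5 * n * math.log(n) + c * n)          # formula (1)
arbitrary = random.randint(0, n * (n - 1) // 2)  # "some arbitrary number of edges"
print(frac_connected(n, nc), frac_connected(n, arbitrary))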