This question came up as a tangent to another project. I'd appreciate help defining the question more clearly as well as grappling with a solution:
Suppose I want to write an algorithm to generate a "realistic" starfield. That is, the distribution of stars and their magnitudes in my 2d sky should be consistent with the distribution that would result if I populated a large 3d space with stars and then projected those stars onto the sky, accounting for the way apparent brightness falls off with distance (inverse-square law).
Assumptions:
All positions of stars can be assumed to have integer coordinates.
The distribution of stars in the 3d space is such that each integer position independently has a fixed probability of containing a star.
The intrinsic (absolute) magnitudes of the stars follow a normal distribution (?), or is there another probability distribution that would be a better choice?
If it makes the calculations easier, the starfield being generated can be assumed to be a rectangle, instead of lines of sight radiating from an observer.
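To make the question concrete, here is a brute-force sketch of the "populate 3d space, then project" construction I have in mind, under the assumptions above. The grid dimensions, the per-cell probability `p_star`, and the choice of a log-normal intrinsic luminosity are all placeholders (a log-normal luminosity corresponds to a normally distributed magnitude, since magnitude is logarithmic in flux); they are not meant as the "right" answers, just as a reference implementation that any cleverer 2d-only algorithm would have to match statistically:

```python
import random

def generate_starfield(width, height, depth, p_star=0.01, seed=0):
    """Brute-force reference: fill a width x height x depth integer grid
    with stars (independent probability p_star per cell), then project
    each star straight onto the near face, dimming its brightness by
    the inverse-square law."""
    rng = random.Random(seed)
    sky = []  # list of (x, y, apparent_brightness)
    for x in range(width):
        for y in range(height):
            # z = 1..depth is the distance from the observer plane
            for z in range(1, depth + 1):
                if rng.random() < p_star:
                    # Placeholder intrinsic luminosity: log-normal,
                    # i.e. normally distributed magnitude.
                    luminosity = rng.lognormvariate(0.0, 1.0)
                    apparent = luminosity / (z * z)  # inverse-square dimming
                    sky.append((x, y, apparent))
    return sky
```

The interesting question is then whether the resulting 2d distribution of positions and apparent brightnesses can be sampled directly, without ever instantiating the 3d grid.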
Feel free to ask questions about anything unclear here: my first problem has been trying to rigorously define the question.