I'm writing a piece of software in which I need an event to occur, on average, 10 times per 24 hours.
I want to implement that by checking a random number against a preset probability once every second.
I've tried $10^{1/(3600\cdot 24)}-1$ as that probability, but it turns out to be incorrect. I could also estimate the probability by simulation, but that seems lame (there may be delays in execution, and I want to be able to seamlessly adjust the check interval and still get the correct result).
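For reference, here is a minimal Python sketch of the loop I have in mind, using the (apparently wrong) probability from my attempt; `fire_event` is just a stand-in for the real event handling:

```python
import random
import time

SECONDS_PER_DAY = 24 * 3600
TARGET_EVENTS_PER_DAY = 10  # desired average rate

# My current guess at the per-second probability: 10^(1/86400) - 1.
# This is the value I believe is incorrect.
p = TARGET_EVENTS_PER_DAY ** (1 / SECONDS_PER_DAY) - 1

def fire_event():
    # placeholder for whatever the event actually does
    print("event fired")

while True:
    # one Bernoulli trial per second against the preset probability
    if random.random() < p:
        fire_event()
    time.sleep(1)
```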
Can anybody help me with the formula and the rationale behind it?