Let's say there is a 10% chance of an event occurring per 100 samples. If you increase the number of samples to infinity, theoretically the chance of the event occurring at least once would converge to 100%. But imagine this:
If you treat infinity as an infinite number of sets of 100 samples, would each of those sets still have only a 10% chance of the event taking place?
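The two facts here are compatible, and a small sketch can show both at once (assuming, as stated, that each set of 100 samples is an independent trial with a fixed 10% chance):

```python
# Assumption from the question: each set of 100 samples is an independent
# trial with a 10% chance of the event.
p_per_set = 0.10

# Each individual set keeps its 10% chance no matter how many sets exist.
# What grows with n is the chance of at least one occurrence across n sets:
#   P(at least once in n sets) = 1 - (1 - p)^n,  which -> 1 as n -> infinity.
for n in (1, 10, 100, 1000):
    p_at_least_once = 1 - (1 - p_per_set) ** n
    print(f"n={n}: P(at least once) = {p_at_least_once:.6f}")
```

So yes: every set still has only a 10% chance individually; it is the *cumulative* probability of at least one occurrence across many sets that approaches 100%.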