http://www.math.yorku.ca/~hkj/Teaching/481/Assign/pp7.pdf
I'm stuck with #1 on this file, the only thing I can talk about is the central limit theorem and/or normality, with the bootstrapping example, but I don't think my answer will suffice.
The question could be worded better regarding the bootstrap. You do not generate b random samples from the Poisson distribution to get bootstrap samples. What you do is take the original sample of size 23 and form a bootstrap sample of size 23 by randomly selecting an index from 1 to 23, picking the observation with that index, and repeating this process 23 times. This is random sampling with replacement from the original sample. A bootstrap sample will generally have some observed values appearing two or three times and others not appearing at all.
You repeat this procedure until you have a total of b bootstrap samples. The samples vary because different observations are repeated or omitted in each one. For each bootstrap sample you compute the MLE of lambda, so the b bootstrap samples give you b estimates of lambda.
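The resampling procedure above is easy to sketch in a few lines of NumPy. The data here are hypothetical (a simulated Poisson sample of size 23 standing in for the data in the assignment); for a Poisson sample, the MLE of lambda is just the sample mean:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical original sample of size n = 23 (a stand-in for the
# assignment's data; any 23 observed counts would do here).
original = rng.poisson(lam=4.0, size=23)
n = len(original)

b = 2000  # number of bootstrap samples

boot_mles = np.empty(b)
for i in range(b):
    # Draw n indices from 0..n-1 with replacement and take the
    # corresponding observations; some values repeat, some are absent.
    idx = rng.integers(0, n, size=n)
    boot_sample = original[idx]
    boot_mles[i] = boot_sample.mean()  # MLE of lambda for a Poisson sample

print(boot_mles[:5])
```

Each entry of `boot_mles` is one bootstrap replicate of the MLE; together they form the Monte Carlo approximation to the bootstrap distribution.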
The bootstrap principle says that the bootstrap distribution of the estimate of lambda approximates the true sampling distribution of the estimate, and this approximation improves as n gets large. Letting b tend to infinity makes the "Monte Carlo" approximation of the bootstrap distribution approach the exact bootstrap distribution of the estimate. The standard deviation of this bootstrap distribution is the bootstrap approximation to the standard error of the estimate of lambda.
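To make the comparison concrete: the bootstrap standard error is just the standard deviation of the bootstrap replicates, and it can be set beside the plug-in standard error implied by the Poisson model, where Var(X̄) = lambda/n. Again the data are hypothetical simulated counts:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical sample of size 23; in the assignment you would use the
# given data instead.
original = rng.poisson(lam=4.0, size=23)
n = len(original)
lam_hat = original.mean()  # MLE of lambda

b = 5000
boot_mles = np.array([
    rng.choice(original, size=n, replace=True).mean() for _ in range(b)
])

# Bootstrap approximation to the standard error of the MLE.
se_boot = boot_mles.std(ddof=1)

# Plug-in standard error from the Poisson model: sqrt(lambda_hat / n).
se_poisson = np.sqrt(lam_hat / n)

print(se_boot, se_poisson)
```

With a sample of size 23 the two numbers typically come out close, but, as noted below, that closeness is a property of n, not of b.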
Generally speaking, the bootstrap estimate of standard error can be shown to approach the true population standard error as the sample size n increases. The same is true of the estimate of standard error you get from knowing the data come from a Poisson distribution. The convergence therefore occurs as n gets large. The author of the question incorrectly states that this convergence will work for fixed n as b goes to infinity. If the two estimates turn out to be close, it is because 23 is large enough for that to happen. But the convergence he asks you to demonstrate requires n to go to infinity as well!
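The role of n (as opposed to b) can be seen in a small simulation. Here, for a few increasing sample sizes with a known lambda, the ratio of the bootstrap standard error to the true standard error sqrt(lambda/n) tends toward 1 as n grows, even though b is held fixed. All values here are illustrative choices, not part of the assignment:

```python
import numpy as np

rng = np.random.default_rng(2)
true_lam = 4.0  # known lambda, so the true standard error is computable
b = 2000        # held fixed across sample sizes

ratios = []
for n in (23, 200, 2000):
    sample = rng.poisson(lam=true_lam, size=n)
    boot_mles = np.array([
        rng.choice(sample, size=n, replace=True).mean() for _ in range(b)
    ])
    se_boot = boot_mles.std(ddof=1)
    se_true = np.sqrt(true_lam / n)  # true standard error of the mean
    ratios.append(se_boot / se_true)

print(ratios)  # ratios drift toward 1 as n grows, for fixed b
```

Increasing b alone only reduces the Monte Carlo noise in `se_boot`; it cannot shrink the gap that comes from the original sample being a finite draw from the population.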