This may be a 'drill' problem from a statistics textbook. If $\sigma^2 = 1/10,$ and the data can be assumed to be normal, then a 95% confidence interval (CI) for the true mean period is $\bar X \pm 1.96\sigma/\sqrt{n}.$
The quantity $1.96\sigma/\sqrt{n}$ (half the width of the CI) is called the 'margin of error'. Maybe your problem wants the margin of error to be 1/100. If so, solve
$1.96\sigma/\sqrt{n} = 1/100$ for $n.$
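If that guess is right, the algebra can be checked with a few lines of Python (the margin of error $E = 1/100$ is my assumption, as noted above):

```python
import math

sigma2 = 1 / 10           # given variance
sigma = math.sqrt(sigma2)
z = 1.96                  # z-value for a 95% CI
E = 1 / 100               # assumed desired margin of error

# Solve z*sigma/sqrt(n) = E  =>  n = (z*sigma/E)^2
n = (z * sigma / E) ** 2
n_needed = math.ceil(n)   # round up to a whole number of observations
print(n, n_needed)        # about 3841.6, so n = 3842
```

Sample sizes are always rounded up, since rounding down would give a margin of error larger than the target.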
Disclaimer: I suggest this only because my guess above is a typical kind of problem in statistics courses when CIs are under discussion. However, such problems are usually stated much more clearly than this one, so I have no way to know whether this is actually what is required.