I have a report that I generated from some data. The requirement for this particular report is that the value in every row be rounded to the nearest dollar. My boss is checking this report against another, similar report to see if the numbers match up, but the requirement for that report was that it round to the nearest penny. Everyone understands that the numbers will differ to some degree.
Is there a way to mathematically calculate the range (+/-) within which we can expect the numbers to differ? I don't want to rely on what just "feels about right."
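To make the question concrete, here's a rough sketch (in Python, with made-up data; the values and row count are hypothetical) of the two rounding schemes and the kind of bound I'm hoping to pin down:

```python
import math
import random

random.seed(42)
n = 1000  # hypothetical number of rows

# Hypothetical row values with sub-penny precision.
values = [random.uniform(0, 500) for _ in range(n)]

# The other report: each row rounded to the nearest penny.
pennies = [round(v, 2) for v in values]

# My report: each row rounded to the nearest dollar.
dollars = [round(p) for p in pennies]

# Each row's dollar figure is at most $0.50 away from its penny
# figure, so the totals can never differ by more than $0.50 * n.
total_diff = abs(sum(pennies) - sum(dollars))
worst_case = 0.50 * n
print(total_diff, "<=", worst_case)

# If the per-row rounding errors are roughly uniform on
# [-0.50, 0.50] and independent, the typical total difference is
# on the order of sqrt(n / 12) dollars -- far below the worst case.
typical = math.sqrt(n / 12)
print("typical scale:", typical)
```

So the hard bound seems to be ±$0.50 per row, but the statistically expected gap on the totals should be much smaller; I'd like to confirm that reasoning. (Note that Python's `round` uses round-half-to-even, which stays within the same ±$0.50 per-row bound.)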