If I multiply the averages of two sets of data, what can I conclude about the standard deviation of the result?
That is, given averages X1 and X2 from two different but related data sets, I calculate a derived average X3 = X1 * X2. X1 and X2 each have their own standard deviation, D1 and D2. How can I calculate the corresponding standard deviation D3?
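In symbols (this is just a restatement of the setup above, writing each mean with its standard deviation):

$$
\bar{X}_1 \pm D_1, \qquad \bar{X}_2 \pm D_2, \qquad \bar{X}_3 = \bar{X}_1 \cdot \bar{X}_2, \qquad D_3 = \;?
$$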
My concrete sample problem to illustrate the use case
I have sample performance measurements of a computer operation, measured in milliseconds / invocation (the first data set) and in invocations / day (the second data set). Both of these have a mean and a standard deviation. I would like to derive another measure in milliseconds / day, so I multiplied the first mean by the second:

X1 * X2 = X3
(milliseconds / invocation) * (invocations / day) = milliseconds / day
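To make the setup concrete, here is a minimal sketch of how I compute the quantities above; the sample numbers and the NumPy usage are illustrative only, not my real measurements:

```python
import numpy as np

# Hypothetical sample data, for illustration only (not the real measurements).
ms_per_invocation = np.array([12.1, 9.8, 11.5, 10.3, 13.0])       # first data set
invocations_per_day = np.array([9800, 10200, 9950, 10100, 9900])  # second data set

# Mean and (sample) standard deviation of each data set.
X1, D1 = ms_per_invocation.mean(), ms_per_invocation.std(ddof=1)
X2, D2 = invocations_per_day.mean(), invocations_per_day.std(ddof=1)

# Derived mean, in milliseconds / day.
X3 = X1 * X2

# D3 = ???  <- this is the value I do not know how to compute.
print(f"X1 = {X1:.2f} ms/invocation (D1 = {D1:.2f})")
print(f"X2 = {X2:.0f} invocations/day (D2 = {D2:.0f})")
print(f"X3 = {X3:.0f} ms/day (D3 = ?)")
```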
The question is: how do I get the standard deviation D3 of X3?
Thanks in advance.