A constant $k$ needs to be calculated, including its Gaussian error, from a function of two measured quantities: $k = f(u, t)$.
Each $k_i$ can be calculated from the measured values $u_i$ and $t_i$ and their respective errors.
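For concreteness, each error $\sigma_{k_i}$ comes from the usual first-order Gaussian propagation formula (assuming $u_i$ and $t_i$ are uncorrelated, with the partial derivatives evaluated at the measured values):

$$\sigma_{k_i}^2 \approx \left(\frac{\partial f}{\partial u}\right)^2 \sigma_{u_i}^2 + \left(\frac{\partial f}{\partial t}\right)^2 \sigma_{t_i}^2$$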
The main issue is that I cannot calculate the mean value $\bar{k}$ of the $k_i$ without losing error information. Because of the error attached to using $f(u,t)$, the values $k_i$ do not appear to be constant even though they should be, and a function $g(u,t)$ that does not have this problem is not available.
If I just calculate $\bar{k}$ and propagate the error in the usual way using Gaussian error propagation, my error explodes beyond reason, even though multiple measurements should make my error smaller.
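By "propagate normally" I mean roughly the following (assuming the $k_i$ are uncorrelated; this may not be exactly what I compute, but it is the standard form I am following):

$$\bar{k} = \frac{1}{N}\sum_{i=1}^{N} k_i, \qquad \sigma_{\bar{k}}^{\text{prop}} = \frac{1}{N}\sqrt{\sum_{i=1}^{N} \sigma_{k_i}^2}$$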
If I instead use the standard deviation as the error, the error information of the individual $k_i$ values is lost, and my error becomes so small that I can't sleep at night.
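By this I mean the standard error of the mean computed from the scatter of the $k_i$ alone, ignoring the individual $\sigma_{k_i}$:

$$\sigma_{\bar{k}}^{\text{stat}} = \sqrt{\frac{1}{N(N-1)}\sum_{i=1}^{N}\left(k_i - \bar{k}\right)^2}$$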
I don't even know where to start on solving this problem.