
I came across this equation and would like to learn more about how to interpret it. $d$ is defined as the actual value, and $\hat{d}$ as the predicted value.

Why does this equation (a) divide the sum of squared errors by the sum of $d_{i,j}^2$, and (b) take the square root of (a)?

What is the name of this equation, so that I can read further about it?

$ \sqrt{\frac{\sum_{i,j} (d_{i,j} - \hat{d}_{i,j})^2}{\sum_{i,j} d_{i,j}^2}} $

  • Hi, it's from Section 4 of this paper: http://user.informatik.uni-goettingen.de/~ychen/NC/DMF_Networking10.pdf (2012-08-14)

1 Answer


(a) It is a normalization factor. Assume you have perfect prediction: the numerator is zero, so the whole expression is $0$. Now say the values are restricted to the closed interval $[0,1]$, your estimates are all zeros, and the actual values are all ones: then the numerator and denominator are equal, and the expression is $1$. So the ratio scales the squared error relative to the magnitude of the actual values. (b) The square root makes it behave like a distance in $N$-dimensional Euclidean space, such as $z=\sqrt{(x_1-y_1)^2+(x_2-y_2)^2+\dots}$.
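
The two boundary cases above can be checked numerically. Here is a minimal sketch (the function name `normalized_error` is my own label, not from the paper) that computes $\sqrt{\sum_{i,j}(d_{i,j}-\hat{d}_{i,j})^2 / \sum_{i,j} d_{i,j}^2}$ over flattened values:

```python
import math

def normalized_error(actual, predicted):
    """sqrt( sum of squared errors / sum of squared actual values )."""
    num = sum((d - dh) ** 2 for d, dh in zip(actual, predicted))
    den = sum(d ** 2 for d in actual)
    return math.sqrt(num / den)

# Perfect prediction -> numerator is 0, so the metric is 0.
print(normalized_error([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]))  # 0.0

# Actual values all ones, estimates all zeros -> metric is 1.
print(normalized_error([1.0, 1.0, 1.0], [0.0, 0.0, 0.0]))  # 1.0
```

Note that the metric is undefined when all actual values are zero, since the denominator vanishes.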