$xy = 2^nx_Ly_L + 2^{n/2}(x_Ly_R + x_Ry_L) + x_Ry_R$, where $n$ is the size of the numbers (in bits) and $x_{L/R}$ and $y_{L/R}$ are the left and right bit halves of $x$ and $y$.
Initially, you have 4 multiplication problems:
- $x_Ly_L$
- $x_Ly_R$
- $x_Ry_L$
- $x_Ry_R$
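To make the setup concrete, here is a quick sanity check of the four-multiplication split in Python (the values $n = 8$, $x = 182$, $y = 107$ are just an example I picked, not anything special):

```python
# Verify xy = 2^n * xL*yL + 2^(n/2) * (xL*yR + xR*yL) + xR*yR
# for one concrete example, assuming n is even and x, y fit in n bits.
n = 8
x, y = 0b10110110, 0b01101011  # 182 and 107

half = n // 2
xL, xR = x >> half, x & ((1 << half) - 1)  # left/right halves of x
yL, yR = y >> half, y & ((1 << half) - 1)  # left/right halves of y

product = (2**n) * xL * yL + (2**(n // 2)) * (xL * yR + xR * yL) + xR * yR
assert product == x * y  # 182 * 107 = 19474
```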
Gauss's multiplication "simplifies" this to 3 problems, reducing computation time by 25%:
- $x_Ly_L$
- $x_Ry_R$
- $(x_L + x_R)(y_L + y_R)$
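The three-multiplication version does check out numerically; here is the same example as above (again with arbitrary values $n = 8$, $x = 182$, $y = 107$), recovering the middle term from the third product:

```python
# Verify Gauss's trick: the middle term xL*yR + xR*yL is recovered
# from (xL + xR)(yL + yR) - xL*yL - xR*yR, using only 3 multiplications.
n = 8
x, y = 0b10110110, 0b01101011  # 182 and 107

half = n // 2
xL, xR = x >> half, x & ((1 << half) - 1)
yL, yR = y >> half, y & ((1 << half) - 1)

p1 = xL * yL                 # first multiplication
p2 = xR * yR                 # second multiplication
p3 = (xL + xR) * (yL + yR)   # third multiplication

middle = p3 - p1 - p2        # equals xL*yR + xR*yL, never computed directly
product = (2**n) * p1 + (2**(n // 2)) * middle + p2
assert product == x * y
```

My question is not whether the identity holds, but whether that third multiplication is really as cheap as the other two.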
This is based on the fact that $(x_Ry_L + x_Ly_R) = (x_L + x_R)(y_L + y_R) - x_Ly_L - x_Ry_R$. But doesn't $(x_L + x_R)(y_L + y_R) = x_Ly_R + x_Ry_L + x_Ly_L + x_Ry_R$? Therefore, doesn't the third multiplication still require $x_Ry_L$ and $x_Ly_R$?

You could argue that instead of using the distributive property, you add the numbers in the brackets first, resulting in just one multiplication. However, that multiplication would have twice the size. So you would technically reduce the problems to three, but the third one would be twice the size of the other two, and would therefore essentially take the same amount of time as the two problems it replaces. Evaluating 4 problems of size $n/2$ is equivalent to evaluating 2 problems of size $n/2$ and one problem of size $n$, since $4(n/2) = 2(n/2) + n$.

This leads to my question: how exactly does Gauss multiplication save time?