In a least-squares regression algorithm, I have to perform the following operations to compute the regression coefficients:
- Matrix multiplication, complexity: $O(C^2N)$
- Matrix inversion, complexity: $O(C^3)$
- Matrix multiplication, complexity: $O(C^2N)$
- Matrix multiplication, complexity: $O(CN)$
where $N$ is the number of training examples and $C$ is the total number of features/variables.
How can I determine the overall computational complexity of this algorithm?
EDIT: I studied least-squares regression from the book Introduction to Data Mining by Pang-Ning Tan. The explanation of linear least-squares regression is given in the appendix, where a solution via the normal equations is provided (something of the form $a=(X^TX)^{-1}X^Ty$, which involves 3 matrix multiplications and 1 matrix inversion).
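To make the mapping concrete, here is a minimal NumPy sketch of the normal-equations solution, with each step annotated by the complexity listed above (the variable names and random test data are my own, not from the book):

```python
import numpy as np

# Assumed shapes: X is N x C (training examples by features), y has length N.
rng = np.random.default_rng(0)
N, C = 1000, 10
X = rng.standard_normal((N, C))
y = rng.standard_normal(N)

XtX = X.T @ X                 # (C x N)(N x C) multiplication: O(C^2 N)
XtX_inv = np.linalg.inv(XtX)  # inversion of a C x C matrix:   O(C^3)
M = XtX_inv @ X.T             # (C x C)(C x N) multiplication: O(C^2 N)
a = M @ y                     # (C x N)(N x 1) multiplication: O(C N)

# Sanity check against NumPy's built-in least-squares solver
a_ref, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.allclose(a, a_ref))
```

Summing the steps gives $O(C^2N + C^3 + C^2N + CN)$, which is dominated by the first two terms, i.e. $O(C^2N + C^3)$.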
My goal is to determine the overall computational complexity of the algorithm. Above, I have listed the 4 operations needed to compute the regression coefficients, each with its own complexity. Based on this information, can we determine the overall complexity of the algorithm?
Thanks!