I'm quite confused about something.
* So I have an algorithm which takes as input $k!$ numbers; let's call them $x_1, x_2, \ldots, x_{k!}$.
* In the algorithm, a 'matrix' is defined: for each $i$ and $j$ ($i$ from $1$ to $k-1$, $j$ from $1$ to $k$), a number $R_{i,j}$ is defined by adding together $(k-1)!$ numbers (each of which is in $\{x_1, x_2, \ldots, x_{k!}\}$, though as far as I can tell this is not relevant).
* Then a similar construction is done: a new 'matrix', as above, is defined, but now each of its $k(k-1)$ entries is obtained by adding $k$ numbers.
* Finally, $k-1$ elements are defined, each of which is obtained by adding up $k-1$ numbers, each of which is a product of precisely two numbers. (So it is like this: for example, the first of these $k-1$ elements is $a_1 = t_1 l_1 + t_2 l_2 + \cdots + t_{k-1} l_{k-1}$.)
That is basically my algorithm.
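To make the three stages concrete, here is a minimal Python sketch. The index choices (which $x$'s go into each sum, and where the $t$'s and $l$'s come from) are placeholders of my own, since, as I said, the exact summands shouldn't matter for the counting question:

```python
import math

def run(x, k):
    """Sketch of the three stages; all index choices below are
    placeholders, not the actual ones from my algorithm."""
    assert len(x) == math.factorial(k)
    f = math.factorial(k - 1)

    # Stage 1: a (k-1) x k 'matrix' R, each entry a sum of (k-1)! of the x's
    # (placeholder: always the first (k-1)! of them).
    R = [[sum(x[:f]) for _ in range(k)] for _ in range(k - 1)]

    # Stage 2: another (k-1) x k 'matrix' S, each entry a sum of k numbers
    # (placeholder: the k entries of row i of R).
    S = [[sum(R[i]) for _ in range(k)] for i in range(k - 1)]

    # Stage 3: k-1 values a_1, ..., a_{k-1}, each a sum of k-1 two-factor
    # products t_m * l_m (placeholder: t and l from the first columns of S and R).
    t = [S[i][0] for i in range(k - 1)]
    l = [R[i][0] for i in range(k - 1)]
    a = [sum(t[m] * l[m] for m in range(k - 1)) for _ in range(k - 1)]
    return a
```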
So what's the complexity of it? Do I have to count both the additions and the multiplications? The algorithm adds up $(k-1)!$ numbers per entry, so I guess it will be $O(k!)$!? But I was told that 'only multiplications' need to be counted. Also, do I have to "count" assignments?
So, in general, how do you compute the computational complexity of an algorithm such as this one? It should be fairly easy, since only additions and multiplications are performed...
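For reference, here is how I'd tally the two kinds of operations stage by stage, assuming a sum of $n$ terms costs $n-1$ additions and that stage 3 is the only place products occur:

```python
import math

def op_counts(k):
    """Rough tally of additions and multiplications, under the
    assumption that a sum of n terms costs n - 1 additions."""
    f = math.factorial(k - 1)
    adds  = (k - 1) * k * (f - 1)   # stage 1: k(k-1) sums of (k-1)! terms
    adds += (k - 1) * k * (k - 1)   # stage 2: k(k-1) sums of k terms
    adds += (k - 1) * (k - 2)       # stage 3: k-1 sums of k-1 terms
    mults = (k - 1) * (k - 1)       # stage 3: k-1 products per element
    return adds, mults
```

If additions count, the first stage alone already costs about $(k-1) \cdot k!$ of them; if only multiplications count, the total is $(k-1)^2$. Which of these is the right measure is exactly what I'm unsure about.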