I am new here and wanted to see if someone could help me.
What is the running time of the algorithm (below) with respect to the variable $n$? Give an upper bound of the form ${\cal O}(f(n))$ and a lower bound of the form $\Omega(g(n)).$ Can you characterize the asymptotic running time by some $\Theta(h(n))$?
The algorithm:
Input: an array $A[1 \ldots n]$ of floating-point numbers
Output: a two-dimensional array $M$ such that $M[i][j]$ is the average of $A[i], \ldots, A[j]$ for all $i \le j$, and $0$ for all $i > j$.
for i := 1 to n do
    for j := 1 to n do
        if i > j then
            M[i][j] := 0
        else
            sum := 0;
            d := j - i + 1;
            l := i;
            r := j;
            do
                sum := sum + A[l];
                l++;
            while (l <= r)
            M[i][j] := sum / d;
            print M[i][j];
    end for
end for
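To make sure I am reading the pseudocode correctly, here is a direct translation into Python (just my own sketch, using 0-based indexing instead of the 1-based indexing above; the function name subarray_averages is mine, not part of the assignment, and I dropped the print inside the loop):

def subarray_averages(A):
    # M[i][j] is the average of A[i..j] when i <= j, and 0 otherwise.
    n = len(A)
    M = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i > j:
                M[i][j] = 0.0
            else:
                total = 0.0
                d = j - i + 1              # number of elements in A[i..j]
                for l in range(i, j + 1):  # corresponds to the do-while loop
                    total += A[l]
                M[i][j] = total / d
    return M

print(subarray_averages([1.0, 2.0, 3.0]))
# [[1.0, 1.5, 2.0], [0.0, 2.0, 2.5], [0.0, 0.0, 3.0]]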
Can someone give me an upper and lower bound?
My guess is that the running time is ${\cal O}(n^3)$, because the two nested for loops contribute a quadratic number of iterations and the inner while loop takes $O(n)$ time, which dominates the $O(1)$ assignment operations.
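To check that guess, I counted how many times the innermost statement sum := sum + A[l] runs (a small Python sketch I wrote just for this question): for each pair $i \le j$ the do-while loop runs $j - i + 1$ times, and summing that over all pairs gives $n(n+1)(n+2)/6$, which is cubic in $n$.

def inner_iterations(n):
    # Number of times 'sum := sum + A[l]' executes for input size n.
    count = 0
    for i in range(1, n + 1):
        for j in range(1, n + 1):
            if i <= j:
                count += j - i + 1   # the do-while runs j - i + 1 times
    return count

for n in (10, 50, 200):
    print(n, inner_iterations(n), n * (n + 1) * (n + 2) // 6)
    # the last two columns agree, and they grow like n^3 / 6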
But what are the upper and lower bounds? And can the asymptotic running time be characterized by some $\Theta(h(n))$?