
Consider the following algorithm.

  1. Initialize i = 1.
  2. If i > a then stop. Otherwise, if i divides a then output i.
  3. Increment i by 1, and then go to line 2.

If it takes one unit of time to execute each of lines 2 and 3 in the above algorithm, how many time units are needed to run the algorithm on an input of length t? (Here the length of a number is its number of decimal digits, so the lengths of 1, 10, and 100 are 1, 2, and 3 respectively.) I was also wondering how efficient this algorithm is in terms of time.
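For concreteness, here is a minimal Python sketch of the algorithm as I read it (the function name `list_divisors` is mine, not part of the original):

    def list_divisors(a):
        """Output every divisor of a by trial division (lines 1-3 above)."""
        i = 1                  # line 1: initialize i = 1
        while i <= a:          # line 2: stop once i > a
            if a % i == 0:     # line 2: if i divides a, output i
                print(i)
            i += 1             # line 3: increment i, then go back to line 2

For example, `list_divisors(10)` prints 1, 2, 5, and 10.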

1 Answer