An algorithm requires one second to solve a problem of size $1000$ on a local machine.
How long would the same algorithm take to solve a problem of size $10{,}000$ if its running time is proportional to $n^2$?
So far I have $T(1000) = 1$ second, but I'm not sure how to establish the relationship with $n^2$.
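My attempt, assuming that "proportional to $n^2$" means $T(n) = c \cdot n^2$ for some constant $c$ (the symbol $c$ is my own notation):

$$T(n) = c\,n^2, \qquad c = \frac{T(1000)}{1000^2} = \frac{1\ \text{s}}{10^6} = 10^{-6}\ \text{s},$$

$$T(10{,}000) = c \cdot (10^4)^2 = 10^{-6}\ \text{s} \times 10^8 = 100\ \text{s}.$$

Is this the right way to set up the proportionality?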
Thanks.