Growth Rates

A difficult concept for many students to grasp is the idea of function growth rates, and in particular why we allow ourselves to "ignore constant factors" and "ignore low-order terms". Here's a copy of the table from p. 15 of Steven Skiena's The Algorithm Design Manual, which lists the running times of algorithms with various growth rates, measured in nanoseconds (billionths of a second).
  n           f(n)=lg n   f(n)=n     f(n)=n lg n   f(n)=n^2     f(n)=2^n           f(n)=n!
  10          0.003 µs    0.01 µs    0.033 µs      0.1 µs       1 µs               3.63 ms
  20          0.004 µs    0.02 µs    0.086 µs      0.4 µs       1 ms               77.1 years
  30          0.005 µs    0.03 µs    0.147 µs      0.9 µs       1 sec              8.4 * 10^15 years
  40          0.005 µs    0.04 µs    0.213 µs      1.6 µs       18.3 min           never mind
  50          0.006 µs    0.05 µs    0.282 µs      2.5 µs       13 days
  100         0.007 µs    0.1 µs     0.644 µs      10 µs        4 * 10^13 years
  1000        0.010 µs    1.0 µs     9.966 µs      1 ms         never mind
  10000       0.013 µs    10 µs      130 µs        100 ms
  100000      0.017 µs    0.10 ms    1.67 ms       10 sec
  1000000     0.020 µs    1 ms       19.93 ms      16.7 min
  10000000    0.023 µs    0.01 sec   0.23 sec      1.16 days
  100000000   0.027 µs    0.10 sec   2.66 sec      115.7 days
  1000000000  0.030 µs    1 sec      29.90 sec     31.7 years
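To see where these numbers come from, here's a short Python sketch (the function names are mine, not from the original page), assuming one basic operation per nanosecond:

```python
import math

# Each f(n) counts basic operations; at 10^9 operations per second,
# one operation takes one nanosecond.
def lg(n):      return math.log2(n)
def linear(n):  return n
def nlogn(n):   return n * math.log2(n)
def square(n):  return n ** 2

NS_PER_SEC = 10 ** 9

def seconds(f, n):
    """Running time in seconds for f(n) operations at 1 op/ns."""
    return f(n) / NS_PER_SEC

# A few spot checks against the table above:
print(seconds(nlogn, 10))      # ~3.3e-8 s, i.e. 0.033 µs
print(seconds(square, 1000))   # 0.001 s, i.e. 1 ms
print(seconds(linear, 10**9))  # 1.0 s
```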

But let's look at it another way. Suppose we've decided how much time we have to solve the problem, and we want to know how large a problem we can solve in that time. In the following table, the left-hand column represents the available time, and the entries in the body of the table represent the largest n for which we can solve the problem in that time.
  time    f(n)=lg n       f(n)=n      f(n)=n lg n     f(n)=n^2   f(n)=2^n   f(n)=n!
  1 µs    1.07 * 10^301   1000        140             31         10         6
  1 ms    c. 10^300000    1000000     62746           1000       20         9
  1 sec   never mind      10^9        c. 4 * 10^7     31623      30         13
  1 min                   6 * 10^10   c. 2 * 10^9     244949     36         14
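The entries in this table can be reproduced mechanically. Here's a hypothetical Python sketch (the helper name is mine, not from the original page) that searches for the largest feasible n by doubling and then bisecting:

```python
import math

def largest_n(f, budget_ns):
    """Largest n with f(n) <= budget_ns, found by doubling then bisecting.

    Assumes f is nondecreasing for n >= 1.
    """
    if f(1) > budget_ns:
        return 0
    hi = 1
    while f(hi * 2) <= budget_ns:
        hi *= 2
    lo, hi = hi, hi * 2   # f(lo) <= budget < f(hi)
    while hi - lo > 1:
        mid = (lo + hi) // 2
        if f(mid) <= budget_ns:
            lo = mid
        else:
            hi = mid
    return lo

# One millisecond is 10^6 nanoseconds:
print(largest_n(lambda n: n * n, 10**6))             # 1000
print(largest_n(lambda n: n * math.log2(n), 10**6))  # 62746
print(largest_n(math.factorial, 10**6))              # 9
```

These match the "1 ms" row above for n^2, n lg n, and n!.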

Now let's try that same table again, but divide each function by 100, which is equivalent to using a 100-times-faster processor (and memory, and bus, and so on).
  time    f(n)=(lg n)/100   f(n)=n/100   f(n)=(n lg n)/100   f(n)=n^2/100   f(n)=2^n/100   f(n)=n!/100
  1 µs    c. 10^30000       100000       7741                316            16             8
  1 ms    c. 10^30000000    10^8         4523071             10000          26             11
  1 sec   never mind        10^11        c. 3 * 10^9         316227         36             14
  1 min                     6 * 10^12    c. 1.6 * 10^11      2449489        42             15
You should take two lessons from this table.

  1. Improving the algorithm (changing the growth rate of the function) makes a lot more difference than improving the processor (multiplying the function by a constant).
  2. If the algorithm is asymptotically inefficient, processor speed won't help you much; if the algorithm is asymptotically efficient, improving processor speed does help.
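These lessons can be checked directly for the functions that invert nicely. The following sketch (mine, not from the original page) solves f(n) = budget analytically for the one-second row; since it uses floor rounding, each value may differ from the table's by one:

```python
import math

BUDGET = 10 ** 9   # one second, in nanoseconds

# Largest feasible n before and after a 100x speedup: dividing f by 100
# is the same as multiplying the budget by 100.
for name, inverse in [
    ("n  ", lambda t: t),                   # f(n) = n
    ("n^2", lambda t: math.isqrt(t)),       # f(n) = n^2
    ("2^n", lambda t: int(math.log2(t))),   # f(n) = 2^n
]:
    slow, fast = inverse(BUDGET), inverse(100 * BUDGET)
    print(f"{name}  slow: {slow:>12}  fast: {fast:>12}  gain: {fast / slow:.1f}x")
```

The 100x speedup buys a factor of 100 in problem size for the linear algorithm, only a factor of 10 for the quadratic one, and for 2^n it merely adds log2(100), about 6.6, to n. Switching from 2^n to n^2 on the slow machine, by contrast, raises the feasible n from about 30 to over 31000.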

Last modified: Wed Dec 27 22:37:27 EST 2006
Stephen Bloch / sbloch@adelphi.edu