Several methods may be used to compare the performance of algorithms. One way is simply to run several tests for each algorithm and compare the timings. Another way is to estimate the time required. For example, we may state that search time is **O**(*n*) (big-oh of *n*). This means that search time, for large *n*, is proportional to the number of items *n* in the list. Consequently, we would expect search time to triple if our list increased in size by a factor of three. The big-**O** notation does not describe the exact time that an algorithm takes, but only indicates an upper bound on execution time within a constant factor. If an algorithm takes **O**(*n*²) time, then execution time grows no worse than the square of the size of the list.
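The proportionality claim can be seen directly by counting comparisons in a linear search. This is a minimal sketch (not from the original text); the `linear_search` helper and its comparison counter are illustrative assumptions:

```python
def linear_search(items, target):
    """Scan the list front to back; comparisons grow linearly with len(items)."""
    comparisons = 0  # illustrative counter, stands in for running time
    for i, item in enumerate(items):
        comparisons += 1
        if item == target:
            return i, comparisons
    return -1, comparisons

# Searching for an absent value is the worst case: every item is examined.
# Tripling the list size triples the comparison count.
_, c1 = linear_search(list(range(1000)), -1)
_, c2 = linear_search(list(range(3000)), -1)
print(c1, c2)  # → 1000 3000
```

The wall-clock time per comparison is the hidden constant factor that big-*O* deliberately ignores.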


| *n* | lg *n* | *n* lg *n* | *n*<sup>1.25</sup> | *n*² |
| ---: | ---: | ---: | ---: | ---: |
| 1 | 0 | 0 | 1 | 1 |
| 16 | 4 | 64 | 32 | 256 |
| 256 | 8 | 2,048 | 1,024 | 65,536 |
| 4,096 | 12 | 49,152 | 32,768 | 16,777,216 |
| 65,536 | 16 | 1,048,576 | 1,048,576 | 4,294,967,296 |
| 1,048,576 | 20 | 20,971,520 | 33,554,432 | 1,099,511,627,776 |
| 16,777,216 | 24 | 402,653,184 | 1,073,741,824 | 281,474,976,710,656 |

**Table 1-1:** Growth Rates
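The entries of Table 1-1 follow directly from the column formulas, since every *n* in the table is a power of two. A short script (not part of the original text) recomputes them:

```python
# Recompute the rows of Table 1-1 for n = 2**k.
for k in (0, 4, 8, 12, 16, 20, 24):
    n = 2 ** k
    lg_n = k                  # lg n is exact because n is a power of two
    n_lg_n = n * k
    n_125 = round(n ** 1.25)  # n**1.25 = 2**(1.25 * k), an integer when k is a multiple of 4
    n_sq = n * n
    print(f"{n:>12,} {lg_n:>3} {n_lg_n:>12,} {n_125:>14,} {n_sq:>18,}")
```

Running it reproduces the table, one formatted row per value of *k*.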

Table 1-1 illustrates growth rates for various functions. A growth rate of **O**(lg *n*) occurs for algorithms similar to the binary search. The lg (logarithm, base 2) function increases by one when *n* is doubled. Recall that we can search twice as many items with one more comparison in the binary search. Thus the binary search is an **O**(lg *n*) algorithm.
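The halving behavior can be sketched as follows. This is an illustrative implementation, assuming a sorted list and counting loop iterations as "comparisons":

```python
def binary_search(sorted_items, target):
    """Classic binary search; each iteration halves the remaining range."""
    lo, hi = 0, len(sorted_items) - 1
    comparisons = 0
    while lo <= hi:
        mid = (lo + hi) // 2
        comparisons += 1
        if sorted_items[mid] == target:
            return mid, comparisons
        if sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, comparisons

# Worst case (target absent, larger than every item): doubling n
# adds exactly one comparison, as the lg n column of Table 1-1 predicts.
for n in (1024, 2048, 4096):
    _, c = binary_search(list(range(n)), n)
    print(n, c)
```

Each extra doubling of the list costs only one more pass through the loop, which is why the lg *n* column of Table 1-1 grows so slowly.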

If the values in Table 1-1 represented microseconds, then an **O**(lg *n*) algorithm may take 20 microseconds to process 1,048,576 items, an **O**(*n*<sup>1.25</sup>) algorithm might take 33 seconds, and an **O**(*n*²) algorithm might take up to 12 days! In the following chapters a timing estimate for each algorithm, using big-**O** notation, will be included. For a more formal derivation of these formulas you may wish to consult the references.
