
Big-O Notation and Algorithm Analysis

Now that we have seen the basics of big-O notation, it is time to relate this to the analysis of algorithms.

In our study of algorithms, nearly every function whose order we are interested in finding describes the quantity of some resource consumed by a particular algorithm, expressed in terms of the algorithm's parameters. (This function is often referred to as the complexity of the algorithm, or less frequently as its cost function.) This function is not the same as the algorithm itself, but is a property of the algorithm. For example, when we analyze an algorithm that multiplies two numbers, the functions we might be interested in are the relationships between the number of digits in each number and the length of time or amount of memory required by the algorithm.
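To make this concrete, here is a sketch (not from the notes; the function name and counting convention are my own) of the grade-school multiplication method, instrumented to count single-digit multiplications. The count depends only on the number of digits in each input, which is exactly the kind of cost function described above.

```python
def digit_multiply(a, b):
    """Multiply non-negative integers a and b one digit at a time.
    Returns (product, number of single-digit multiplications used)."""
    da = [int(c) for c in str(a)]   # digits of a
    db = [int(c) for c in str(b)]   # digits of b
    ops = 0
    total = 0
    # Grade-school method: every digit of a meets every digit of b,
    # so the cost is (digits of a) * (digits of b).
    for i, x in enumerate(reversed(da)):
        for j, y in enumerate(reversed(db)):
            total += x * y * 10 ** (i + j)
            ops += 1
    return total, ops
```

For a 3-digit number times a 2-digit number the count is always 3 * 2 = 6, no matter what the digits are: the cost function depends on the sizes of the inputs, not their values.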

Although big-O notation is a way of describing the order of a function, it is also often used to describe the time complexity of an algorithm. This is sloppy use of the mathematics, but unfortunately not uncommon. (For example, at the bottom of page 22 in Weiss, a factorial function is described as being O(N). Clearly, the factorial function itself is not O(N); by its definition it is O(N!). What is meant is that the particular algorithm given for computing the factorial of N requires O(N) time to execute.) Usually it is clear from the surrounding context what the big-O refers to, but when it is not clear, please ask.
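The distinction can be seen in a simple factorial routine (a sketch of the usual iterative algorithm, not Weiss's exact code). The loop body runs about N times, so the algorithm takes O(N) steps, even though the value it computes grows like N!. (As is conventional in this kind of analysis, each multiplication is counted as a single step, even though multiplying very large integers really costs more.)

```python
def factorial(n):
    """Compute n! with a simple loop.
    Returns (n!, number of multiplications performed).
    The loop runs n - 1 times, so the time is O(n),
    even though the *value* n! is enormously larger."""
    result = 1
    multiplications = 0
    for i in range(2, n + 1):
        result *= i
        multiplications += 1
    return result, multiplications
```

So "the factorial function is O(N)" really means "this algorithm for computing factorials runs in O(N) time."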

Note that this notation, like the orders themselves, doesn't tell us how quickly or slowly the algorithms actually execute for a given input. This information can be extremely important in practice: during the semester, we will study several algorithms that address the same problems and have the same order running time, but take substantially different amounts of time to execute.

Similarly, the order doesn't tell us how fast an algorithm will run for a small N. This can also be quite important in practice: during the semester, we will study at least one group of algorithms where the choice of the best algorithm depends on N. When N is small, one algorithm is best, but when N is large, a different algorithm is much better.
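A small numerical sketch shows how this can happen. The cost functions and constants below are invented for illustration (they are not measurements of any real algorithm): a "simple" quadratic algorithm with a small constant factor, against a "clever" N log N algorithm whose extra bookkeeping gives it a large constant factor.

```python
import math

def simple_cost(n):
    """Hypothetical O(N^2) algorithm with a small constant factor."""
    return n * n

def clever_cost(n):
    """Hypothetical O(N log N) algorithm with a large constant factor."""
    return 50 * n * math.log2(n) if n > 1 else 50

# For small N, the "worse" quadratic algorithm is actually cheaper:
assert simple_cost(10) < clever_cost(10)
# For large N, the N log N algorithm wins by a wide margin:
assert simple_cost(10000) > clever_cost(10000)
```

This is why practical implementations sometimes switch between two algorithms depending on the size of the input.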

The Size of a Problem

Big-O notation is used to express an ordering property among functions. In the context of our study of algorithms, each function measures the amount of some resource consumed by an algorithm when the algorithm is executed. This function is usually denoted T(N): the amount of the resource (usually time, or the count of some specific operation) consumed when the input to the algorithm is of size N.
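For instance, one natural choice of resource for a search algorithm is the number of element comparisons it performs. The sketch below (my own illustration, not from the notes) instruments a linear search to report that count; in the worst case, when the key is absent, T(N) = N comparisons.

```python
def linear_search(items, key):
    """Scan items left to right for key.
    Returns (index or None, number of element comparisons made).
    The comparison count is the T(N) we might analyze:
    at most N comparisons for an input of size N."""
    comparisons = 0
    for i, item in enumerate(items):
        comparisons += 1
        if item == key:
            return i, comparisons
    return None, comparisons
```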

Sometimes it is not obvious what the "size" of the input is, so here are a few examples:

- For an algorithm that searches or sorts an array, the size is usually the number of elements in the array.
- For an algorithm that multiplies two numbers (as in the example above), the size is usually the number of digits in the numbers.

What Does It All Mean?

The big-O complexity of an algorithm is important, but it does not tell the entire story. Any honest and complete analysis of an algorithm should attempt to address the question of what the real cost of executing the algorithm is for real or typical problems.

For example, imagine that (through some wonderful twist of fate) you must choose between job offers from two companies. The first company offers a contract that will double your salary every five years. The second company offers you a contract that gives you a raise of $1000 per year. Your salary at the first company increases at a rate of O(2^(n/5)) (where n is the number of years you work there), while your salary at the second company increases at a rate of O(n). Which position would you choose?

From this information alone, it is impossible to say; the starting salaries make all the difference. If the starting salaries are the same, the first company is clearly the better choice (unless it gives you a starting salary of $0, in which case you are really in trouble, since doubling $0 still leaves you with nothing!). If the starting salary at the second company is much higher than at the first, then even the wonderful raises the first company offers might not actually make any difference before you retire. In this case, n isn't going to get too large (assuming an ordinary career), so the behavior of your salary as you become infinitely old is not an issue.
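The scenario above is easy to work out exactly. The sketch below (the starting salaries are invented for illustration) computes the first year within a career at which the doubling salary overtakes the linear one, if it ever does.

```python
def salary_a(start, year):
    """First company: salary doubles every five years, O(2^(n/5))."""
    return start * 2 ** (year // 5)

def salary_b(start, year):
    """Second company: a $1000 raise every year, O(n)."""
    return start + 1000 * year

def first_year_a_wins(start_a, start_b, career_years=40):
    """First year (within a career) when company A pays more, else None."""
    for year in range(career_years + 1):
        if salary_a(start_a, year) > salary_b(start_b, year):
            return year
    return None
```

With equal starting salaries of $30,000, company A pulls ahead at the first doubling (year 5). Even if company B starts at $100,000 against A's $30,000, A overtakes by year 10. But with a starting salary of $0 at company A, doubling never helps: A never wins, just as the text warns.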


Dan Ellard
Mon Jul 21 22:30:59 EDT 1997