5. Big-O notation (Upper Bound – Worst Case): f(n) = O(g(n)) if there exist constants c > 0 and n0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0. [Figure: running time vs. n; f(n) stays below c·g(n) from n0 onward]
12. Ω-notation (Lower Bound – Best Case): f(n) = Ω(g(n)) if there exist constants c > 0 and n0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0. [Figure: running time vs. n; f(n) stays above c·g(n) from n0 onward]
17. Θ-notation (Tight Bound): f(n) = Θ(g(n)) if there exist constants c1, c2 > 0 and n0 such that 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0. [Figure: running time vs. n; f(n) sandwiched between c1·g(n) and c2·g(n) from n0 onward]
Editor's notes
There are actually 5 kinds of asymptotic notation. How many of you are familiar with all of these? What these symbols give us is a notation for talking about how fast a function goes to infinity, which is just what we want to know when we study the running times of algorithms. Instead of working out a complicated formula for the exact running time, we can just say that the running time is Theta of n^2. That is, the running time is proportional to n^2 plus lower-order terms. For most purposes, that's just what we want to know. One thing to keep in mind is that we're working with functions defined on the natural numbers. Sometimes I'll talk (a little) about doing calculus on these functions, but the point is that we won't care what, say, f(1/2) is.
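As a concrete sketch, we can check a Theta bound numerically. The running time f(n) = 3n^2 + 10n + 5 and the witnesses c1 = 3, c2 = 4, n0 = 11 below are made-up illustration values, not something from the slides; any valid witnesses would do.

```python
# Empirically checking a Theta(n^2) bound for a made-up running time
# f(n) = 3n^2 + 10n + 5. The witnesses c1, c2, n0 are hand-picked.
def f(n):
    return 3 * n**2 + 10 * n + 5

def g(n):
    return n**2

c1, c2, n0 = 3, 4, 11

# For all n >= n0 (checked here up to a finite limit),
# 0 <= c1*g(n) <= f(n) <= c2*g(n), witnessing f(n) = Theta(n^2).
for n in range(n0, 1000):
    assert 0 <= c1 * g(n) <= f(n) <= c2 * g(n)
```

A finite check like this is not a proof, of course; here one can verify algebraically that 3n^2 + 10n + 5 ≤ 4n^2 whenever n ≥ 11.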
Sometimes we won’t know the exact order of growth. Sometimes the running time depends on the input, or we might be talking about a number of different algorithms. Then we might want to put an upper or lower bound on the order of growth. That’s what big-O and big-Omega are for. Except for Theta, the thing to remember is that the English letters are upper bounds and the Greek letters are lower bounds. (Theta is both, but it’s only a Greek letter.) So O(g(n)) is the set of functions that go to infinity no faster than g. The formal definition is the same as for Theta, except that there is only one c, and you have only the upper inequality. We call g an asymptotic upper bound.
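The one-constant upper bound can be checked the same way. The function f(n) = 10n + 50 and the witnesses c = 11, n0 = 50 below are made-up illustration values.

```python
# Empirically checking a big-O bound: f(n) = 10n + 50 is O(n)
# with hand-picked witnesses c = 11, n0 = 50.
def f(n):
    return 10 * n + 50

def g(n):
    return n

c, n0 = 11, 50

# For all n >= n0 (checked up to a finite limit),
# 0 <= f(n) <= c*g(n), witnessing f(n) = O(n).
for n in range(n0, 1000):
    assert 0 <= f(n) <= c * g(n)
```

Note there is no lower inequality here: unlike Theta, big-O only promises that f eventually stays below some constant multiple of g.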
In the same way, Omega(g(n)) is the set of functions that go to infinity no slower than g(n). Again, the definition is the same except that the inequality reads 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0. Are there any questions?
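And the mirror-image check for the lower bound. The function f(n) = n^2/2 − 3n and the witnesses c = 1/4, n0 = 12 below are, again, made-up illustration values.

```python
# Empirically checking a big-Omega bound: f(n) = n^2/2 - 3n is
# Omega(n^2) with hand-picked witnesses c = 1/4, n0 = 12.
def f(n):
    return n * n / 2 - 3 * n

def g(n):
    return n * n

c, n0 = 0.25, 12

# For all n >= n0 (checked up to a finite limit),
# 0 <= c*g(n) <= f(n), witnessing f(n) = Omega(n^2).
for n in range(n0, 1000):
    assert 0 <= c * g(n) <= f(n)
```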