University of Virginia Computer Science
CS216: Program and Data Representation, Spring 2006
23 January 2006
O (big-Oh) — the set O(f) is the set of functions that grow no faster than f. f ∈ O(g) means:
There are positive constants c and n_{0} such that f(n) ≤ cg(n) for all values n ≥ n_{0}.

Ω (Omega) — the set Ω(f) is the set of functions that grow at least as fast as f. f ∈ Ω(g) means:
There are positive constants c and n_{0} such that f(n) ≥ cg(n) for all values n ≥ n_{0}.

Θ (Theta) — the set Θ(f) is the set of functions that grow as fast as f. f ∈ Θ(g) means:
f ∈ O(g) and f ∈ Ω(g).

o (little-oh) — the set o(f) is the set of functions that grow slower than f. f ∈ o(g) means:
For all positive constants c, there exists a positive constant n_{0} such that f(n) ≤ cg(n) for all values n ≥ n_{0}.
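To get a feel for the big-Oh definition, it helps to exhibit concrete witnesses c and n_{0} and spot-check them numerically. The sketch below (the function 3n + 5 and the witnesses c = 4, n_{0} = 5 are illustrative choices, not from the notes) checks the inequality over a range of n; a spot-check is not a proof, but it catches bad candidate constants quickly.

```python
# Numeric spot-check (not a proof) of the big-Oh definition:
# to show f is in O(g), exhibit constants c > 0 and n0 such that
# f(n) <= c*g(n) for all n >= n0.

def f(n):
    return 3 * n + 5   # illustrative function; claim: f is in O(n)

def g(n):
    return n

# Candidate witnesses: 3n + 5 <= 4n exactly when n >= 5.
c, n0 = 4, 5

# Spot-check the defining inequality over a large range of n >= n0.
assert all(f(n) <= c * g(n) for n in range(n0, 10_000))
print("f(n) <= c*g(n) holds for all checked n >= n0")
```

The same style of check works for Ω (flip the inequality) and, with more care, for o (the inequality must hold for every positive c, so one must test several shrinking values of c).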
People (and textbooks, but fortunately not the one we use in CS216) often use them less carefully and say things like, "linear search is O(n)". This is common, but poor wording. It is better to say, "the running time of linear search is in Θ(n) where n is the number of elements in the input list". Note that it is true to say "the running time of linear search is in O(n)", but this is a much weaker statement. It is also true that "the running time of linear search is in O(n^{2})". The higher O-bounds are all correct, just not as precise as possible. In cases where a Θ-bound can be found, this is always more useful.
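The linear-search claim above can be made concrete by counting comparisons: in the worst case (the target is absent) the search examines every element, so the comparison count is exactly n, which is both an upper bound (O(n)) and a lower bound (Ω(n)), hence Θ(n). The instrumented version below is a sketch for illustration, not code from the notes.

```python
# Linear search instrumented to count comparisons, illustrating why its
# worst-case running time is in Theta(n): when the target is absent, the
# loop runs exactly len(lst) times -- a bound of the form c*n from both sides.

def linear_search(lst, target):
    comparisons = 0
    for item in lst:
        comparisons += 1
        if item == target:
            return True, comparisons
    return False, comparisons

# Worst case: the target is not in the list, so all 1000 elements are checked.
found, steps = linear_search(list(range(1000)), -1)
assert not found and steps == 1000
print("worst-case comparisons:", steps)
```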
1. Which is "bigger", O(n) or O(n^{2})?
2. Which is "bigger", Ω(n) or Ω(n^{2})?
3. Which is "bigger", Θ(n) or Θ(n^{2})?
4. Draw a diagram showing the sets O(n), Ω(n), Θ(n), and o(n) and the functions 3n, log n, n^{2} as points in your set diagram.
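Before drawing the diagram, it can help to spot-check numerically where each function sits relative to n. The checks below are hints, not proofs, and the particular constants (c = 0.01, n_{0} = 1000, etc.) are illustrative choices:

```python
import math

# Numeric spot-checks (not proofs) for placing 3n, log n, and n^2
# relative to n, as preparation for the set diagram.

# 3n is in O(n): the witnesses c = 3, n0 = 1 work directly.
assert all(3 * n <= 3 * n for n in range(1, 1000))

# log n is in o(n): even for a small c, log n <= c*n once n is large enough.
c, n0 = 0.01, 1000
assert all(math.log(n) <= c * n for n in range(n0, 100_000))

# n^2 is not in O(n): for any fixed c, n^2 > c*n as soon as n > c.
c = 100
n = c + 1
assert n * n > c * n
print("spot-checks passed")
```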
Prove that for all functions f, Θ(f) = O(f) - o(f). (Start by making an informal argument for why it should be true. Then, prove it formally using the definitions of Θ, O, and o.)