Chapter 2 Algorithm Analysis
Topics:
*How to estimate the time required for a program.
*How to reduce the running time of a program from days or years to fractions of a second.
*The results of careless use of recursion.
*Very efficient algorithms to raise a number to a power and to compute the greatest common divisor of two numbers.
2.1 Mathematical Background
The following definitions let us compare the relative rates of growth of functions.
Def 1 (O): T(N) = O(f(N)) if there are positive constants c and n0 such that T(N) <= c*f(N) when N >= n0; the growth rate of T(N) is <= that of f(N).
Def 2 (Ω): T(N) = Ω(g(N)) if there are positive constants c and n0 such that T(N) >= c*g(N) when N >= n0; the growth rate of T(N) is >= that of g(N).
Def 3 (Θ): T(N) = Θ(h(N)) if and only if T(N) = O(h(N)) and T(N) = Ω(h(N)); the growth rate of T(N) is == that of h(N).
Def 4 (o): T(N) = o(p(N)) if T(N) = O(p(N)) and T(N) != Θ(p(N)); the growth rate of T(N) is strictly < that of p(N).
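Definition 1 can be checked numerically for concrete functions. A minimal sketch, with T, f, c, and n0 chosen here purely for illustration (they are not from the text): take T(N) = 1000N and f(N) = N^2, and verify that c*f(N) dominates T(N) once N passes the threshold n0.

```python
# Sketch of Def 1: T(N) = O(f(N)) means there exist constants c and n0
# with T(N) <= c * f(N) for all N >= n0.
# Illustrative (assumed) choices: T(N) = 1000N, f(N) = N^2, c = 1, n0 = 1000.

def T(N):
    return 1000 * N          # linear growth with a large constant factor

def f(N):
    return N * N             # quadratic growth

c, n0 = 1, 1000

# Past the threshold n0, c * f(N) dominates T(N), so T(N) = O(f(N)).
print(all(T(N) <= c * f(N) for N in range(n0, 10_000)))
```

This also shows why constant factors do not matter to the definition: the 1000 in T(N) is absorbed by choosing a large enough n0.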
When we say that T(N) = O(f(N)), we are guaranteeing that the function T(N) grows at a rate no faster than f(N);
thus f(N) is an upper bound on T(N).
Likewise, when f(N) = Ω(T(N)), T(N) is a lower bound on f(N); the two statements are equivalent.
For example, N^3 grows faster than N^2, so we can say N^2 = O(N^3) or, equivalently, N^3 = Ω(N^2).
For example, f(N) = N^2 and g(N) = 2N^2 grow at the same rate, so both f(N) = O(g(N)) and f(N) = Ω(g(N)) are true.
When two functions grow at the same rate, whether or not to signify this with Θ() depends on the particular context.
Another example: if g(N) = 2N^2, then g(N) = O(N^4), g(N) = O(N^3), and g(N) = O(N^2) are all technically correct, but the last is the best answer. Writing g(N) = Θ(N^2) says not only that g(N) = O(N^2), but also that the result is as good (tight) as possible.
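The difference between a tight bound and a generous one can be seen by watching ratios grow (a rule-of-thumb check, not part of the text): if g(N)/h(N) settles at a nonzero constant, g and h grow at the same rate, suggesting g = Θ(h); if the ratio shrinks toward 0, h is a strictly faster-growing (non-tight) bound, i.e. g = o(h).

```python
# Compare g(N) = 2N^2 against the candidate bounds N^2 and N^3.

def g(N):
    return 2 * N * N

for N in (10, 100, 1_000, 10_000):
    print(N, g(N) / N**2, g(N) / N**3)

# g(N)/N^2 stays fixed at 2.0 (same growth rate, so Theta(N^2) is tight),
# while g(N)/N^3 shrinks toward 0 (N^3 is an overly generous bound: g = o(N^3)).
```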