Orders of Growth
What exactly is Big O/Ω/Θ 🤔
When we talk about orders of growth, we are usually dealing with a set of specific asymptotic notations. Because the notations are asymptotic, we are not interested in precisely calculating how fast/efficient a particular algorithm/function is. Instead, we are more interested in the general behaviour of an algorithm as its arguments grow towards a specific value or infinity.
Let’s take the function `foo(x)` as an example. When analyzing the order of growth of `foo`, we are essentially asking ourselves:

How will the run time of `foo(x)` be affected with respect to `x` when `x` changes (e.g. increase in size/length, increase in magnitude)?
To be more rigorous, the above question can then be split into 3 more specific questions:
What is a runtime with respect to `x` that…

- `foo(x)` can never exceed?
- `foo(x)` will never be able to beat (i.e. run faster than)?
- `foo(x)` is tightly bounded by?
These 3 questions in turn form the general intuition behind the notations Big O, Big Ω and Big Θ respectively.
Big O
What is the runtime with respect to `x` that `foo(x)` can never exceed?
To put it simply, Big O represents an upper bound of the order of growth/time complexity of a function.
Suppose $f(x)$ represents the function/algorithm, and we say that $f(x)$ has a Big O of $g(x)$; then it can be said that $f(x) = O(g(x))$. This means that the run time of $f(x)$ grows at most as fast as $g(x)$.
The (incomplete) math equivalent: $f(x) \leq c \cdot g(x)$, where $c$ is a constant
Note that we can also fix a constant $k$ and have the more general problem statement: find a constant $c$ such that $f(x) \leq c \cdot g(x)$ for all $x \geq k$
$g(x)$ is also sometimes called the comparison function
In fact, there can be multiple valid comparison functions that can be the Big O for a specific algorithm. Taking the example above, any function that is a Big O of $g(x)$ is also a Big O of $f(x)$.
For example, if $f(x) = O(x)$, then since $x = O(x^2)$, saying $f(x) = O(x^2)$ is also accurate
Additionally, do note that $f(x) = O(f(x))$ is accurate as well! (this is related to the precise mathematical definition of Big O)
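As a concrete sketch of the informal Big O condition above, we can check numerically that a hypothetical run-time function (the specific $f$, $g$ and $c$ below are illustrative choices, not from the notes) stays under $c \cdot g(x)$ once $x$ is large enough:

```javascript
// Hypothetical example: suppose f(x) = 3x + 5 models the number of
// operations an algorithm performs on an input of size x.
function f(x) {
    return 3 * x + 5;
}

// Candidate comparison function g(x) = x, with constant c = 4.
function g(x) {
    return x;
}
const c = 4;

// For all sufficiently large x (here x >= 5), f(x) <= c * g(x),
// which is exactly the informal Big O condition described above.
for (let x = 5; x <= 1000; x = x + 1) {
    console.assert(f(x) <= c * g(x), "bound violated at x = " + x);
}
console.log("f(x) = O(x) check passed for x in [5, 1000]");
```

Note that the check only holds from $x = 5$ onwards ($3x + 5 \leq 4x$ exactly when $x \geq 5$), which is why the formal definition only requires the bound for all sufficiently large $x$.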
Big Ω
What is the runtime with respect to `x` that `foo(x)` will never be able to beat (i.e. run faster than)?
Big Ω is in many ways just the inverse of Big O: here we are looking at the lower bound of the order of growth/time complexity of a function.
Similarly, there can be multiple valid comparison functions that can be the Big Ω for a specific algorithm.
For example, if $f(x) = \Omega(x^2)$, then $f(x) = \Omega(x)$ as well (since $x$ grows no faster than $x^2$)
Same as above, $f(x) = \Omega(f(x))$ is valid! (refer to the precise mathematical definition of Big Ω)
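Mirroring the Big O case, here is a numeric sketch of the lower-bound condition, again using a hypothetical run-time function of my own choosing:

```javascript
// Hypothetical example: suppose f(x) = 3x + 5 models the number of
// operations an algorithm performs on an input of size x.
function f(x) {
    return 3 * x + 5;
}

// With comparison function g(x) = x and constant c = 3, f(x) >= c * g(x)
// for every x >= 0 — the informal Big Ω condition, so f(x) = Ω(x).
for (let x = 0; x <= 1000; x = x + 1) {
    console.assert(f(x) >= 3 * x, "lower bound violated at x = " + x);
}
console.log("f(x) = Ω(x) check passed for x in [0, 1000]");
```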
Big Θ
What is the runtime with respect to `x` that `foo(x)` is tightly bounded by?
After looking at the upper and lower bounds, Big Θ focuses on the “tight” bound. To understand Big Θ, we need to look closely at the relationship between the comparison functions and the actual function that represents the algorithm.
Suppose we still have $f(x)$ represent our function/algorithm; then finding the Big Θ is the same as finding a $g(x)$ (a comparison function) such that $c_1 \cdot g(x) \leq f(x) \leq c_2 \cdot g(x)$, where $c_1$ and $c_2$ are constants.
Note that the $g(x)$ on both sides represents the same comparison function
Therefore, if $f(x) = O(g_1(x))$ and $f(x) = \Omega(g_2(x))$ for two different comparison functions $g_1 \neq g_2$, then it is unclear what the Big Θ of $f(x)$ really is! On the other hand, if we have $f(x) = O(g(x))$ and $f(x) = \Omega(g(x))$, then we can say that $f(x) = \Theta(g(x))$.
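Putting both bounds together, here is a sketch of the sandwich condition for Big Θ, using the same kind of made-up run-time function as before (the choices of $f$, $g$, $c_1$ and $c_2$ are mine, purely for illustration):

```javascript
// Hypothetical run-time function: f(x) = 3x + 5 operations on input size x.
function f(x) {
    return 3 * x + 5;
}

// With g(x) = x, c1 = 3 and c2 = 4, we have c1*g(x) <= f(x) <= c2*g(x)
// for every x >= 5 — the same g(x) on both sides, so f(x) = Θ(x).
for (let x = 5; x <= 1000; x = x + 1) {
    console.assert(3 * x <= f(x) && f(x) <= 4 * x,
                   "Θ bound violated at x = " + x);
}
console.log("f(x) = Θ(x) check passed for x in [5, 1000]");
```

The key design point: the upper and lower bounds use the same comparison function $g(x) = x$ with different constants, which is exactly what makes the bound “tight”.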
Recurrence Relations
After talking about what these notations mean, what good are they if we do not have a way to calculate them?
Note that in the context of CS1101S, we are interested in the Big Θ / tight bound of a function. This is so that you can’t say the time complexity of binary search is $O(2^n)$ and expect it to be accepted. Nothing is wrong with that equality, but it misses the point.
A good way to calculate the order of growth / time complexity is to use recurrence relations. To put it in simple terms, it is the calculation of total work done, where: Total work done = Work done now + Work done later.
Work done now refers to the amount of work, or more specifically the number of operations, executed in the current call of the function. Work done later represents the number of operations performed by the recursive calls.
When we talk about the number of operations we are still measuring it with respect to the arguments of the function.
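As a quick illustration using the binary search mentioned earlier: each call does a constant amount of work now (one comparison) and leaves a half-sized problem as work done later, giving

$$T(n) = T(n/2) + c$$

Expanding it $k$ times gives $T(n) = T(n/2^k) + kc$, which reaches the base case after $k = \log_2 n$ halvings, so $T(n) = \Theta(\log n)$.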
Examples
Length of a list
```javascript
function length(lst) {
    return is_null(lst)
        ? 0
        : 1 + length(tail(lst));
}
```
Representing it as a recurrence relation, $T(n) = T(n-1) + c$, where $n$ is the length of the given list and $c$ is the constant amount of work done in each call.

Expanding it, $T(n) = T(n-1) + c = T(n-2) + 2c = \cdots = T(0) + nc$.
Therefore, we can conclude that the time complexity of `length` is $\Theta(n)$.
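To connect the recurrence back to actual work done, here is a sketch that instruments `length` with a call counter. The `pair`, `is_null` and `tail` helpers below are my own minimal stand-ins for the Source list library, not the real implementation:

```javascript
// Minimal stand-ins for the Source list library: a list is nested
// pairs [head, tail], with null marking the empty list.
function pair(x, xs) { return [x, xs]; }
function is_null(lst) { return lst === null; }
function tail(lst) { return lst[1]; }

let calls = 0; // counts how many times length is invoked

function length(lst) {
    calls = calls + 1;
    return is_null(lst)
        ? 0
        : 1 + length(tail(lst));
}

// Build a list of 100 elements and measure the work done.
let lst = null;
for (let i = 0; i < 100; i = i + 1) {
    lst = pair(i, lst);
}
console.log(length(lst)); // 100
console.log(calls);       // 101 calls: one per element plus the base case
```

The call count grows in lockstep with the list length (one call of "work now" per element, plus one base case), matching $T(n) = T(n-1) + c$ and hence $\Theta(n)$.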