Asymptotic Notation
Asymptotic notation describes the running time of an algorithm through functions that have the set of natural numbers ℕ = {0, 1, 2, ...} as their domain. This notation is convenient for describing the worst-case running time T(n). However, in some cases asymptotic notation can extend the domain to the set of real numbers or restrict it to a subset of the natural numbers.
The amount of time it takes for code to run varies not only between devices but also between runs, because running time depends on factors such as hardware, programming language, and the number of applications open. As a result, rather than measuring how long a program takes to run, it is preferable to count the number of operations it performs and see how that count grows as the input size increases.
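For example, here is a minimal Python sketch (the `linear_search` function is hypothetical, written for illustration) contrasting a wall-clock measurement, which varies between runs, with an operation count, which does not:

```python
import time

def linear_search(items, target):
    """Return (index, comparisons); index is -1 if target is absent."""
    comparisons = 0
    for i, value in enumerate(items):
        comparisons += 1
        if value == target:
            return i, comparisons
    return -1, comparisons

data = list(range(1_000_000))

start = time.perf_counter()
index, comparisons = linear_search(data, 999_999)
elapsed = time.perf_counter() - start

# elapsed differs between machines and runs; comparisons is always n for this input.
print(f"time: {elapsed:.4f}s  comparisons: {comparisons:,}")
```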
Big Oh (O) Notation
Big O notation describes an asymptotic upper bound.
O(g(n)) = { f(n): there exist positive constants c and n₀ such that 0 ≤ f(n) ≤ cg(n) for all n ≥ n₀ }
Example: 2n² = O(n³), where c = 1 and n₀ = 2
Examples of functions in O(n²): n², n² + 10n, n
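As a quick sanity check of the definition, a short sketch (using the witnesses c = 1 and n₀ = 2 from the example above) can test the inequality at many sampled values of n; this is evidence, not a proof:

```python
# Spot-check 0 <= f(n) <= c*g(n) for sampled n >= n0,
# with f(n) = 2n², g(n) = n³, c = 1, n0 = 2.
c, n0 = 1, 2
f = lambda n: 2 * n**2
g = lambda n: n**3

assert all(0 <= f(n) <= c * g(n) for n in range(n0, 10_000))
print("inequality holds for all sampled n")
```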
Big Omega (Ω) Notation
Ω notation describes an asymptotic lower bound.
Ω(g(n)) = { f(n): there exist positive constants c and n₀ such that 0 ≤ cg(n) ≤ f(n) for all n ≥ n₀ }
Example: √n = Ω(lg n), where c = 1 and n₀ = 16
Examples of functions in Ω(n²): n², n² + 10n, 2ⁿ
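The same kind of spot check works for the Ω example above (again a sketch, not a proof, using c = 1 and n₀ = 16):

```python
import math

# Spot-check 0 <= c*g(n) <= f(n) for sampled n >= n0,
# with f(n) = √n, g(n) = lg n, c = 1, n0 = 16.
c, n0 = 1, 16
f = lambda n: math.sqrt(n)
g = lambda n: math.log2(n)

assert all(0 <= c * g(n) <= f(n) for n in range(n0, 10_000))
print("inequality holds for all sampled n")
```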
Big Theta (Θ) Notation
Θ notation describes an asymptotically tight bound: it pins a function between constant multiples of g(n), characterizing an algorithm's growth rate exactly.
Θ(g(n)) = { f(n): there exist positive constants c₁, c₂, and n₀ such that 0 ≤ c₁g(n) ≤ f(n) ≤ c₂g(n) for all n ≥ n₀ }
Example: n²/2 − 2n = Θ(n²), where c₁ = 1/4, c₂ = 1/2, and n₀ = 8.
Theorem: f(n) = Θ(g(n)) if and only if f(n) = O(g(n)) and f(n) = Ω(g(n)).
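Both inequalities of the Θ definition can be spot-checked together for the example above (a sketch using c₁ = 1/4, c₂ = 1/2, and n₀ = 8):

```python
# Spot-check c1*g(n) <= f(n) <= c2*g(n) for sampled n >= n0,
# with f(n) = n²/2 - 2n, g(n) = n², c1 = 1/4, c2 = 1/2, n0 = 8.
c1, c2, n0 = 1/4, 1/2, 8
f = lambda n: n**2 / 2 - 2 * n
g = lambda n: n**2

assert all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 10_000))
print("both inequalities hold for all sampled n")
```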
Functions
Check out the Big-O Cheat Sheet for an overview of Big O notation for common data structures and algorithms.
| Function | Notation |
|---|---|
| Constant | 1 |
| Logarithmic | log n |
| Square Root | √n |
| Linear | n |
| Log-Linear | n log n |
| Quadratic | n² |
| Cubic | n³ |
| Polynomial | nᵏ |
| Exponential | 2ⁿ |
| Factorial | n! |
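To get a feel for how quickly these functions diverge, here is a small sketch that tabulates each growth rate at a few input sizes (factorial is evaluated only at small n to keep the output readable):

```python
import math

# Tabulate each growth rate at a few input sizes to compare how fast they grow.
growth_rates = {
    "1":       lambda n: 1,
    "log n":   lambda n: math.log2(n),
    "√n":      lambda n: math.sqrt(n),
    "n":       lambda n: n,
    "n log n": lambda n: n * math.log2(n),
    "n²":      lambda n: n**2,
    "n³":      lambda n: n**3,
    "2ⁿ":      lambda n: 2**n,
    "n!":      lambda n: math.factorial(n),
}

for name, fn in growth_rates.items():
    row = "  ".join(f"{fn(n):>20,.0f}" for n in (4, 8, 16))
    print(f"{name:>8} | {row}")
```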