Common Running Times
- Rank asymptotic complexities from smallest to largest.
Here are the common running times, in order of increasing growth rate:
- $O(1)$ — Constant functions
  - Running time does not depend on the input size.
  - It does not mean the algorithm takes only one step!
- $O(\log n)$ — Logarithmic functions
  - These are sub-linear: the runtime grows more slowly than the size of the input.
  - Example: Binary search (sketched below).
- $O(n)$ — Linear functions
  - Running time is proportional to the input size.
  - Example: Linear search (sketched below).
- $O(n\log n)$ — Superlinear functions
  - Also called linearithmic functions.
  - Often arises in divide-and-conquer algorithms.
  - Example: Merge sort and heapsort (we'll cover these later; merge sort is sketched below).
- $O(n^2)$ — Quadratic functions
  - Running time is proportional to the square of the input size.
  - Examples: Bubble, insertion, and selection sort (we'll cover these later; selection sort is sketched below).
- $O(n^k)$ — Polynomial functions
  - $k$ is a constant $> 1$.
  - $O(n^2)$ is a special case of this.
  - Example: Enumerating all size-$k$ subsets of $n$ nodes (sketched below).
- $O(c^n)$ — Exponential functions
  - $c$ is a constant $> 1$.
  - Example: Enumerating all subsets of $n$ items (sketched below).
- $O(n!)$ — Factorial functions
  - Example: Generating all permutations or orderings of $n$ items (sketched below).
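
Here is a minimal Python sketch of binary search. Because each comparison discards half of the remaining range, the loop runs $O(\log n)$ times.

```python
def binary_search(items, target):
    """Return the index of target in the sorted list items, or -1 if absent."""
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2       # look at the middle of the current range
        if items[mid] == target:
            return mid
        elif items[mid] < target:
            lo = mid + 1           # discard the left half
        else:
            hi = mid - 1           # discard the right half
    return -1
```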
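
Linear search, by contrast, may have to inspect every element once, so its worst case is $O(n)$:

```python
def linear_search(items, target):
    """Return the index of target in items, or -1 if absent."""
    for i, value in enumerate(items):  # one pass over all n elements
        if value == target:
            return i
    return -1
```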
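
A sketch of merge sort, as one example of an $O(n\log n)$ divide-and-conquer algorithm: the input is halved $O(\log n)$ times, and merging all the pieces at each level takes $O(n)$ work.

```python
def merge_sort(items):
    """Return a sorted copy of items in O(n log n) time."""
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])    # sort each half recursively
    right = merge_sort(items[mid:])
    # Merge the two sorted halves in O(n) time.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])           # append whatever remains
    merged.extend(right[j:])
    return merged
```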
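
And a sketch of selection sort as a representative quadratic algorithm: the outer loop runs $n$ times, and each pass scans the remaining unsorted suffix, giving $n + (n-1) + \cdots + 1 = O(n^2)$ comparisons.

```python
def selection_sort(items):
    """Sort items in place in O(n^2) time."""
    n = len(items)
    for i in range(n):
        smallest = i
        for j in range(i + 1, n):     # scan the unsorted suffix
            if items[j] < items[smallest]:
                smallest = j
        items[i], items[smallest] = items[smallest], items[i]
    return items
```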
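
Finally, the enumeration examples for the polynomial, exponential, and factorial classes can all be sketched with Python's itertools; the counts printed at the end ($\binom{n}{k}$, $2^n$, and $n!$) are exactly what drive those running times.

```python
from itertools import combinations, permutations

items = ["a", "b", "c", "d"]
n, k = len(items), 2

# O(n^k): all size-k subsets -- there are "n choose k" of them, at most n^k.
size_k_subsets = list(combinations(items, k))

# O(2^n): all subsets -- each of the n items is either in or out.
all_subsets = [s for r in range(n + 1) for s in combinations(items, r)]

# O(n!): all orderings of the n items.
all_orderings = list(permutations(items))

print(len(size_k_subsets), len(all_subsets), len(all_orderings))  # 6 16 24
```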
All you need to appreciate is the relative order of these growth rates (here from largest to smallest):
$$ n! \gg 2^n \gg n^3 \gg n^2 \gg n\log n \gg n \gg \log n \gg 1 $$
There are other (more esoteric) functions that arise in the study of algorithms. However, the functions above account for describing the runtime of basic algorithms (those used in this course and those you may come by during a coding interview).
(Illustration: a plot comparing these growth rates, taken from bigocheatsheet.com.)
Resources
This article contains an explanation of Big-Oh notation, common running times, and a good-looking plot of those runtimes!