Least You Need to Know: Big-O, Theta, and Omega
Asymptotic notation compares growth rates. **Big-O** gives an eventual upper bound, **Omega** gives an eventual lower bound, and **Theta** means the growth is tightly sandwiched between matching upper and lower bounds.
The least you need to know
- Ignore constant factors and lower-order terms when comparing asymptotic growth.
- Big-O is an eventual upper bound, not an exact runtime for every input size.
- Theta means both an upper and lower bound of the same rate.
- Common growth rates worth recognizing are `1`, `log n`, `n`, `n log n`, `n^2`, and `2^n`.
- Average-case hash-table lookup is often modeled as `O(1)` even though worst-case behavior can be worse.
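The growth-rate bullet can be made concrete with a quick table. This is a minimal sketch; the dictionary, function names, and sample sizes are illustrative choices, not anything prescribed by the notes.

```python
import math

# The growth rates from the bullet above, as callables.
# Names and sample sizes are illustrative choices.
GROWTH_RATES = {
    "1": lambda n: 1,
    "log n": lambda n: math.log2(n),
    "n": lambda n: n,
    "n log n": lambda n: n * math.log2(n),
    "n^2": lambda n: n ** 2,
    "2^n": lambda n: 2 ** n,
}

def tabulate(sizes):
    """Map each rate name to its values at the given input sizes."""
    return {name: [f(n) for n in sizes] for name, f in GROWTH_RATES.items()}

table = tabulate([10, 20, 40])
# Even by n = 40, 2^n (~1.1e12) has left n^2 (1600) far behind,
# which is why exponential algorithms stop scaling almost immediately.
```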
Key notation
- `O(f(n))`: eventual upper bound
- `Ω(f(n))`: eventual lower bound
- `Θ(f(n))`: tight bound
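The upper-bound definition (there exist constants `c > 0` and `n0` with `f(n) <= c*g(n)` for all `n >= n0`) can be probed numerically. A sketch, with the caveat that a finite scan is only evidence for a bound, never a proof; the constants `c = 4`, `n0 = 8` below are one valid choice for the example function, not canonical values.

```python
def bounded_above(f, g, c, n0, n_max=10_000):
    """Check f(n) <= c * g(n) for every n in [n0, n_max].

    Finite evidence for f being O(g), not a proof.
    """
    return all(f(n) <= c * g(n) for n in range(n0, n_max + 1))

f = lambda n: 3 * n * n + 7 * n + 4   # example function
g = lambda n: n * n

bounded_above(f, g, c=4, n0=8)   # holds: 4n^2 is an upper bound from n = 8 on
bounded_above(f, g, c=3, n0=8)   # fails: the lower-order terms exceed the slack
```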
Tiny worked example
- For `3n^2 + 7n + 4`, the `n^2` term dominates for large `n`.
- So the function is `Θ(n^2)`.
- That automatically means it is also `O(n^2)` and `Ω(n^2)`.
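The sandwich in the bullets above can be checked directly. The constants `3` and `4` and the threshold `n >= 8` are one workable choice, not unique values.

```python
f = lambda n: 3 * n * n + 7 * n + 4

# Theta(n^2) sandwich: 3n^2 <= f(n) <= 4n^2 for every n >= 8.
assert all(3 * n * n <= f(n) <= 4 * n * n for n in range(8, 5_000))

def lower_order_fraction(n):
    """Share of f(n) contributed by the non-dominant terms 7n + 4."""
    return (7 * n + 4) / f(n)

# This share shrinks toward 0 as n grows, which is why only n^2 survives.
```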
Common mistakes
- Students often think Big-O means exact runtime rather than an upper bound class.
- Students often keep irrelevant constants and lower-order terms.
- Students often confuse worst-case, average-case, and asymptotic notation.
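One way to see why dropped constants are safe asymptotically but not for small inputs: a constant-heavy linear function beats `n^2` early on and loses past a fixed crossover. The constant `100` here is an arbitrary illustrative choice.

```python
def crossover(c):
    """Smallest n >= 1 where n^2 overtakes c*n (which is simply n = c + 1)."""
    n = 1
    while n * n <= c * n:
        n += 1
    return n

crossover(100)  # n^2 first exceeds 100n at n = 101
```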
How to recognize this kind of problem
- Ask which term dominates when `n` becomes very large.
- If two bounds match up to constant factors, you usually want Theta.
- For data-structure questions, decide whether the question is about average-case or worst-case behavior.
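The "which term dominates" question can also be probed numerically: double `n` and watch whether the ratio of the two functions blows up. A heuristic sketch only; the threshold and cutoff are arbitrary choices, and finite probing is evidence, not a proof.

```python
import math

def dominates(f, g, threshold=1e6):
    """Heuristic probe: does f(n)/g(n) exceed `threshold` before n
    gets astronomically large? Finite evidence, not a proof."""
    n = 2
    while n < 2 ** 60:
        if g(n) != 0 and f(n) / g(n) > threshold:
            return True
        n *= 2
    return False

dominates(lambda n: n * n, lambda n: n * math.log2(n))   # n^2 dwarfs n log n
dominates(lambda n: 100 * n, lambda n: n * n)            # linear never dominates quadratic
```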