O Notation

Big O notation is the usual way of measuring how fast or slow an algorithm is. You've probably seen it before, especially in editorials of NOI.PH problems. But what is it really, and how does it work?

This page contains a simplified description of O notation.

This document contains a much better definition of O notation, but it might be a bit technical.

Read both. The first link will get you a good understanding of why O notation is used so much in computer science.

One thing I would like to point out is that materials discussing big O often abuse notation. When we write $O(g(x))$, we actually mean a set of functions related to $g(x)$: the set of functions that $g(x)$ is "greater than or equal to", up to a constant factor. So writing $x^2 + x = O(x^2)$ equates a single function with a set of functions, which kind of makes no sense. What people really mean is that $x^2 + x$ is an element of the set $O(x^2)$, i.e. $x^2 + x \in O(x^2)$.
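To make this concrete, here is one common way of stating the formal definition (the linked document may phrase it slightly differently):

$$f(x) \in O(g(x)) \iff \exists\, C > 0,\ \exists\, x_0 \text{ such that } |f(x)| \le C \cdot |g(x)| \text{ for all } x \ge x_0.$$

For example, $x^2 + x \in O(x^2)$ because taking $C = 2$ and $x_0 = 1$ works: $x^2 + x \le 2x^2$ whenever $x \ge 1$. Note that the constant $C$ is allowed to be anything, which is why constant factors "disappear" in big O.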

Exercises