Big O notation
In computational complexity theory, big O notation is often used to describe how the size of the input data affects an algorithm's usage of computational resources (usually running time or memory). It is also called Big Oh notation, Landau notation, Bachmann-Landau notation, and asymptotic notation. Big O notation is also used in many other scientific and mathematical fields to provide similar estimations.
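To make this concrete, here is a minimal sketch (not from the article) that counts comparisons in a simple linear search as the "computational resource"; in the worst case the count grows in direct proportion to the input size n, which is what writing O(n) expresses.

```python
def linear_search(items, target):
    """Return (index, comparisons) for target in items; -1 if absent.

    The comparison count stands in for running time: big O describes
    how this cost grows as the input size n grows.
    """
    comparisons = 0
    for i, item in enumerate(items):
        comparisons += 1
        if item == target:
            return i, comparisons
    return -1, comparisons

# Worst case: the target is absent, so every one of the n elements is
# compared exactly once -- the cost is n, hence O(n).
for n in (10, 100, 1000):
    _, cost = linear_search(list(range(n)), -1)
    print(n, cost)
```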
The symbol O is used to describe an asymptotic upper bound for the magnitude of a function in terms of another, usually simpler, function. The related symbols o, Ω, ω, and Θ denote various other upper, lower, and tight bounds. Informally, O notation is often used to describe an asymptotic tight bound, but tight bounds are more formally and precisely denoted by the Θ (capital theta) symbol as described below. The distinction between upper and tight bounds is useful, and sometimes critical; most computer scientists would urge distinguishing the usage of O and Θ. In some other fields, however, the Θ notation is not commonly known.
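The formal definition behind this notation can be checked numerically. The following sketch (an illustration, not part of the article) uses the hypothetical function f(n) = 3n² + 10n + 4: f is O(g) if there exist constants c > 0 and n₀ such that f(n) ≤ c·g(n) for all n ≥ n₀, and the bound is tight (Θ) when f is also bounded below by a constant multiple of g.

```python
def f(n):
    # Hypothetical example function: 3n^2 + 10n + 4.
    return 3 * n * n + 10 * n + 4

def g(n):
    # Candidate bounding function: n^2.
    return n * n

# Upper bound: with c = 4 and n0 = 11, f(n) <= c * g(n) for all n >= n0,
# so f is O(n^2).  (Solving 3n^2 + 10n + 4 <= 4n^2 gives n >= 11.)
c, n0 = 4, 11
assert all(f(n) <= c * g(n) for n in range(n0, 10_000))

# Lower bound: f(n) >= 3 * g(n) for all n >= 1, so f is also Omega(n^2);
# together the two bounds make the estimate tight: f is Theta(n^2).
assert all(f(n) >= 3 * g(n) for n in range(1, 10_000))
print("f is Theta(n^2)")
```

Note that O alone would also permit the weaker claim that f is O(n³); only the matching lower bound justifies the tight Θ(n²) statement, which is why the O/Θ distinction matters.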
Reference:
http://en.wikipedia.org/wiki/Big_o_notation