# Asymptotic notations

I’ve recently started reading *Introduction to Algorithms* by Thomas H. Cormen, which covers algorithms and their analysis in detail. So today I would like to share a small discovery of mine and tell you about asymptotic notation.


What does asymptotic mean? 🤔

Asymptotics describe how a function behaves as its argument tends to infinity. Strictly speaking, asymptotic notation applies to functions; but since we can express an algorithm’s running time as a function of its input size, we can describe that running time with asymptotic notation as well.

There are several basic notations:

– Θ (Theta) notation

– O (Big-O) notation

– Ω (Big-Omega) notation

– o (Small-o) notation

– ω (Small-omega) notation

📍 Let’s start with the **Θ notation**. It gives an asymptotically tight bound on a function: Θ(g(n)) denotes the set of functions that satisfy the condition in Fig. 1.
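For readers who can’t see the figure, the condition (in the standard CLRS set-builder form) can be written out as:

```latex
\Theta(g(n)) = \{\, f(n) : \exists\, c_1, c_2, n_0 > 0 \text{ such that }
                0 \le c_1 g(n) \le f(n) \le c_2 g(n) \text{ for all } n \ge n_0 \,\}
```

In words: from some point n0 onward, f(n) is sandwiched between two constant multiples of g(n).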

For example, suppose we have an algorithm that always runs in exactly (1/2)n² - 3n steps, where n is the number of elements. We can say that the algorithm’s running time is Θ(n²).

But why? 🤔


Let’s prove it by substituting into the definition: c1·n² <= (1/2)n² - 3n <= c2·n², for all n >= n0.

Divide all parts of the inequality by n²: c1 <= 1/2 - 3/n <= c2.

The right inequality holds for any n >= 1 with c2 >= 1/2, and the left inequality holds for any n >= 7 with c1 <= 1/14.

Thus there are constants for which the inequality is true: c1 = 1/14, c2 = 1/2, n0 = 7; therefore (1/2)n² - 3n ∈ Θ(n²).
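As a quick sanity check, a small script can verify the inequality for these constants over a range of n (a sketch in Python; the function name `f` is mine):

```python
# Verify c1*n^2 <= (1/2)*n^2 - 3n <= c2*n^2 for all tested n >= n0,
# with the constants found above: c1 = 1/14, c2 = 1/2, n0 = 7.
def f(n):
    return 0.5 * n**2 - 3 * n

c1, c2, n0 = 1 / 14, 1 / 2, 7

assert all(c1 * n**2 <= f(n) <= c2 * n**2 for n in range(n0, 10_000))
print("f(n) is sandwiched between c1*n^2 and c2*n^2 for 7 <= n < 10_000")
```

Note that at n = 7 the left bound is tight: f(7) = 3.5 = 49/14, which is exactly why n0 = 7 and c1 = 1/14 appear together.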


🔁 And vice versa, let’s prove that 6n³ ∉ Θ(n²). For the function to belong to Θ(n²), there would have to be a constant c2 such that 6n³ <= c2·n² for all n >= n0.

However, dividing by n² gives 6n <= c2. Since c2 is a fixed constant while n can grow without bound, no such constant exists: the inequality fails for all sufficiently large n, and 6n³ ∉ Θ(n²).
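Numerically the failure is easy to see (a Python sketch): the ratio 6n³/n² = 6n outgrows any candidate constant c2.

```python
# 6n^3 / n^2 = 6n grows without bound, so no single constant c2
# can satisfy 6n^3 <= c2 * n^2 for all large n.
for n in (10, 1_000, 100_000):
    print(6 * n**3 / n**2)
# Prints 60.0, 6000.0, 600000.0 -- the "constant" would have to keep growing
```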

Okay, so we have proven that (1/2)n² - 3n = Θ(n²), but why exactly Θ(n²), and not Θ((1/2)n² - 3n)? 🤔

📌 NOTE: In asymptotic estimates, **low-order terms and constant factors can be neglected**, since for large n they become insignificant and only change the constants c1, c2, and n0.
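A quick numeric illustration of why the -3n term doesn’t matter (again a Python sketch with my own `f`): the ratio of (1/2)n² - 3n to n² settles at the constant 1/2.

```python
# The low-order term -3n becomes negligible: f(n)/n^2 -> 1/2 as n grows.
def f(n):
    return 0.5 * n**2 - 3 * n

for n in (10, 100, 1_000, 100_000):
    print(n, f(n) / n**2)
# The ratios climb toward 0.5: 0.2, 0.47, 0.497, 0.49997
```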

Let’s move on! 🤘🏼 **O notation** gives only an asymptotic upper bound on a function (Fig. 3). This notation is used most often, because we are usually most interested in how long an algorithm takes in the worst case.
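In the same set-builder style as Θ, the condition from Fig. 3 can be written as:

```latex
O(g(n)) = \{\, f(n) : \exists\, c, n_0 > 0 \text{ such that }
           0 \le f(n) \le c\, g(n) \text{ for all } n \ge n_0 \,\}
```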

For example, 2n² = O(n²). Now consider another example: 2n = O(n²). This statement is certainly true, but the bound is loose and may give a misleading picture of how fast the function actually grows.

In such cases it is better to use **o notation**. The difference is that an O bound may be asymptotically tight, while an o bound must be strict: the function has to grow strictly slower than the bound. Thus 2n = o(n²), but 2n² ≠ o(n²) 🤯
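One way to see the difference (Python sketch): f(n) = o(g(n)) means the ratio f(n)/g(n) tends to 0, while for O(g(n)) the ratio only has to stay bounded.

```python
# 2n / n^2 -> 0, so 2n = o(n^2).
# 2n^2 / n^2 stays at 2, so 2n^2 is O(n^2) but NOT o(n^2).
for n in (10, 1_000, 100_000):
    print(2 * n / n**2, 2 * n**2 / n**2)
# First column shrinks toward 0; second column is always 2.0
```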


And the last one. **Ω notation** gives only an asymptotic lower bound on a function (Fig. 4). It is used to describe an algorithm’s best-case running time. The Ω/ω pair mirrors O/o: an Ω bound may be asymptotically tight, while an ω bound must be strict.
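And the matching set-builder form of the condition in Fig. 4:

```latex
\Omega(g(n)) = \{\, f(n) : \exists\, c, n_0 > 0 \text{ such that }
                0 \le c\, g(n) \le f(n) \text{ for all } n \ge n_0 \,\}
```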

📌 Now that we know all the notations, I want to go back to Θ notation and note that a function belongs to Θ(g(n)) exactly when it belongs to both O(g(n)) and Ω(g(n)). For example, if some algorithm runs in f(n) time and f(n) = O(n²) and f(n) = Ω(n²), then f(n) = Θ(n²).


Now you know what actually lies behind asymptotic notation!