# Asymptotic Analysis: Big-O Notation and More

In this tutorial, you will learn what asymptotic notations are, including Big-O notation, Theta notation, and Omega notation.

When analysing the complexity of an algorithm in practice, we can rarely give a single definite number that characterizes the time and space it requires. Instead, we express it using standard notations, known as asymptotic notations.

When we analyse an algorithm, we usually derive a formula representing the amount of time needed for execution: the time the computer takes to run the algorithm's lines of code, the number of memory accesses, the number of comparisons, the memory occupied by temporary variables, and so on. This formula often contains insignificant details that tell us little about the actual growth of the running time.

The efficiency of an algorithm depends on the amount of time, storage, and other resources required to execute it. Efficiency is measured with the help of asymptotic notations.

An algorithm may not perform the same way for different kinds of input. As the input size increases, its performance changes.

The study of how an algorithm's performance changes as the input size grows is called asymptotic analysis.
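To make this concrete, here is a minimal sketch (our own illustration, not code from the article) that counts the basic operations performed by a simple summing loop as the input size n grows:

```python
def count_operations(n):
    """Return the number of additions performed when summing 0..n-1."""
    ops = 0
    total = 0
    for i in range(n):
        total += i   # one basic operation per iteration
        ops += 1
    return ops

# The operation count grows linearly with the input size.
for n in (10, 100, 1000):
    print(n, count_operations(n))
```

Asymptotic analysis abstracts away the constant cost of each operation and keeps only this growth pattern.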

## Asymptotic Notations

Asymptotic notations are mathematical notations used to describe the running time of an algorithm as the input tends towards a particular, or limiting, value.

For instance, in bubble sort, when the input array is already sorted, the time taken by the algorithm is linear; this is the best case.

However, when the input array is in reverse order, the algorithm takes the maximum (quadratic) time to sort the elements; this is the worst case.

When the input array is neither sorted nor in reverse order, the running time falls somewhere in between. These running times are denoted using asymptotic notations.
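The bubble-sort cases above can be verified by counting comparisons. The sketch below (our own illustration, not code from the article) uses an early-exit flag so that an already-sorted array finishes after a single linear pass, while a reversed array forces all n(n−1)/2 comparisons:

```python
def bubble_sort(arr):
    """Bubble sort with early exit; returns (sorted list, comparison count)."""
    a = list(arr)
    comparisons = 0
    n = len(a)
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):
            comparisons += 1
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swapped = True
        if not swapped:          # no swaps: array already sorted, stop early
            break
    return a, comparisons

_, best = bubble_sort(list(range(8)))         # already sorted: 7 comparisons
_, worst = bubble_sort(list(range(7, -1, -1)))  # reversed: 28 comparisons
print(best, worst)
```

For n = 8, the best case makes n − 1 = 7 comparisons (linear), while the worst case makes n(n − 1)/2 = 28 (quadratic).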

There are mainly three asymptotic notations:

• Big O notation
• Omega notation
• Theta notation

## Big-O Notation (O-notation)

Big-O notation represents the upper bound of the running time of an algorithm. Thus, it gives the worst-case complexity of an algorithm.

```
O(g(n)) = { f(n): there exist positive constants c and n0
            such that 0 ≤ f(n) ≤ cg(n) for all n ≥ n0 }
```

The above expression says that a function f(n) belongs to the set O(g(n)) if there exists a positive constant c such that f(n) lies between 0 and cg(n) for sufficiently large n.

For any value of n ≥ n0, the running time of the algorithm does not exceed the bound given by O(g(n)).

Since it gives the worst-case running time of an algorithm, Big-O is the most widely used of the asymptotic notations.
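The definition can be checked numerically. In this sketch (our own example, not from the article), f(n) = 2n + 3 stands in for a running-time formula, and the witnesses c = 5 and n0 = 1 are chosen by hand:

```python
def f(n):
    return 2 * n + 3   # hypothetical running-time function

def g(n):
    return n           # candidate bound: we claim f(n) is in O(n)

c, n0 = 5, 1           # witnesses: 2n + 3 <= 5n whenever n >= 1
assert all(0 <= f(n) <= c * g(n) for n in range(n0, 10_000))
print("f(n) = 2n + 3 is O(n) with c = 5, n0 = 1")
```

A finite check like this is not a proof, but 2n + 3 ≤ 5n holds for all n ≥ 1 because it is equivalent to 3 ≤ 3n.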

## Omega Notation (Ω-notation)

Omega notation represents the lower bound of the running time of an algorithm. Thus, it gives the best-case complexity of an algorithm.

```
Ω(g(n)) = { f(n): there exist positive constants c and n0
            such that 0 ≤ cg(n) ≤ f(n) for all n ≥ n0 }
```

The above expression says that a function f(n) belongs to the set Ω(g(n)) if there exists a positive constant c such that f(n) lies above cg(n) for sufficiently large n.

For any value of n ≥ n0, the minimum time needed by the algorithm is given by Ω(g(n)).
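The lower-bound definition can be checked the same way. In this sketch (our own example), the same hypothetical f(n) = 2n + 3 is bounded below by cg(n) = 2n:

```python
def f(n):
    return 2 * n + 3   # hypothetical running-time function

def g(n):
    return n           # candidate bound: we claim f(n) is in Omega(n)

c, n0 = 2, 1           # witnesses: 2n <= 2n + 3 for every n >= 1
assert all(0 <= c * g(n) <= f(n) for n in range(n0, 10_000))
print("f(n) = 2n + 3 is Omega(n) with c = 2, n0 = 1")
```

Note that the inequality runs the other way round from Big-O: here cg(n) must stay below f(n).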

## Theta Notation (Θ-notation)

Theta notation encloses the function from both above and below. Since it represents both the upper and the lower bound of the running time of an algorithm, it is used for analysing the average-case complexity of an algorithm.

For a function g(n), Θ(g(n)) is given by the relation:

```
Θ(g(n)) = { f(n): there exist positive constants c1, c2 and n0
            such that 0 ≤ c1g(n) ≤ f(n) ≤ c2g(n) for all n ≥ n0 }
```

The above expression says that a function f(n) belongs to the set Θ(g(n)) if there exist positive constants c1 and c2 such that f(n) can be sandwiched between c1g(n) and c2g(n) for sufficiently large n.

If a function f(n) lies between c1g(n) and c2g(n) for all n ≥ n0, then g(n) is said to be an asymptotically tight bound for f(n).
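Combining the two previous checks gives the sandwich. In this sketch (our own example), f(n) = 2n + 3 is pinned between c1·n and c2·n, so n is an asymptotically tight bound for it:

```python
def f(n):
    return 2 * n + 3   # hypothetical running-time function

def g(n):
    return n           # candidate bound: we claim f(n) is in Theta(n)

c1, c2, n0 = 2, 5, 1   # witnesses: 2n <= 2n + 3 <= 5n for every n >= 1
assert all(0 <= c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 10_000))
print("f(n) = 2n + 3 is Theta(n) with c1 = 2, c2 = 5, n0 = 1")
```

Because a single pair of constants works for the lower and upper bound simultaneously, f(n) satisfies both the Ω and O definitions, which is exactly what Θ requires.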

Thanks for reading! We hope you found this tutorial helpful, and we would love to hear your feedback in the comments section below.