## Asymptotic notations

### Topics Covered

- Definition of asymptotic notations
- Types of asymptotic notations:
  - Big-Oh notation
  - Theta notation
  - Omega notation

#### Definition:

- Asymptotic notations are mathematical notations used to describe the running time of an algorithm as the input tends towards a particular or limiting value.
- For example, in bubble sort, when the input array is already sorted, the time taken by the algorithm is linear, i.e. the best case.
- When the input array is in reverse order, the algorithm takes the maximum (quadratic) time, i.e. the worst case.
- When the input array is neither sorted nor in reverse order, the time taken lies in between, i.e. the average case. These running times are denoted with the help of asymptotic notations.
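The bubble-sort cases above can be sketched with a small comparison counter. This is a minimal illustration, assuming the early-exit variant of bubble sort (the variant that stops after a pass with no swaps, which is what makes the sorted input linear):

```python
def bubble_sort(arr):
    """Bubble sort with early exit; returns the number of comparisons made."""
    a = list(arr)
    comparisons = 0
    for i in range(len(a) - 1):
        swapped = False
        for j in range(len(a) - 1 - i):
            comparisons += 1
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swapped = True
        if not swapped:  # no swaps: array already sorted, stop early (best case)
            break
    return comparisons

n = 100
best = bubble_sort(list(range(n)))          # sorted input: one pass, n-1 comparisons
worst = bubble_sort(list(range(n, 0, -1)))  # reversed input: n(n-1)/2 comparisons
print(best, worst)  # → 99 4950
```

For n = 100, the sorted input needs only n − 1 = 99 comparisons (linear), while the reversed input needs n(n − 1)/2 = 4950 (quadratic), matching the best and worst cases described above.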

- Asymptotic notations are mainly divided into three types:
  - Big-Oh notation
  - Theta notation
  - Omega notation

## Big-Oh notation

Big-Oh notation represents the upper bound of the running time of an algorithm. That’s why it gives the worst-case complexity of an algorithm.

f(n) = O(g(n)) if and only if there exist positive constants c and n0 such that 0 ≤ f(n) ≤ cg(n) for all n ≥ n0.

- The above expression can be read as: a function f(n) belongs to the set O(g(n)) if there exists a positive constant c such that f(n) lies between 0 and cg(n) for all sufficiently large n.
- For sufficiently large n, the running time of the algorithm does not exceed the bound given by O(g(n)).
- It is widely used to analyze an algorithm where we are interested in the worst-case scenario.
- Examples:
  - 3n + 2 = O(n) /* 3n + 2 ≤ 4n for n ≥ 2 */
  - 3n + 3 = O(n) /* 3n + 3 ≤ 4n for n ≥ 3 */
  - 100n + 6 = O(n) /* 100n + 6 ≤ 101n for n ≥ 10 */
  - 10n² + 4n + 2 = O(n²) /* 10n² + 4n + 2 ≤ 11n² for n ≥ 5 */
  - 6·2ⁿ + n² = O(2ⁿ) /* 6·2ⁿ + n² ≤ 7·2ⁿ for n ≥ 4 */
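The witness constants (c, n0) in the comments above can be checked numerically over a range of n. A minimal sketch (the helper name `holds` is ours, chosen for illustration):

```python
# Check that f(n) <= c * g(n) for all n from n0 up to some limit.
def holds(f, g, c, n0, upto=1000):
    return all(f(n) <= c * g(n) for n in range(n0, upto))

# 10n^2 + 4n + 2 <= 11n^2 for n >= 5
print(holds(lambda n: 10*n**2 + 4*n + 2, lambda n: n**2, c=11, n0=5))        # → True
# 6*2^n + n^2 <= 7*2^n for n >= 4 (small range; 2^n grows fast)
print(holds(lambda n: 6*2**n + n**2, lambda n: 2**n, c=7, n0=4, upto=60))    # → True
```

A finite check like this is not a proof, of course, but it is a quick sanity test that the chosen constants are plausible.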

##### Graphical Representation

## Theta Notation (Θ-notation)

Theta notation encloses the function from above and below, i.e. it represents both the upper and the lower bound of the running time of an algorithm. It is used for analyzing the average-case complexity of an algorithm.

f(n) = Θ(g(n)) {there exist positive constants c1, c2 and n0 such that 0 ≤ c1g(n) ≤ f(n) ≤ c2g(n) for all n ≥ n0}

- The above expression can be described as: a function f(n) belongs to the set Θ(g(n)) if there exist positive constants c1 and c2 such that it can be sandwiched between c1g(n) and c2g(n), for sufficiently large n.
- If a function f(n) lies anywhere between c1g(n) and c2g(n) for all n ≥ n0, then Θ(g(n)) is said to be an asymptotically tight bound for f(n).
- Example:
- Suppose we calculate that a running time is 6n² + 100n + 300 microseconds. Or maybe it's milliseconds; when we use big-Θ notation, we don't say which. We drop the constant factor 6 and the low-order terms 100n + 300, and simply say that the running time is Θ(n²).
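The sandwich in the definition can be checked numerically for this example. The witnesses below (c1 = 6, c2 = 7, n0 = 103) are one valid choice we picked for illustration, not the only one:

```python
# f(n) = 6n^2 + 100n + 300 is Θ(n^2): sandwich it between c1*n^2 and c2*n^2.
f = lambda n: 6*n**2 + 100*n + 300
g = lambda n: n**2

# c1 = 6 works for all n; c2 = 7 needs n^2 >= 100n + 300, which holds for n >= 103.
c1, c2, n0 = 6, 7, 103
print(all(c1*g(n) <= f(n) <= c2*g(n) for n in range(n0, 5000)))  # → True
```

Note that a smaller n0 would fail here: at n = 100, f(n) = 70300 > 7·100² = 70000, so the upper constant c2 = 7 only kicks in for sufficiently large n, exactly as the definition requires.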

##### Graphical Representation

## Omega Notation (Ω-notation)

It represents the lower bound of the running time of an algorithm. That's why it provides the best-case complexity of an algorithm.

f(n)=Ω(g(n)) {there exist positive constants c and n0 such that 0 ≤ cg(n) ≤ f(n) for all n ≥ n0 }.

- The above expression can be described as: a function f(n) belongs to the set Ω(g(n)) if there exists a positive constant c such that it lies above cg(n), for sufficiently large n.
- For sufficiently large n, the minimum time required by the algorithm is given by Ω(g(n)).
- Examples:
  - F(n) = 16 ⇒ 15·1 ≤ F(n), with c = 15 and n0 = 0, so F(n) = Ω(1)
  - F(n) = 3n + 5 ⇒ 3n ≤ 3n + 5 for all n, with c = 3, so F(n) = Ω(n)
  - F(n) = 27n² + 16n ⇒ 27n² ≤ 27n² + 16n for all n, with c = 27, so F(n) = Ω(n²)
  - F(n) = 10n² + 7 ⇒ 10n² ≤ 10n² + 7 for all n, with c = 10, so F(n) = Ω(n²)
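The lower-bound constants above can be verified the same way as the Big-O witnesses. A minimal sketch (the helper name `omega_holds` is ours, chosen for illustration):

```python
# Check that c * g(n) <= f(n) for all n from n0 up to some limit.
def omega_holds(f, g, c, n0, upto=1000):
    return all(c * g(n) <= f(n) for n in range(max(n0, 1), upto))

print(omega_holds(lambda n: 3*n + 5,        lambda n: n,    c=3,  n0=1))  # → True: 3n+5 = Ω(n)
print(omega_holds(lambda n: 27*n**2 + 16*n, lambda n: n**2, c=27, n0=1))  # → True: Ω(n²)
print(omega_holds(lambda n: 10*n**2 + 7,    lambda n: n**2, c=10, n0=1))  # → True: Ω(n²)
```

As with the Big-O check, a finite scan is a sanity test, not a proof; here the inequalities are easy to see algebraically since dropping a non-negative term can only decrease the left-hand side.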