Asymptotic Analysis of Algorithms

Asymptotic analysis is a way to evaluate the performance of an algorithm. It describes how the time (or space) taken by an algorithm grows as the input size increases.

We ignore constant factors in asymptotic analysis. Consider two algorithms 'A' and 'B': algorithm 'A' takes 100n time and algorithm 'B' takes 20n time to complete, n being the input size. Both algorithms are considered asymptotically the same, because once constants are ignored, the order of growth of each is 'n'.

There are three different cases to analyze in an algorithm:

  1. Best Case: minimum time required by the algorithm to execute.
  2. Average Case: average time required by the algorithm to execute.
  3. Worst Case: maximum time required by the algorithm to execute.

Asymptotic Notations

Asymptotic notations are mathematical tools to represent the complexity of an algorithm. Following are the different types of asymptotic notations:

(Big) Theta notation(Θ):

The Theta notation bounds a function from above and below.

Example: Consider an algorithm that takes linear time (n) in the best case and quadratic time (n²) in the worst case. To represent this in Theta notation we make the following two statements:

  1. The worst case time complexity of the algorithm is Θ(n²).
  2. The best case time complexity of the algorithm is Θ(n).

Big O notation(O):

The Big O notation defines an upper bound of an algorithm.

Example: Consider the same algorithm that takes linear time (n) in the best case and quadratic time (n²) in the worst case, as described in the previous example. To represent it in Big O notation we say:

The time complexity of the algorithm is O(n²).

(Big) Omega notation(Ω):

The Omega notation defines a lower bound of an algorithm.

Example: Consider the same algorithm described in the previous example. To represent it in Omega notation we say:

The time complexity of the algorithm is Ω(n).

Little O notation(o):

The Little O notation is similar to the Big O notation; while Big O notation defines an inclusive upper bound, little o is a strict upper bound. (If we consider Big O as ≤, then little o is <.)

Example: Consider an algorithm having a time complexity of O(n). Now let's say we come up with a faster solution that runs in, say, O(n/logloglog n). We can then say the time complexity of the algorithm is o(n): this tells us the algorithm is faster than linear, but not exactly how much faster.

Little Omega notation(ω):

As little o notation defines a strict upper bound, little omega defines a strict lower bound.


Analysing Programs

O(1)

Time complexity of a program is said to be O(1) if it doesn't contain any loop or recursion. Also note that if a loop runs for a constant number of times, its time complexity is also considered to be O(1).

O(n)

Time complexity of a program is said to be O(n) if the loop control variable is incremented/decremented by a constant amount, so the loop body runs about n/c times.

for(int i=0; i<n; i=i+c){
    ...
}
Here c is a constant.

O(nᶜ)

Time complexity of a program is said to be O(nᶜ) if the program has nested loops. Here c is a constant, which denotes the nesting depth. The time complexity equals the number of times the innermost statement is executed.

for(int i=0; i<n; i++){
    for(int j=0; j<n; j++){
        ...
    }
}
Time complexity of the above program would be O(n²).

O(logN)

Time complexity of a program is said to be O(logN) if the loop control variable is multiplied/divided by a constant amount.

for(int i=1; i<n; i=i*c){  // (or: for(int i=n; i>1; i=i/c))
      ...
}
Here c is a constant greater than 1. Note that i starts at 1, not 0: multiplying 0 by c would leave it at 0 forever and the loop would never terminate.