Asymptotic Notations

When we analyze the complexity of an algorithm in terms of time and space, we can rarely give an exact number for the time or the space it requires. Instead, we express the requirement using standard notations, known as asymptotic notations.

When we analyze an algorithm, we generally derive a formula for the amount of work it does: the time the computer needs to run its lines of code, the number of memory accesses, the number of comparisons, the temporary variables occupying memory space, and so on. This formula usually contains constant factors and lower-order terms that say nothing useful about how the running time grows with the input size, so asymptotic notation discards them and keeps only the dominant term.
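For example, a small function that sums a list of n numbers performs roughly 2n + 2 elementary operations, where the exact count depends on what we choose to count. Here is a minimal Python sketch, with a deliberately rough, illustrative operation count in the comments:

def sum_list(values):
    total = 0            # 1 assignment
    for v in values:     # loop body runs n times
        total += v       # 1 addition and 1 assignment per iteration
    return total         # 1 return

# Roughly 2n + 2 operations for n elements. The constants depend on how we
# count and on the machine, so asymptotic notation discards them and records
# only that the work grows linearly with n, i.e. O(n).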
 

Types of Asymptotic Notations

We use three asymptotic notations to represent the growth of an algorithm's running time as the input size increases:
  1. Theta (Θ)
  2. Big Oh (O)
  3. Omega (Ω)

Theta Notation, Θ

The notation Θ(n) is the formal way to express both the lower bound and the upper bound of an algorithm's running time. It therefore gives a tight bound: the running time grows at the same rate as the given function, up to constant factors.
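Formally, using the standard textbook definition (stated here for completeness): f(n) = Θ(g(n)) if there exist positive constants c1, c2 and n0 such that c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0. For example, 3n + 2 = Θ(n), since 3n ≤ 3n + 2 ≤ 4n holds for every n ≥ 2 (take c1 = 3, c2 = 4, n0 = 2).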

 

Big Oh Notation, Ο

The notation Ο(n) is the formal way to express the upper bound of an algorithm's running time. It is commonly used to describe the worst case: the longest time an algorithm can possibly take to complete on an input of a given size.
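Formally, again using the standard textbook definition: f(n) = O(g(n)) if there exist positive constants c and n0 such that f(n) ≤ c·g(n) for all n ≥ n0. For example, 5n + 10 = O(n), since 5n + 10 ≤ 6n for every n ≥ 10.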

Omega Notation, Ω

The notation Ω(n) is the formal way to express the lower bound of an algorithm's running time. It is commonly used to describe the best case: the shortest time an algorithm can possibly take to complete on an input of a given size.
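To see how the two bounds differ on one concrete algorithm, here is a hypothetical linear search in Python (a standard example, not code taken from the text above). In the best case the target sits at the first position and the loop stops after one comparison; in the worst case the target is absent and all n elements are compared.

def linear_search(values, target):
    # Scan the list from left to right until the target is found.
    for index, value in enumerate(values):
        if value == target:
            return index   # best case: target is at index 0, one comparison
    return -1              # worst case: target absent, n comparisons

# Worst case: n comparisons, so the running time is O(n).
# Best case: a single comparison, which in the convention used above is written Ω(1).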


Common Asymptotic Notations

Following is a list of some common asymptotic notations −

constant: Ο(1)
logarithmic: Ο(log n)
linear: Ο(n)
n log n: Ο(n log n)
quadratic: Ο(n²)
cubic: Ο(n³)
polynomial: n^Ο(1)
exponential: 2^Ο(n)
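As a rough illustration of how a few of these classes arise in practice, the hypothetical Python functions below each exhibit one of the growth rates listed above (the names and examples are mine, chosen only to show the loop structure behind each class):

def get_first(values):                 # constant, Ο(1): no dependence on len(values)
    return values[0]

def contains(values, target):          # linear, Ο(n): one pass over the input
    return any(v == target for v in values)

def has_duplicate(values):             # quadratic, Ο(n²): nested passes over the input
    for i in range(len(values)):
        for j in range(i + 1, len(values)):
            if values[i] == values[j]:
                return True
    return False

def binary_search(sorted_values, target):   # logarithmic, Ο(log n): halves the range each step
    lo, hi = 0, len(sorted_values) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_values[mid] == target:
            return mid
        if sorted_values[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1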
 
