Time complexity describes how many steps an algorithm has to take in order to complete, as a function of the input size. To describe complexity, big O notation is used. For example, linear complexity is written in big O notation as O(N): if the amount of input data doubles, the time needed to process it also doubles. In programming, O(N) complexity corresponds to a single for loop. The complexity of a given algorithm can be equal to O(N), higher, or lower.

An example of lower complexity is logarithmic complexity, O(log(N)). An algorithm with logarithmic complexity is binary search: in each search step, the range of data still to be searched is cut in half.

The smallest complexity is constant complexity, O(1). Perhaps the best example of constant complexity is calculating the sum of the first N natural numbers. For example, the sum of the first 100 numbers is 5050. To calculate this sum we could run one for loop, and after 100 steps the sum would be calculated. That would be linear complexity. But we can also apply Gauss's method for summing the first N integers. Carl Friedrich Gauss discovered this method when he was only 10 years old. The method goes as follows: sum the largest and smallest numbers in the range, in this example 1 + 100, and obtain 101. Then sum the second smallest and second largest numbers, i.e. 2 + 99, and obtain 101 again. In the range from 1 to 100 there are 50 such pairs, so the overall sum is 50·101 = 5050. If we use this approach for an arbitrary range of N numbers, the overall sum is
S = N·(N + 1)/2
independently of N. This means we can calculate the sum of the first N integers in a single step, no matter how many numbers there are. What is interesting is that the N-th member of the Fibonacci sequence can be found in one step as well, according to Binet's formula:
F(N) = (φ^N − (1 − φ)^N) / √5,  where φ = (1 + √5)/2
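Both closed forms can be sketched in code; this is a minimal illustration, assuming C++ (the names `gaussSum` and `fibonacci` are illustrative, and Binet's formula relies on floating point, so it is only exact for moderate N, roughly up to 70 with doubles):

```cpp
#include <cassert>
#include <cmath>

// Sum of the first n natural numbers via Gauss's closed form:
// one arithmetic expression, independent of n, i.e. O(1).
long long gaussSum(long long n) {
    return n * (n + 1) / 2;
}

// N-th Fibonacci number via Binet's closed-form formula,
// rounded back from floating point (F(0) = 0, F(1) = 1).
long long fibonacci(int n) {
    const double sqrt5 = std::sqrt(5.0);
    const double phi = (1.0 + sqrt5) / 2.0;
    return static_cast<long long>(std::round(std::pow(phi, n) / sqrt5));
}
```

For example, `gaussSum(100)` returns 5050, matching the pairing argument above, and no loop is needed.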
A little bit more complex than linear complexity is O(N·log(N)). Algorithms with O(N·log(N)) complexity are merge sort and heap sort. Expressed in terms of for loops, O(N·log(N)) is obtained when one loop is nested inside another, with the outer index running from 0 to N-1 while the inner index doubles (or halves) on every step, so the inner loop runs only about log(N) times, for example

```cpp
for (int i = 0; i < N; i++)
{
    for (int j = 1; j < N; j *= 2) { /* ... */ }
}
```
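Merge sort, mentioned above as an O(N·log(N)) algorithm, can be sketched as follows (a minimal recursive version; the merge step here uses the standard library's `std::inplace_merge`, which merges two adjacent sorted ranges in linear time):

```cpp
#include <algorithm>
#include <cassert>
#include <vector>

// Merge sort on the half-open range [lo, hi): the range is halved
// about log(N) times, and each recursion level does O(N) merging
// work in total, giving O(N*log(N)) overall.
void mergeSort(std::vector<int>& a, int lo, int hi) {
    if (hi - lo < 2) return;              // 0 or 1 elements: already sorted
    int mid = lo + (hi - lo) / 2;
    mergeSort(a, lo, mid);                // sort left half
    mergeSort(a, mid, hi);                // sort right half
    std::inplace_merge(a.begin() + lo,    // merge the two sorted halves
                       a.begin() + mid,
                       a.begin() + hi);
}
```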

The next higher complexity is O(N^2), which corresponds to two nested for loops where both indexes run from 0 to N-1. An algorithm with O(N^2) complexity is bubble sort. There are also algorithms with exponential complexity, where even a small increase in input size makes the running time grow exponentially.
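Bubble sort's two nested loops can be sketched like this (`bubbleSort` is an illustrative name):

```cpp
#include <cassert>
#include <utility>
#include <vector>

// Bubble sort: two nested loops over the data give O(N^2) comparisons.
// After each outer pass the largest remaining element "bubbles" to the
// end, which is why the inner loop can stop at n - i - 1.
void bubbleSort(std::vector<int>& a) {
    const int n = static_cast<int>(a.size());
    for (int i = 0; i < n - 1; i++) {
        for (int j = 0; j < n - i - 1; j++) {
            if (a[j] > a[j + 1]) {
                std::swap(a[j], a[j + 1]);
            }
        }
    }
}
```

Even though the inner loop shortens on every pass, the total number of comparisons is N(N-1)/2, which still grows as O(N^2).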

**External links:**

- Time complexity and big O notation on bigocheatsheet
- Time complexity and big O notation on Wikipedia
- Time complexity and big O notation on Wisc.edu
