Big O Notation Part 2 - Time Complexity

Jreyes
3 min read · Jul 11, 2021


In my previous post, we went over Big O notation, and I provided a basic example of how it works along with an example of constant time. Now we are going to take a closer look at time complexity.

What is time complexity?

Time complexity is the amount of time an algorithm takes to run, expressed as a function of the length of its input. It measures the time taken to execute each statement of code in an algorithm, and we can estimate it by counting the number of operations the code performs. This time complexity is written as a function of the input size “N” using Big-O notation, where “N” indicates the input size and the O describes the worst-case growth rate.
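To make the counting concrete, here is a small sketch of my own (the function sumAll and the per-line counts are illustrative, not from the original post): we simply count how many statements execute as the input grows.

function sumAll(numbers) {
  let total = 0;                 // 1 operation
  for (let i = 0; i < numbers.length; i++) {
    total += numbers[i];         // runs once per element, so n times
  }
  return total;                  // 1 operation
}

// For an input of length n this performs roughly n + 2 operations,
// and that growth rate is what Big-O notation captures.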

Let's review the most common growth rates by name, with examples:

O(1) - Constant:

O(1) describes algorithms that take the same amount of time to compute regardless of the input size.

For instance, if a function takes the same time to process ten elements as it does to process 1 million items, then we say that it has a constant growth rate, or O(1).

To illustrate constant time, let's work with the example of checking whether a number is odd or even.

function isEvenOrOdd(n) {
  return n % 2 ? 'Odd' : 'Even';
}

console.log(isEvenOrOdd(10)); // => Even
console.log(isEvenOrOdd(10001)); // => Odd
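Another everyday constant-time operation, as a small sketch of my own (not from the original post): looking up an array element by its index takes the same time no matter how large the array is.

function getFirst(list) {
  return list[0]; // a single index lookup, regardless of list.length
}

console.log(getFirst([7, 8, 9]));                    // => 7
console.log(getFirst(new Array(1000000).fill(42)));  // => 42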

O(n) - Linear time:

Linear running time algorithms are widespread. These algorithms visit every element of the input. O(n) means that the algorithm takes proportionally longer to complete as the input grows.

To illustrate linear time, we will work with the example of finding the largest item in an unsorted array.

function findMax(n) {
  let max;
  let counter = 0;
  for (let i = 0; i < n.length; i++) {
    counter++; // count every iteration of the loop
    if (max === undefined || max < n[i]) {
      max = n[i]; // track the largest value seen so far
    }
  }
  console.log(`n: ${n.length}, counter: ${counter}`);
  return max;
}

findMax([100, 345, 9898]); // => n: 3, counter: 3
// => 9898

The findMax function checks every element of “n” to find the largest item. I added a counter to count how many times my block of code is executed.

Time complexity analysis: inside the function we have 2 operations (“max” and “counter”), a loop of size n, and 3 operations inside that loop. This gets us 3(n) + 2. Dropping the constants and keeping the most significant term, n, the function runs in O(n).
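As a quick sanity check (the inputs below are just illustrative), the counter printed by findMax grows in step with the input length, which is exactly the linear behavior O(n) describes:

findMax([5, 1]);                   // => n: 2, counter: 2
findMax([100, 345, 9898, 7, 42]);  // => n: 5, counter: 5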

O(n²) - Quadratic time:

A function with a quadratic time complexity has a growth rate of n². If the input is size 2, it will do 4 operations; if the input is size 8, it will do 64, and so on.

To illustrate quadratic time, let's work with verifying whether an array has a duplicate value.

function hasDuplicates(n) {
  const duplicates = [];
  let counter = 0;
  for (let outer = 0; outer < n.length; outer++) {
    for (let inner = 0; inner < n.length; inner++) {
      counter++; // count every pair of indices we compare
      if (outer === inner) continue; // skip comparing an element with itself
      if (n[outer] === n[inner]) {
        return true;
      }
    }
  }
  console.log(`n: ${n.length}, counter: ${counter}`);
  return false;
}

hasDuplicates([1, 2, 1, 3, 1, 5]); // => true

Time complexity analysis: inside the function hasDuplicates() we have 2 operations (“duplicates” and “counter”), a double loop of size n, so n², and 3 operations inside the double loop. This gets us 3n² + 2. Since this is an asymptotic analysis, we drop all the constants and keep only the most significant term, n². In Big-O notation, that is O(n²).
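As an illustrative check (my own example inputs), when the array contains no duplicates the inner loop runs for every pair of indices, so the counter grows with the square of the input size:

hasDuplicates([1, 2, 3, 4]);              // => n: 4, counter: 16, returns false
hasDuplicates([1, 2, 3, 4, 5, 6, 7, 8]);  // => n: 8, counter: 64, returns false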

This is all for today. I’ll see you in part 3.
