Big-O Notation Overview

Each program performs some number of operations: printing, calculating, iterating, etc. Usually, this number depends on the size of the input data. For example, to traverse an array of size N, a program does N operations inside the loop and some constant number of operations outside the loop (reading, printing, etc.). In this case, we say that the program has O(N) Time Complexity if the number of operations is at most c*N, where c is some constant value, c > 0 and finite. In other words, the number of operations depends mostly on the value of N.

Let's estimate the Time Complexity of the following pieces of code.

Example 1

# calculate the number of positive values in list
list_ = [2, 95, 0, -2, -2, 14, 4, -100, -89, 9]  # 1
s = 0  # 1
for el in list_:  # len(list_)
    if el > 0: s += 1  # 2

Each line's number of operations is given in the comments. Object initialization takes a constant number of operations (it does not depend on the length of the list); this is also known as O(1). But inside the loop we do one or two operations per iteration (checking the condition and adding), so the total number of operations can be estimated as:

c * N + k

  • k operations for initializing.
  • c operations per iteration.
  • N iterations in total - the length of the list.

For different input data, the value of N varies, while c and k stay constant. The Time Complexity is O(N).

In other words, the size of the input data can be very large, and the number of operations depends mainly on it.
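
As a quick sanity check of the c * N + k estimate, here is a small sketch (our own code, not part of the course) that counts the loop operations for inputs of different sizes; the counts grow linearly with N:

import random

def count_positive_ops(values):
    # Count operations the same way the comments above do:
    # one comparison per element, plus one increment when it is positive.
    ops = 2              # constant setup cost (the k term)
    s = 0
    for el in values:
        ops += 1         # the comparison
        if el > 0:
            s += 1
            ops += 1     # the increment
    return ops

for n in (10, 100, 1000):
    data = [random.randint(-50, 50) for _ in range(n)]
    # ops always stays between N + 2 and 2*N + 2, i.e. c*N + k
    print(n, count_positive_ops(data))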

Example 2

Let's estimate the Time Complexity of a more complex example:

# Calculate the sum of elements in columns
n = 4
m = [[1, -2, 3, -4],
     [-4, 3, -2, 1],
     [5, 6, 7, 8],
     [1, 2, 3, 4]]
s = [0] * n
for i in range(n):
    for j in range(n):
        s[j] += m[i][j]
print(s)

Here we have:

  • Initializing: constant time, or O(1).
  • Two nested loops, both of size n, with constant time per iteration (the addition): n * n * constant, or O(N^2).
  • Printing: n operations, one for each element of s, or O(N).

Note that printing depends on n too; it is not a constant-time operation.

The total Time Complexity is O(N^2). You may ask: why not O(1) + O(N^2) + O(N)? That is because of the following rules.

Rules for calculating the Time Complexity

  • O(1) is some constant time that does not depend on N. It can be 1, 2, 10, or 100000, but it stays constant for any value of N.
  • c*O(N) = O(c*N) = O(N) - a constant factor does not affect the total complexity (for example, two or three separate loops are still O(N)).
  • The greater Big-O shadows the lesser one. Since N^2 operations are N times more than N operations, the total complexity is estimated by the greater Big-O, which is N^2.

It means that O(N^2) + O(N) = O(N^2), which is why we dropped the smaller Big-O in Example 2.
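
To make these rules concrete, here is a minimal sketch (our own example; the function names are illustrative): two separate passes over the data are still O(N), while adding a nested pass over all pairs makes the whole function O(N^2).

def two_passes(values):
    # Two separate loops over N elements: c*O(N) = O(N).
    total = sum(values)      # first pass, O(N)
    maximum = max(values)    # second pass, O(N)
    return total, maximum

def passes_plus_pairs(values):
    # O(N) work followed by O(N^2) work: O(N) + O(N^2) = O(N^2).
    total = sum(values)      # O(N)
    pair_sums = []
    for a in values:         # O(N^2): every pair (a, b)
        for b in values:
            pair_sums.append(a + b)
    return total, pair_sums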

The order of complexities:

O(1) < O(logN) < O(N) < O(N*logN) < O(N^2) < ...

  • O(N) * O(M) = O(NM), if N and M are independent variables (for example, you do N operations in the inner loop and call that loop M times in the outer loop), as in the sketch below.
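
A hedged sketch of the product rule, with two independent input sizes (the function name is ours, not from the course):

def count_common(left, right):
    # The outer loop runs M = len(left) times and the inner loop
    # N = len(right) times for each of them, so the total work is O(M * N).
    common = 0
    for a in left:        # M iterations
        for b in right:   # N iterations each
            if a == b:
                common += 1
    return common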

Example 3

If you are familiar with the Binary Search algorithm, let's estimate its Time Complexity. The array is divided into two parts k times, for as long as possible, with a constant amount of work at each step. Since k = logN, the Time Complexity is O(logN).
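
For reference, a standard iterative Binary Search sketch over a sorted list (our own illustration); each step halves the remaining range, so the loop runs about log2(N) times:

def binary_search(sorted_values, target):
    # Each iteration halves the [lo, hi] range, so the loop runs O(logN) times.
    lo, hi = 0, len(sorted_values) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_values[mid] == target:
            return mid
        elif sorted_values[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1  # target not present

print(binary_search([1, 3, 5, 7, 9, 11], 7))  # prints 3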

In this course, the Time Complexity will be explained for each algorithm.

About Space Complexity

The Space Complexity describes how much additional memory the algorithm allocates. It is estimated the same way as Time Complexity.

For example, the Space Complexity of Example 1 is O(1) (constant), and of Example 2 it is O(N) (the list s of length N).
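
As a rough illustration (our own example): the first function below uses O(1) extra memory like Example 1, while the second allocates a result list of length n like Example 2, so its Space Complexity is O(N).

def count_positive(values):
    # Only one counter is allocated regardless of N: O(1) extra space.
    count = 0
    for el in values:
        if el > 0:
            count += 1
    return count

def column_sums(matrix, n):
    # Allocates a result list of length n: O(N) extra space,
    # like the list s in Example 2.
    s = [0] * n
    for row in matrix:
        for j in range(n):
            s[j] += row[j]
    return s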

Task

Read the given code and estimate its Time Complexity. Then print it as a string value (for example, if it is O(N^4), print the string 'O(N^4)').

Assume that the append operation takes O(1) and that the number of words is constant: the strings list contains 4 words of different lengths.

Avoid using extra spaces and constants: print O(N^4), not O( N^4) or O(N ^ 4), etc.
