algodeck

Complexity

Big-O Cheat Sheet

0/1 Knapsack brute force complexity

Time complexity: O(2^n) with n the number of items

Space complexity: O(n) for the recursion stack
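
A minimal brute-force sketch in Java (illustrative names): each item is either skipped or taken, hence the 2^n branching; the recursion depth is at most n, hence the O(n) stack:

// Brute-force 0/1 knapsack: at each index, either skip or take the item.
// Two branches per item => O(2^n) time; recursion depth <= n => O(n) space.
int knapsack(int[] weights, int[] values, int capacity, int i) {
    if (i == weights.length || capacity == 0) return 0;
    int best = knapsack(weights, values, capacity, i + 1); // skip item i
    if (weights[i] <= capacity) { // take item i if it fits
        best = Math.max(best, values[i] + knapsack(weights, values, capacity - weights[i], i + 1));
    }
    return best;
}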

#complexity

0/1 Knapsack memoization complexity

Time and space complexity: O(n * c) with n the number of items and c the capacity
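
A top-down sketch of the memoized version (illustrative names, memo assumed to be created as new Integer[n][c + 1]): each (item, capacity) state is computed once, giving the O(n * c) bound:

// Memoized 0/1 knapsack: memo[i][cap] caches the best value from item i
// with remaining capacity cap. There are n * c distinct states.
int knapsack(int[] w, int[] v, int cap, int i, Integer[][] memo) {
    if (i == w.length || cap == 0) return 0;
    if (memo[i][cap] != null) return memo[i][cap];
    int best = knapsack(w, v, cap, i + 1, memo); // skip item i
    if (w[i] <= cap) { // take item i if it fits
        best = Math.max(best, v[i] + knapsack(w, v, cap - w[i], i + 1, memo));
    }
    return memo[i][cap] = best;
}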

#complexity

0/1 Knapsack tabulation complexity

Time and space complexity: O(n * c) with n the number of items and c the capacity

Space complexity can even be reduced to O(2 * c) = O(c), as we only need to keep the current and the previous row (indexing with row % 2):

int[][] dp = new int[2][c + 1];
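
A sketch of the full rolling-array tabulation under the same assumptions (w = weights, v = values, illustrative names):

// Bottom-up 0/1 knapsack keeping only two rows: dp[row % 2] is the current
// row, dp[(row - 1) % 2] the previous one. Space drops to O(c).
int knapsack(int[] w, int[] v, int c) {
    int[][] dp = new int[2][c + 1];
    for (int row = 1; row <= w.length; row++) {
        for (int cap = 0; cap <= c; cap++) {
            dp[row % 2][cap] = dp[(row - 1) % 2][cap]; // skip item row - 1
            if (w[row - 1] <= cap) { // take item row - 1 if it fits
                dp[row % 2][cap] = Math.max(dp[row % 2][cap],
                        v[row - 1] + dp[(row - 1) % 2][cap - w[row - 1]]);
            }
        }
    }
    return dp[w.length % 2][c];
}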

#complexity

Amortized complexity definition

How much of a resource (time or memory) an operation takes on average over a sequence of operations: occasional expensive operations are averaged out by the many cheap ones (e.g., appending to a dynamic array is amortized O(1) despite occasional O(n) resizes)

#complexity

Array complexity: access, search, insert, delete

Access: O(1)

Search: O(n)

Insert: O(n)

Delete: O(n)

#array #complexity

B-tree complexity: access, insert, delete

All: O(log n)

#complexity #tree

BFS and DFS graph traversal time and space complexity

Time: O(v + e) with v the number of vertices and e the number of edges

Space: O(v)
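
A minimal BFS sketch over an adjacency list (representation assumed): each vertex is enqueued at most once (O(v) space) and each edge is examined a constant number of times (O(v + e) time); DFS is analogous with a stack or recursion:

import java.util.ArrayDeque;
import java.util.List;
import java.util.Queue;

// BFS over an adjacency list: O(v + e) time, O(v) space (visited + queue).
void bfs(List<List<Integer>> adj, int start) {
    boolean[] visited = new boolean[adj.size()];
    Queue<Integer> queue = new ArrayDeque<>();
    visited[start] = true;
    queue.add(start);
    while (!queue.isEmpty()) {
        int vertex = queue.poll();
        for (int neighbor : adj.get(vertex)) {
            if (!visited[neighbor]) { // mark on enqueue so each vertex enters once
                visited[neighbor] = true;
                queue.add(neighbor);
            }
        }
    }
}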

#complexity #graph

BFS and DFS tree traversal time and space complexity

BFS: time O(v), space O(v)

DFS: time O(v), space O(h) (height of the tree)

#complexity #tree

Big O

Upper bound

#complexity

Big Omega

Lower bound

#complexity

Big Theta

Tight bound: f(n) is Theta(n) if it is both O(n) and Omega(n)

#complexity

Binary heap (min-heap or max-heap) complexity: insert, get min (max), delete min (max)

Insert: O(log n)

Get min (max): O(1)

Delete min (max): O(log n)

#complexity #heap

BST complexity: access, insert, delete

If not balanced O(n)

If balanced O(log n)

#complexity #tree

BST delete algo and complexity

Replace the node's value with its inorder successor (the leftmost node of its right subtree), then delete that successor (which has at most one child)

Average: O(log n)

Worst: O(h) with h the height of the tree, i.e. O(n) if the BST is not self-balanced, otherwise O(log n)
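
A sketch assuming a plain (not self-balancing) BST with int values:

// Minimal BST node assumed for this sketch.
class Node {
    int val;
    Node left, right;
}

// Delete: a node with two children gets its inorder successor's value
// (leftmost node of the right subtree), then the successor is deleted.
Node delete(Node root, int key) {
    if (root == null) return null;
    if (key < root.val) root.left = delete(root.left, key);
    else if (key > root.val) root.right = delete(root.right, key);
    else {
        if (root.left == null) return root.right; // zero or one child
        if (root.right == null) return root.left;
        Node succ = root.right; // inorder successor
        while (succ.left != null) succ = succ.left;
        root.val = succ.val;
        root.right = delete(root.right, succ.val); // delete the successor
    }
    return root;
}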

#complexity #tree

Bubble sort complexity and stability

Time: O(n²)

Space: O(1)

Stable

#complexity #sort

Complexity of a function making multiple recursive subcalls

Time: O(branches^depth) with branches the number of recursive calls each invocation makes and depth the maximum depth of the recursion tree (read: branches to the power of depth)

Space: O(depth) to store the call stack
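
Example: naive Fibonacci makes 2 recursive calls per invocation (branches = 2) with depth about n, hence O(2^n) time and O(n) stack space:

// branches = 2, depth ~ n => O(2^n) time, O(n) space for the call stack.
int fib(int n) {
    if (n <= 1) return n;
    return fib(n - 1) + fib(n - 2);
}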

#complexity

Complexity to create a trie

Time and space: O(n * l) with n the number of words and l the length of the longest word

#complexity

Complexity to insert a key in a trie

Time: O(k) with k the size of the key

Space: O(1) iterative, O(k) recursive
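
A minimal iterative insert sketch, assuming lowercase a-z keys; search walks the same path and is also O(k):

// Trie node for lowercase a-z keys (assumption for this sketch).
class TrieNode {
    TrieNode[] children = new TrieNode[26];
    boolean isWord;
}

// Iterative insert: O(k) time for a key of size k, O(1) auxiliary space
// (a recursive variant would use O(k) stack space instead).
void insert(TrieNode root, String key) {
    TrieNode node = root;
    for (char ch : key.toCharArray()) {
        int idx = ch - 'a';
        if (node.children[idx] == null) node.children[idx] = new TrieNode();
        node = node.children[idx];
    }
    node.isWord = true;
}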

#complexity #tree

Complexity to search for a key in a trie

Time: O(k) with k the size of the key

Space: O(1) iterative or O(k) recursive

#complexity #tree

Counting sort complexity, stability, use case

Time complexity: O(n + k) with n the number of elements and k the range (the maximum element value)

Space complexity: O(n + k) for the count array (size k) plus the output array (size n)

Stable

Use case: known and small range of possible integers
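
A sketch of the stable variant, assuming values in 0..k; filling the output from right to left preserves the order of equal elements:

// Stable counting sort for values in 0..k: O(n + k) time, O(n + k) space.
int[] countingSort(int[] a, int k) {
    int[] count = new int[k + 1];
    for (int x : a) count[x]++; // histogram
    for (int i = 1; i <= k; i++) count[i] += count[i - 1]; // prefix sums
    int[] out = new int[a.length];
    for (int i = a.length - 1; i >= 0; i--) { // right to left => stable
        out[--count[a[i]]] = a[i];
    }
    return out;
}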

#complexity #sort

Doubly linked list complexity: access, insert, delete

Access: O(n)

Insert: O(1) (given a reference to the insertion point)

Delete: O(1) (given a reference to the node)

#complexity #linkedlist

Hash table complexity: search, insert, delete

All: amortized O(1), worst O(n)

#complexity #hashtable

Heapsort complexity, stability, use case

Time: Theta(n log n)

Space: O(1)

Unstable

Use case: space-constrained environments that need a guaranteed O(n log n) time

Caveat: not stable and not cache-friendly

#complexity #sort

Insertion sort complexity, stability, use case

Time: O(n²)

Space: O(1)

Stable

Use case: small or partially sorted inputs

#complexity #sort

Linked list complexity: access, insert, delete

Access: O(n)

Insert: O(1) (given a reference to the insertion point)

Delete: O(1) (given a reference to the previous node)

#complexity #linkedlist

Mergesort complexity, stability, use case

Time: Theta(n log n)

Space: O(n)

Stable

Use case: guaranteed O(n log n) worst-case time and stable; also works well with linked lists

#complexity #sort

Quicksort complexity, stability, use case

Time: best and average O(n log n), worst O(n²) (e.g., an already sorted array, ascending or descending, with a naive first- or last-element pivot)

Space: O(log n) for the recursion stack (in-place sorting algorithm)

Not stable

Use case: in practice, quicksort is often faster than mergesort due to better cache locality (an advantage that disappears with linked lists, where mergesort is preferred)

#complexity #sort

Radix sort complexity, stability, use case

Time complexity: O(n * k) with n the number of elements and k the maximum number of digits

Space complexity: O(n + b) with b the base of the representation (e.g., 10), needed by the stable counting sort pass on each digit

Stable

Use case: when k < log(n), for example 1M elements in the range 0..1000: at most 4 digits and 4 < log(1M) = 6 (base 10)

#complexity #sort

Recursion impact on algorithm complexity

Space impact, as each recursive call adds a frame to the call stack

Unless the compiler applies tail-call optimization (the JVM does not)

#complexity

Red-black tree complexity: access, insert, delete

All: O(log n)

#complexity #tree

Selection sort complexity

Time: Theta(n²)

Space: O(1)

#complexity #sort

Stack implementations and insert/delete complexity

Linked list implementation:

Insert: O(1)

Delete: O(1)

Dynamic array implementation:

Insert: worst case O(n) on resize, amortized O(1)

Delete: O(1)
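
A sketch of the dynamic-array variant (illustrative class) showing why push is amortized O(1):

// Array-backed stack: a push is O(n) worst case (resize), but doubling the
// capacity spreads each O(n) copy over the n cheap pushes before it.
class ArrayStack {
    private int[] data = new int[1];
    private int size = 0;

    void push(int x) {
        if (size == data.length) {
            data = java.util.Arrays.copyOf(data, data.length * 2); // O(n) resize
        }
        data[size++] = x;
    }

    int pop() {
        return data[--size]; // O(1)
    }
}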

#complexity #stack

Time complexity to build a binary heap

O(n) when built bottom-up with sift-down (heapify): the per-node sift-down costs sum to O(n), versus O(n log n) for n successive inserts
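
A bottom-up heapify sketch (array-based min-heap): most nodes sit near the bottom and sift down only a few levels, which is why the total is O(n):

// Build a min-heap in O(n): sift down every internal node, last one first.
void buildHeap(int[] a) {
    for (int i = a.length / 2 - 1; i >= 0; i--) siftDown(a, i);
}

void siftDown(int[] a, int i) {
    int n = a.length;
    while (true) {
        int smallest = i, l = 2 * i + 1, r = 2 * i + 2;
        if (l < n && a[l] < a[smallest]) smallest = l;
        if (r < n && a[r] < a[smallest]) smallest = r;
        if (smallest == i) return;
        int tmp = a[i]; a[i] = a[smallest]; a[smallest] = tmp;
        i = smallest;
    }
}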

#complexity #heap

Topological sort complexity

Time and space: O(v + e)

#complexity #graph