Complexities

Source: http://bigocheatsheet.com/


Searching

Time is listed as average and worst case; space complexity is worst case.

Depth First Search (DFS), graph of |V| vertices and |E| edges: average -, worst O(|E| + |V|); space O(|V|)
Breadth First Search (BFS), graph of |V| vertices and |E| edges: average -, worst O(|E| + |V|); space O(|V|)
Binary search, sorted array of n elements: average O(log(n)), worst O(log(n)); space O(1)
Linear search (brute force), array: average O(n), worst O(n); space O(1)
Shortest path by Dijkstra, using a min-heap as priority queue, graph with |V| vertices and |E| edges: average O((|V| + |E|) log |V|), worst O((|V| + |E|) log |V|); space O(|V|)
Shortest path by Dijkstra, using an unsorted array as priority queue, graph with |V| vertices and |E| edges: average O(|V|^2), worst O(|V|^2); space O(|V|)
Shortest path by Bellman-Ford, graph with |V| vertices and |E| edges: average O(|V||E|), worst O(|V||E|); space O(|V|)
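
To make the binary search row concrete, here is a minimal iterative sketch (the function name and return convention are my own): each step halves the search range, giving the O(log(n)) time bound, and only two indices are tracked, giving O(1) space.

```python
def binary_search(sorted_items, target):
    """Return the index of target in sorted_items, or -1 if it is absent."""
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2          # halve the remaining range each iteration
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1


print(binary_search([1, 3, 5, 7, 9, 11], 7))   # -> 3
print(binary_search([1, 3, 5, 7, 9, 11], 4))   # -> -1
```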

Sorting

Time is listed as best, average, and worst case; auxiliary space is worst case.

Quicksort, array: best O(n log(n)), average O(n log(n)), worst O(n^2); space O(n)
Mergesort, array: best O(n log(n)), average O(n log(n)), worst O(n log(n)); space O(n)
Heapsort, array: best O(n log(n)), average O(n log(n)), worst O(n log(n)); space O(1)
Bubble Sort, array: best O(n), average O(n^2), worst O(n^2); space O(1)
Insertion Sort, array: best O(n), average O(n^2), worst O(n^2); space O(1)
Selection Sort, array: best O(n^2), average O(n^2), worst O(n^2); space O(1)
Bucket Sort, array: best O(n+k), average O(n+k), worst O(n^2); space O(nk)
Radix Sort, array: best O(nk), average O(nk), worst O(nk); space O(n+k)
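
As an illustration of the quicksort row, here is a small in-place sketch (Lomuto partition with a random pivot; the names are my own). The random pivot keeps the expected time at O(n log(n)); a consistently unbalanced partition is what produces the O(n^2) worst case.

```python
import random


def quicksort(a, lo=0, hi=None):
    """Sort list a in place between indices lo and hi."""
    if hi is None:
        hi = len(a) - 1
    if lo >= hi:
        return
    # Choose a random pivot and move it to the end (Lomuto partition).
    p = random.randint(lo, hi)
    a[p], a[hi] = a[hi], a[p]
    pivot, i = a[hi], lo
    for j in range(lo, hi):
        if a[j] <= pivot:
            a[i], a[j] = a[j], a[i]
            i += 1
    a[i], a[hi] = a[hi], a[i]
    quicksort(a, lo, i - 1)
    quicksort(a, i + 1, hi)


data = [5, 2, 9, 1, 5, 6]
quicksort(data)
print(data)   # -> [1, 2, 5, 5, 6, 9]
```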

Data Structures

Entries give indexing / search / insertion / deletion times for the average and worst case, followed by worst-case space complexity.

Basic Array: average O(1) / O(n) / - / -; worst O(1) / O(n) / - / -; space O(n)
Dynamic Array: average O(1) / O(n) / O(n) / O(n); worst O(1) / O(n) / O(n) / O(n); space O(n)
Singly-Linked List: average O(n) / O(n) / O(1) / O(1); worst O(n) / O(n) / O(1) / O(1); space O(n)
Doubly-Linked List: average O(n) / O(n) / O(1) / O(1); worst O(n) / O(n) / O(1) / O(1); space O(n)
Skip List: average O(log(n)) / O(log(n)) / O(log(n)) / O(log(n)); worst O(n) / O(n) / O(n) / O(n); space O(n log(n))
Hash Table: average - / O(1) / O(1) / O(1); worst - / O(n) / O(n) / O(n); space O(n)
Binary Search Tree: average O(log(n)) / O(log(n)) / O(log(n)) / O(log(n)); worst O(n) / O(n) / O(n) / O(n); space O(n)
Cartesian Tree: average - / O(log(n)) / O(log(n)) / O(log(n)); worst - / O(n) / O(n) / O(n); space O(n)
B-Tree: average O(log(n)) / O(log(n)) / O(log(n)) / O(log(n)); worst O(log(n)) / O(log(n)) / O(log(n)) / O(log(n)); space O(n)
Red-Black Tree: average O(log(n)) / O(log(n)) / O(log(n)) / O(log(n)); worst O(log(n)) / O(log(n)) / O(log(n)) / O(log(n)); space O(n)
Splay Tree: average - / O(log(n)) / O(log(n)) / O(log(n)); worst - / O(log(n)) / O(log(n)) / O(log(n)); space O(n)
AVL Tree: average O(log(n)) / O(log(n)) / O(log(n)) / O(log(n)); worst O(log(n)) / O(log(n)) / O(log(n)) / O(log(n)); space O(n)
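
To see why the binary search tree row lists O(log(n)) on average but O(n) in the worst case, here is a minimal unbalanced BST sketch (class and function names are my own). Random insertion order keeps the tree shallow, while sorted insertion order degenerates it into a linked list, and every operation cost grows with the height of the tree.

```python
import random


class Node:
    def __init__(self, key):
        self.key, self.left, self.right = key, None, None


def insert(root, key):
    """Insert key into an unbalanced BST; the cost is the depth of the path walked."""
    if root is None:
        return Node(key)
    if key < root.key:
        root.left = insert(root.left, key)
    elif key > root.key:
        root.right = insert(root.right, key)
    return root


def height(root):
    return 0 if root is None else 1 + max(height(root.left), height(root.right))


n = 255
random_tree = None
for k in random.sample(range(n), n):      # random insertion order
    random_tree = insert(random_tree, k)

sorted_tree = None
for k in range(n):                        # sorted insertion order
    sorted_tree = insert(sorted_tree, k)

print(height(random_tree))   # small, roughly proportional to log2(n)
print(height(sorted_tree))   # exactly n: the degenerate, list-shaped worst case
```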

Heaps

Entries give heapify, find max, extract max, increase key, insert, delete, and merge times; entries marked * are amortized bounds.

Linked List (sorted): heapify -, find max O(1), extract max O(1), increase key O(n), insert O(n), delete O(1), merge O(m+n)
Linked List (unsorted): heapify -, find max O(n), extract max O(n), increase key O(1), insert O(1), delete O(1), merge O(1)
Binary Heap: heapify O(n), find max O(1), extract max O(log(n)), increase key O(log(n)), insert O(log(n)), delete O(log(n)), merge O(m+n)
Binomial Heap: heapify -, find max O(log(n)), extract max O(log(n)), increase key O(log(n)), insert O(log(n)), delete O(log(n)), merge O(log(n))
Fibonacci Heap: heapify -, find max O(1), extract max O(log(n))*, increase key O(1)*, insert O(1), delete O(log(n))*, merge O(1)
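
The binary heap row maps directly onto Python's heapq module, which maintains a binary min-heap inside a plain list (the max-heap columns above are symmetric if keys are negated). A short sketch:

```python
import heapq

heap = []
for x in [7, 2, 9, 4]:
    heapq.heappush(heap, x)    # insert: O(log(n)) each

print(heap[0])                 # find min: O(1), peek at the root -> 2
print(heapq.heappop(heap))     # extract min: O(log(n)) -> 2

# heapify turns an arbitrary list into a heap in O(n), matching the
# heapify column for a binary heap.
data = [5, 1, 8, 3, 6]
heapq.heapify(data)
print(data[0])                 # -> 1
```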

Graphs

Node / edge management costs for each representation: storage, add vertex, add edge, remove vertex, remove edge, and edge query.

Adjacency list: storage O(|V| + |E|), add vertex O(1), add edge O(1), remove vertex O(|V| + |E|), remove edge O(|E|), query O(|V|)
Incidence list: storage O(|V| + |E|), add vertex O(1), add edge O(1), remove vertex O(|E|), remove edge O(|E|), query O(|E|)
Adjacency matrix: storage O(|V|^2), add vertex O(|V|^2), add edge O(1), remove vertex O(|V|^2), remove edge O(1), query O(1)
Incidence matrix: storage O(|V| ⋅ |E|), add vertex O(|V| ⋅ |E|), add edge O(|V| ⋅ |E|), remove vertex O(|V| ⋅ |E|), remove edge O(|V| ⋅ |E|), query O(|E|)
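
A small sketch of the two most common representations above (vertex labels and container choices are my own, and the graph is treated as undirected): the adjacency list needs only O(|V| + |E|) storage but scans a neighbour list to answer an edge query, while the adjacency matrix pays O(|V|^2) storage to answer the same query in O(1).

```python
vertices = ["a", "b", "c", "d"]
edges = [("a", "b"), ("a", "c"), ("b", "d")]

# Adjacency list: O(|V| + |E|) storage; an edge query scans one
# neighbour list, so it costs O(|V|) in the worst case.
adj_list = {v: [] for v in vertices}
for u, v in edges:
    adj_list[u].append(v)
    adj_list[v].append(u)
print("c" in adj_list["a"])   # -> True

# Adjacency matrix: O(|V|^2) storage, but an edge query is a single
# O(1) lookup.
index = {v: i for i, v in enumerate(vertices)}
matrix = [[0] * len(vertices) for _ in vertices]
for u, v in edges:
    matrix[index[u]][index[v]] = 1
    matrix[index[v]][index[u]] = 1
print(matrix[index["a"]][index["c"]] == 1)   # -> True
```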

Notation for asymptotic growth

letter: bound; growth
(theta) Θ: upper and lower, tight [1]; equal [2]
(big-oh) O: upper, tightness unknown; less than or equal [3]
(small-oh) o: upper, not tight; less than
(big omega) Ω: lower, tightness unknown; greater than or equal
(small omega) ω: lower, not tight; greater than
[1] Big O is the upper bound, while Omega is the lower bound. Theta requires both Big O and Omega, which is why it is called a tight bound (it must be both the upper and the lower bound). For example, an algorithm taking Omega(n log n) takes at least n log n time, but its upper bound is unknown. An algorithm taking Theta(n log n) is far preferable, since it takes at least n log n (Omega(n log n)) and no more than n log n (Big O(n log n)).
[2] f(n) = Θ(g(n)) means f (the running time of the algorithm) grows exactly like g when n (the input size) gets larger. In other words, the growth rate of f(n) is asymptotically proportional to g(n).
[3] Same idea, but here the growth rate of f(n) is no faster than g(n). Big-oh is the most commonly used bound because it represents the worst-case behavior.
In short, if an algorithm is __ then its performance is __:

o(n): < n
O(n): ≤ n
Θ(n): = n
Ω(n): ≥ n
ω(n): > n
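
As a worked example of the definitions above (the constants are chosen purely for illustration), consider f(n) = 3n^2 + 5n:

```latex
% For all n >= 5:  3n^2 <= 3n^2 + 5n <= 4n^2,
% so c1 = 3, c2 = 4, n0 = 5 witness both a lower bound (Omega) and an
% upper bound (O); the bound is therefore tight.
f(n) = 3n^2 + 5n
\quad\Rightarrow\quad
3n^2 \le f(n) \le 4n^2 \ \text{ for all } n \ge 5
\quad\Rightarrow\quad
f(n) = \Theta(n^2).
```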

Big-O Complexity Chart

[Image: chart comparing the growth of common Big-O complexity classes]
