# Big O in Practice

By Prince Neres (@princeneres)
## What it is

Big O describes how an algorithm's runtime or memory usage grows as the input size n increases. It focuses on asymptotic behavior - the growth trend as n gets very large - ignoring constant factors and lower-order terms, and typically describing the worst case.
Example: an algorithm with cost 3n² + 5n + 10 is classified as O(n²).
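A quick way to see why the lower-order terms vanish: evaluate the full cost expression against its dominant term alone. The `cost` and `ratio` helpers below are illustrative, not from any library.

```javascript
// Cost model for the example above: 3n² + 5n + 10.
const cost = n => 3 * n ** 2 + 5 * n + 10;

// Ratio of the full cost to the dominant term alone.
// As n grows it approaches 1, which is why constants and
// lower-order terms are dropped in Big O.
const ratio = n => cost(n) / (3 * n ** 2);

console.log(ratio(10).toFixed(3));  // → 1.200
console.log(ratio(1e6).toFixed(6)); // → 1.000002
```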
## Why it matters
- Predict scalability before shipping to production.
- Compare solutions without needing benchmarks.
- Foundation for architecture and design decisions.
## Related notations
- O (Big O) - upper bound (worst case).
- Ω (Big Omega) - lower bound (best case).
- Θ (Big Theta) - tight bound (when best and worst cases match).
In practice, "Big O" is used as a generic synonym for "complexity", even when it would technically be Θ.
## Time Complexity
Growth order, from fastest to slowest:
O(1) < O(log n) < O(n) < O(n log n) < O(n²) < O(n³) < O(2ⁿ) < O(n!)
### O(1) - Constant

Runtime doesn't depend on input size.

```javascript
function getFirst(arr) {
  return arr[0];
}
```
Real-world examples: array access by index, push/pop on a stack, get/put on a HashMap (amortized), reading a variable.
### O(log n) - Logarithmic

At each iteration, the problem is reduced by a fixed fraction (usually by half).

```javascript
function binarySearch(arr, target) {
  let left = 0;
  let right = arr.length - 1;
  while (left <= right) {
    const mid = Math.floor((left + right) / 2);
    if (arr[mid] === target) return mid;
    if (arr[mid] < target) left = mid + 1;
    else right = mid - 1;
  }
  return -1;
}
```
Real-world examples: binary search, operations on balanced trees (AVL, Red-Black), B-Tree lookups (database indexes).
### O(n) - Linear

Iterates through every element once.

```javascript
function sum(arr) {
  let total = 0;
  for (const value of arr) total += value;
  return total;
}
```
Real-world examples: linear search, array copy, linked list traversal, validating every item in a form.
### O(n log n) - Linearithmic

Divides the problem into log n levels and processes n elements at each level. It's the theoretical lower bound for comparison-based sorting algorithms.

```javascript
function mergeSort(arr) {
  if (arr.length <= 1) return arr;
  const mid = Math.floor(arr.length / 2);
  const left = mergeSort(arr.slice(0, mid));
  const right = mergeSort(arr.slice(mid));
  return merge(left, right);
}

function merge(left, right) {
  const result = [];
  let i = 0;
  let j = 0;
  while (i < left.length && j < right.length) {
    if (left[i] <= right[j]) result.push(left[i++]);
    else result.push(right[j++]);
  }
  return [...result, ...left.slice(i), ...right.slice(j)];
}
```
Real-world examples: Merge Sort, Quick Sort (average case), Heap Sort, Timsort - the default behind Python's list.sort(), Java's Arrays.sort() for object arrays, and Array.prototype.sort() in modern JS engines.
### O(n²) - Quadratic

Two nested loops over the input.

```javascript
function bubbleSort(arr) {
  for (let i = 0; i < arr.length; i++) {
    for (let j = 0; j < arr.length - i - 1; j++) {
      if (arr[j] > arr[j + 1]) {
        [arr[j], arr[j + 1]] = [arr[j + 1], arr[j]];
      }
    }
  }
  return arr;
}
```
Real-world examples: Bubble Sort, Insertion Sort, Selection Sort, comparing every pair in a collection, naive duplicate detection.
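The naive duplicate detection mentioned above can be sketched as follows (`hasDuplicate` is an illustrative name); the Set-based variant shows how the same check drops to O(n):

```javascript
// Naive duplicate detection: compare every pair → O(n²).
function hasDuplicate(arr) {
  for (let i = 0; i < arr.length; i++) {
    for (let j = i + 1; j < arr.length; j++) {
      if (arr[i] === arr[j]) return true;
    }
  }
  return false;
}

// The same check with a Set: one pass, O(n) on average.
function hasDuplicateFast(arr) {
  return new Set(arr).size !== arr.length;
}
```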
### O(n³) - Cubic

Three nested loops.

```javascript
function multiplyMatrices(a, b, n) {
  const result = Array.from({ length: n }, () => new Array(n).fill(0));
  for (let i = 0; i < n; i++) {
    for (let j = 0; j < n; j++) {
      for (let k = 0; k < n; k++) {
        result[i][j] += a[i][k] * b[k][j];
      }
    }
  }
  return result;
}
```
Real-world examples: naive matrix multiplication, Floyd-Warshall (all-pairs shortest paths in a graph).
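Floyd-Warshall is itself just three nested loops over the vertices. A minimal sketch, assuming `dist` is an n × n matrix of edge weights with `Infinity` where no edge exists:

```javascript
// All-pairs shortest paths in O(n³): for each intermediate vertex k,
// check whether going through k shortens the i → j path.
function floydWarshall(dist) {
  const n = dist.length;
  const d = dist.map(row => [...row]); // work on a copy, keep the input intact
  for (let k = 0; k < n; k++) {
    for (let i = 0; i < n; i++) {
      for (let j = 0; j < n; j++) {
        if (d[i][k] + d[k][j] < d[i][j]) d[i][j] = d[i][k] + d[k][j];
      }
    }
  }
  return d;
}
```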
### O(2ⁿ) - Exponential

Each call branches into two more, doubling the work with every additional element. Infeasible starting around n ≈ 30.

```javascript
function fib(n) {
  if (n < 2) return n;
  return fib(n - 1) + fib(n - 2);
}
```
Real-world examples: recursive Fibonacci without memoization, brute-force Subset Sum, generating all subsets of a set.
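Generating all subsets (the power set) shows where the 2ⁿ comes from: each element either joins a subset or doesn't, doubling the count every time. A minimal sketch:

```javascript
// Power set: 2ⁿ subsets for n elements.
function subsets(arr) {
  if (arr.length === 0) return [[]];
  const [first, ...rest] = arr;
  const withoutFirst = subsets(rest);          // subsets that skip `first`
  const withFirst = withoutFirst.map(s => [first, ...s]); // and those that include it
  return [...withoutFirst, ...withFirst];
}
```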
### O(n!) - Factorial

Explores every possible permutation. Infeasible already at n ≈ 12.

```javascript
function permutations(arr) {
  if (arr.length <= 1) return [arr];
  const result = [];
  for (let i = 0; i < arr.length; i++) {
    const rest = [...arr.slice(0, i), ...arr.slice(i + 1)];
    for (const p of permutations(rest)) {
      result.push([arr[i], ...p]);
    }
  }
  return result;
}
```
Real-world examples: brute-force Traveling Salesman, generating all permutations/anagrams.
## Space Complexity
Measures the additional space an algorithm consumes as a function of n. It doesn't count the input itself, only auxiliary memory allocated (variables, created structures, call stack).
### O(1) - Constant

Uses a fixed amount of memory, independent of n.

```javascript
function max(arr) {
  let result = -Infinity;
  for (const value of arr) {
    if (value > result) result = value;
  }
  return result;
}
```
Real-world examples: Bubble Sort (in-place), variable swap, counters, two-pointer without extra allocation.
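A two-pointer, in-place reversal is a typical case: the traversal is O(n) time, but the only extra memory is two index variables, so space stays O(1). A minimal sketch (`reverseInPlace` is an illustrative name):

```javascript
// Reverses the array in place: O(n) time, O(1) extra space.
function reverseInPlace(arr) {
  let left = 0;
  let right = arr.length - 1;
  while (left < right) {
    [arr[left], arr[right]] = [arr[right], arr[left]]; // swap ends
    left++;
    right--;
  }
  return arr;
}
```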
### O(log n) - Logarithmic

Common in recursion that splits the problem in half - the call stack grows proportionally to log n.

```javascript
function binarySearchRecursive(arr, target, left, right) {
  if (left > right) return -1;
  const mid = Math.floor((left + right) / 2);
  if (arr[mid] === target) return mid;
  if (arr[mid] < target) return binarySearchRecursive(arr, target, mid + 1, right);
  return binarySearchRecursive(arr, target, left, mid - 1);
}
```
Real-world examples: recursive binary search, Quick Sort (average-case stack space).
### O(n) - Linear

Allocates a structure proportional to the input.

```javascript
function double(arr) {
  return arr.map(x => x * 2);
}
```
Real-world examples: creating a copy of an array, recursion with depth n (recursive factorial), HashMap with n keys, simple memoized Fibonacci.
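The memoized Fibonacci mentioned above can be sketched as follows: the cache holds at most n entries, giving O(n) time and O(n) space (plus O(n) call-stack depth in this recursive form):

```javascript
// Fibonacci with memoization: each value is computed once and cached.
function fibMemo(n, memo = new Map()) {
  if (n < 2) return n;
  if (memo.has(n)) return memo.get(n);
  const result = fibMemo(n - 1, memo) + fibMemo(n - 2, memo);
  memo.set(n, result);
  return result;
}
```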
### O(n log n) - Linearithmic

A non-in-place Merge Sort allocates temporary arrays at each level of the recursion; counting every allocation gives O(n log n) in total, though the peak live memory at any moment is O(n).

Real-world examples: Merge Sort with an auxiliary buffer, some divide-and-conquer algorithms.
### O(n²) - Quadratic

An n × n matrix. Common in dynamic programming over pairs.

```javascript
function createAdjacencyMatrix(n) {
  return Array.from({ length: n }, () => new Array(n).fill(0));
}
```
Real-world examples: graph adjacency matrix, DP tables for Longest Common Subsequence and Edit Distance.
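A minimal sketch of the LCS case: the classic formulation fills an (m+1) × (n+1) table, so the auxiliary space is O(m·n) - O(n²) when both strings have similar length:

```javascript
// Longest Common Subsequence length via a DP table.
// dp[i][j] = LCS length of a[0..i) and b[0..j).
function lcsLength(a, b) {
  const m = a.length, n = b.length;
  const dp = Array.from({ length: m + 1 }, () => new Array(n + 1).fill(0));
  for (let i = 1; i <= m; i++) {
    for (let j = 1; j <= n; j++) {
      dp[i][j] = a[i - 1] === b[j - 1]
        ? dp[i - 1][j - 1] + 1                     // characters match: extend
        : Math.max(dp[i - 1][j], dp[i][j - 1]);    // otherwise: best of skipping one
    }
  }
  return dp[m][n];
}
```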
## Growth table
Approximate number of operations for each class as n grows:
| n | O(1) | O(log n) | O(n) | O(n log n) | O(n²) | O(2ⁿ) |
|---|---|---|---|---|---|---|
| 10 | 1 | ~3 | 10 | ~33 | 100 | 1,024 |
| 100 | 1 | ~7 | 100 | ~664 | 10,000 | ~10³⁰ |
| 1,000 | 1 | ~10 | 1,000 | ~9,966 | 1,000,000 | impractical |
| 1,000,000 | 1 | ~20 | 10⁶ | ~2 × 10⁷ | 10¹² (infeasible) | - |
## How to analyze in practice

- Sequential loops add up: O(n) + O(n) = O(n).
- Nested loops multiply: two loops of size n → O(n²).
- Constants and lower-order terms are dropped: O(3n² + 100n + 5) = O(n²).
- Worst case is the default, but also consider the amortized case - e.g., `ArrayList.add` in Java is amortized O(1) despite occasional O(n) resizes.
- Different variables stay separate: loops over collections a and b → O(a + b), not O(n). This matters when the sizes differ by orders of magnitude.
- Recursion: time = calls × work per call; space = maximum stack depth.
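Putting the rules together on a toy function (`sumAndCountPairs` is an illustrative name): the sequential O(n) pass and the nested O(n²) scan add up, and the dominant term wins:

```javascript
// O(n) + O(n²) = O(n²).
function sumAndCountPairs(arr) {
  let total = 0;
  for (const v of arr) total += v;          // sequential pass: O(n)

  let pairs = 0;
  for (let i = 0; i < arr.length; i++) {    // nested loops multiply: O(n²)
    for (let j = i + 1; j < arr.length; j++) pairs++;
  }
  return { total, pairs };
}
```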
## Common pitfalls

- `Array.includes()`, `indexOf()`, `in` on an array → O(n), not O(1). For fast lookup use `Set` or `Map`.
- String concatenation inside a loop in many languages → O(n²) due to immutability. Use `StringBuilder` (Java) or `array.join()` (JS).
- `slice()` / `concat()` → O(n) per call. Inside a loop, they turn into a hidden O(n²).
- Recursion without memoization (e.g., Fibonacci) → explodes to O(2ⁿ). With memoization it drops to O(n) time and O(n) space.
- I/O and network costs almost always dominate CPU - optimizing from O(n²) to O(n log n) is irrelevant if the code makes 1,000 sequential HTTP calls.
- Stream pipelines in Java and chained `.map().filter().reduce()` in JS hide multiple passes over the collection - each operation is O(n), so the chain becomes O(kn) with k = number of operations.
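The first pitfall in action (`intersectionSlow` and `intersectionFast` are illustrative names): `includes()` inside `filter()` hides a quadratic scan, while a `Set` brings the lookup back to O(1) on average:

```javascript
// includes() scans b for every element of a → O(a·b).
function intersectionSlow(a, b) {
  return a.filter(x => b.includes(x));
}

// Build the Set once, then each has() is O(1) on average → O(a + b).
function intersectionFast(a, b) {
  const setB = new Set(b);
  return a.filter(x => setB.has(x));
}
```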
## Quick reference by data structure
| Structure | Access | Search | Insertion | Deletion |
|---|---|---|---|---|
| Array | O(1) | O(n) | O(n) | O(n) |
| Linked List | O(n) | O(n) | O(1)* | O(1)* |
| Hash Table | O(1)† | O(1)† | O(1)† | O(1)† |
| Binary Search Tree | O(log n)† | O(log n)† | O(log n)† | O(log n)† |
| Heap | O(1) top | O(n) | O(log n) | O(log n) |
| Stack / Queue | - | O(n) | O(1) | O(1) |
\* Assuming direct access to the node (insertion/removal at a known position).

† Average/amortized case. Worst case can be O(n) for hash tables with heavy collisions and for unbalanced BSTs.