Big O Notation
Big O notation describes how an algorithm's runtime or space requirements grow as the input size increases. It's the language we use to talk about efficiency, and it's essential for comparing algorithms and making informed decisions about which approach to use.
Operation counts by complexity class at n = 10:
| Complexity | Operations (n=10) | Example |
|---|---|---|
| O(1) | 1 | Array access by index |
| O(log n) | 3 | Binary search |
| O(n) | 10 | Single loop over the input |
| O(n log n) | 33 | Merge Sort |
| O(n²) | 100 | Nested loops |
Key Insight: As n grows, the difference between complexities becomes dramatic. At n=50: O(1)=1, O(n)=50, but O(n²)=2,500 operations!
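The counts in the table above can be reproduced with a quick sketch (the function name `operationCounts` is illustrative, not from any library):

```javascript
// Approximate operation counts for each complexity class at a given n
function operationCounts(n) {
  return {
    constant: 1,                              // O(1)
    logarithmic: Math.round(Math.log2(n)),    // O(log n)
    linear: n,                                // O(n)
    linearithmic: Math.round(n * Math.log2(n)), // O(n log n)
    quadratic: n * n,                         // O(n²)
  }
}

// At n = 10 this matches the table: 1, 3, 10, 33, 100
// At n = 50 the gap is already dramatic: quadratic hits 2,500
```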
Key Points
- Describes an upper bound on the growth rate (typically the worst case), not exact time
- We drop constants: O(2n) → O(n)
- We keep only the dominant term: O(n² + n) → O(n²)
- Common complexities: O(1) < O(log n) < O(n) < O(n log n) < O(n²) < O(2ⁿ)
- Space complexity measures memory usage growth
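The "drop constants, keep the dominant term" rules can be seen by counting actual operations. This is a hypothetical demonstration (the function `countOps` is made up for illustration):

```javascript
// Counts operations to show why O(n² + 2n) simplifies to O(n²):
// as n grows, the n² term dominates everything else.
function countOps(n) {
  let ops = 0
  for (let i = 0; i < n; i++) ops++   // first pass: n ops
  for (let i = 0; i < n; i++) ops++   // second pass: n more ops (the "2n")
  for (let i = 0; i < n; i++) {
    for (let j = 0; j < n; j++) ops++ // nested loops: n² ops
  }
  return ops // total: n² + 2n
}

// countOps(10)   → 120        (100 of them from n²)
// countOps(1000) → 1,002,000  (1,000,000 of them from n² — ~99.8%)
```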
Code Examples
O(1) - Constant
```javascript
function getFirst(arr) {
  return arr[0]
}

// No matter if arr has 10 or 10 million
// elements, we do ONE operation
// Array access by index is O(1)
```

Always the same time, regardless of input size.
O(n) - Linear
```javascript
function findMax(arr) {
  let max = arr[0]
  for (let i = 1; i < arr.length; i++) {
    if (arr[i] > max) {
      max = arr[i]
    }
  }
  return max
}

// Must check EVERY element once
// 10 elements → ~10 operations
// 1000 elements → ~1000 operations
```

Time grows proportionally with input size.
O(n²) - Quadratic
```javascript
function hasDuplicate(arr) {
  for (let i = 0; i < arr.length; i++) {
    for (let j = i + 1; j < arr.length; j++) {
      if (arr[i] === arr[j]) {
        return true
      }
    }
  }
  return false
}

// Nested loops over the same array
// 10 elements → ~100 comparisons
// 1000 elements → ~1,000,000 comparisons
```

Each element is compared with every other element.
O(log n) - Logarithmic
```javascript
// Requires a sorted array
function binarySearch(arr, target) {
  let left = 0
  let right = arr.length - 1
  while (left <= right) {
    const mid = Math.floor((left + right) / 2)
    if (arr[mid] === target) return mid
    if (arr[mid] < target) {
      left = mid + 1 // Eliminate left half
    } else {
      right = mid - 1 // Eliminate right half
    }
  }
  return -1
}

// Each step eliminates HALF the elements
// 1000 elements → ~10 steps
// 1,000,000 elements → ~20 steps
```

Halving the problem at each step.
O(n log n) - Linearithmic
```javascript
// Most efficient comparison-based sorts,
// like Merge Sort and Quick Sort
function mergeSort(arr) {
  if (arr.length <= 1) return arr
  const mid = Math.floor(arr.length / 2)
  const left = mergeSort(arr.slice(0, mid))
  const right = mergeSort(arr.slice(mid))
  return merge(left, right) // merge combines two sorted arrays in O(n)
}

// Divides log(n) times, each level does n work
// This is the "sweet spot" for sorting
```

Divide and conquer: split log(n) times, doing n work at each level.
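The `mergeSort` function above relies on a `merge` helper that isn't shown. A minimal sketch of one possible implementation:

```javascript
// Merges two already-sorted arrays into one sorted array in O(n) time
function merge(left, right) {
  const result = []
  let i = 0
  let j = 0
  while (i < left.length && j < right.length) {
    if (left[i] <= right[j]) {
      result.push(left[i++]) // take the smaller front element
    } else {
      result.push(right[j++])
    }
  }
  // One side may still have leftover (already sorted) elements
  return result.concat(left.slice(i), right.slice(j))
}
```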
Space Complexity
```javascript
// O(1) space - in-place
function reverseInPlace(arr) {
  let left = 0, right = arr.length - 1
  while (left < right) {
    [arr[left], arr[right]] = [arr[right], arr[left]]
    left++
    right--
  }
}

// O(n) space - creates a new array
function reverseCopy(arr) {
  const result = []
  for (let i = arr.length - 1; i >= 0; i--) {
    result.push(arr[i])
  }
  return result
}
```

Space complexity tracks the additional memory used, beyond the input itself.
Common Mistakes
- Confusing O(n) with exact number of operations
- Forgetting that O(1) doesn't mean "fast", just constant
- Not considering space complexity alongside time
- Thinking O(log n) is always better (constants matter for small n)
Interview Tips
- Always state time AND space complexity of your solution
- Know complexities of common operations (array access, hash lookup, sort)
- Be ready to explain WHY something is O(n²) vs O(n)
- Mention trade-offs: "I can do O(n) time with O(n) space, or O(n²) time with O(1) space"
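The time/space trade-off in the last tip maps directly onto the `hasDuplicate` example earlier: the nested-loop version is O(n²) time with O(1) space, while a Set-based version (sketched below; the name `hasDuplicateFast` is illustrative) spends O(n) extra space to get O(n) time:

```javascript
// Same problem as hasDuplicate, but O(n) time with O(n) space:
// a Set remembers every value seen so far.
function hasDuplicateFast(arr) {
  const seen = new Set()
  for (const value of arr) {
    if (seen.has(value)) return true // O(1) average lookup
    seen.add(value)                  // O(1) average insert
  }
  return false
}
```

In an interview, stating both versions and letting the interviewer pick the constraint (memory vs. speed) is exactly the trade-off discussion the tip recommends.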