Core Algorithms

Time and Space Complexity: Big O Notation Explained

Learn how to analyze algorithm efficiency using Big O notation. Understand time complexity, space complexity, and how to evaluate the performance of your code.

18 min read · complexity · big-o · time complexity · space complexity · algorithms

Why Algorithm Complexity Matters

When designing systems, you'll often need to choose between different algorithms or data structures. Understanding complexity analysis helps you make informed decisions about which approach will scale best as your data grows.

Interview insight: "What's the time complexity of..." is one of the most common interview questions. Master this and you'll impress your interviewers.


Big O Notation Basics

Big O notation describes the worst-case growth rate of an algorithm as the input size increases. It tells us how an algorithm's runtime or memory usage scales, not the exact time it takes.

Common Complexities (Ranked from Best to Worst)

Complexity    Name           Example
O(1)          Constant       Array index access
O(log n)      Logarithmic    Binary search
O(n)          Linear         Simple loop
O(n log n)    Linearithmic   Merge sort
O(n²)         Quadratic      Nested loops
O(2ⁿ)         Exponential    Recursive Fibonacci
O(n!)         Factorial      Permutations
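To get a rough feel for how these classes diverge, here is a short sketch that prints approximate "step counts" for a few input sizes (the exact numbers are illustrative, not measurements of real runtime):

```python
import math

# Rough "step counts" for each complexity class at a few input sizes
rows = []
for n in [10, 100, 1000]:
    rows.append((n, round(math.log2(n)), round(n * math.log2(n)), n ** 2))

for n, log_n, n_log_n, n_sq in rows:
    print(f"n={n}: log n≈{log_n}, n log n≈{n_log_n}, n²={n_sq}")
```

Notice that at n = 1000, an O(log n) algorithm does ~10 steps while an O(n²) one does a million: the gap widens dramatically as n grows.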

Analyzing Time Complexity

O(1) — Constant Time

python
# No matter how large the array, this takes the same time
def get_first_element(arr):
    return arr[0]  # O(1)

O(log n) — Logarithmic Time

python
# Binary search: halving the search space each step
def binary_search(arr, target):
    left, right = 0, len(arr) - 1
    while left <= right:
        mid = (left + right) // 2  # O(1)
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            left = mid + 1
        else:
            right = mid - 1
    return -1

O(n) — Linear Time

python
# Must check each element once
def find_max(arr):
    max_val = arr[0]
    for num in arr:  # Runs n times
        if num > max_val:
            max_val = num
    return max_val

O(n log n) — Linearithmic Time

python
# Most efficient comparison-based sorting algorithms
def merge_sort(arr):
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])   # halving: O(log n) levels of recursion
    right = merge_sort(arr[mid:])
    return merge(left, right)      # merge does O(n) work at each level
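The merge helper used above isn't shown; a minimal version (one possible implementation) interleaves two already-sorted lists in O(n) time:

```python
def merge(left, right):
    # Interleave two already-sorted lists in O(n) time
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i])
            i += 1
        else:
            out.append(right[j])
            j += 1
    # One side is exhausted; append the remainder of the other
    out.extend(left[i:])
    out.extend(right[j:])
    return out
```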

O(n²) — Quadratic Time

python
# Nested loops often indicate quadratic complexity
def bubble_sort(arr):
    n = len(arr)
    for i in range(n):          # O(n)
        for j in range(n - i - 1):  # O(n)
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
⚠️ Common mistake: keeping non-dominant terms. O(n² + n) is just O(n²), because as n grows large, n² dominates n.
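To see why the dominant term wins, consider this hypothetical function (written for illustration) that does one linear pass followed by a nested pass. Its cost is O(n + n²), which simplifies to O(n²):

```python
def sum_and_count_zero_pairs(arr):
    # O(n): one linear pass
    total = sum(arr)
    # O(n²): nested loops dominate, so the whole function is O(n²)
    pairs = 0
    for x in arr:
        for y in arr:
            if x + y == 0:
                pairs += 1
    return total, pairs
```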


Space Complexity

Space complexity measures memory usage, not runtime. The same Big O categories apply.

In-Place vs Out-of-Place

python
# Space: O(n) - creates a new array
def double_all(arr):
    return [x * 2 for x in arr]

# Space: O(1) - modifies in place
def double_all_in_place(arr):
    for i in range(len(arr)):
        arr[i] *= 2

Recursive Space

python
# Space: O(n) - each call adds to the call stack
def factorial(n):
    if n <= 1:
        return 1
    return n * factorial(n - 1)
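For contrast, an iterative version (a sketch) computes the same result without growing the call stack, so its extra space is O(1):

```python
# Space: O(1) — a single accumulator, no call-stack growth
def factorial_iterative(n):
    result = 1
    for i in range(2, n + 1):
        result *= i
    return result
```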

Dropping Constants and Non-Dominant Terms

In Big O, we don't care about constants because they don't change the growth rate:

  • O(2n) = O(n) — Constants are dropped
  • O(n + n log n + 1) = O(n log n) — Only keep the dominant term
python
# Two sequential loops: O(n + n) = O(2n), which simplifies to O(n)
def print_all(arr):
    for item in arr:  # O(n)
        print(item)
    for item in arr:  # O(n)
        print(item)

Best, Average, and Worst Case

Algorithms can have different complexity for different inputs:

python
def search(arr, target):
    for item in arr:
        if item == target:
            return True
    return False

# Best case: O(1) - target is first element
# Worst case: O(n) - target is last or not present
# Average case: O(n)
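To see the cases concretely, here is a hypothetical wrapper (for illustration only) that runs the same linear search but also reports how many comparisons it made:

```python
def search_with_steps(arr, target):
    # Same linear search as above, but counts comparisons
    steps = 0
    for item in arr:
        steps += 1
        if item == target:
            return True, steps
    return False, steps

# Best case: target is the first element -> 1 comparison
# Worst case: target is absent -> len(arr) comparisons
```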
💡 Big O vs Big Theta vs Big Omega:

  • Big O: Upper bound (worst case)
  • Big Omega: Lower bound (best case)
  • Big Theta: Tight bound (same as both)

What to Remember for Interviews

  1. Know the orders: O(1) < O(log n) < O(n) < O(n log n) < O(n²) < O(2ⁿ) < O(n!)
  2. Identify the pattern: Single loop = O(n), nested loop = O(n²), halving = O(log n)
  3. Focus on worst case: Big O describes worst-case growth
  4. Consider both: Time and space complexity matter
  5. Drop the constants: O(2n) is just O(n)

Practice: Look at any piece of code and try to determine its Big O. Start with simple examples and work your way up to complex nested structures.
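As a starting exercise, here is a made-up function (hypothetical, written for practice) that mixes the patterns above. Try to name its complexity before reading the final comment:

```python
def mystery(arr):
    total = 0
    for x in arr:          # O(n)
        for y in arr:      # O(n), nested -> O(n²)
            total += x * y
    n = len(arr)
    while n > 1:           # halving -> O(log n)
        n //= 2
        total += 1
    return total

# O(n² + log n) = O(n²): the nested loops dominate the halving loop
```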