Mid-Level Fundamentals · 7 min read

Sorting Algorithms Compared — When to Use Which

The Interview Question

Compare quicksort, merge sort, and heap sort. When would you choose each one?

Expert Answer

All three are O(n log n) comparison-based sorts, but their tradeoffs suit each one to different scenarios.

Quicksort picks a pivot, partitions elements around it, and recurses on each side. Average case is O(n log n); the worst case is O(n²) with bad pivot selection. In practice, though, it is the fastest of the three thanks to cache locality (in-place partitioning with sequential memory access). Use quicksort as your default for in-memory sorting; it is the basis of most standard library sort functions (with optimizations like introsort, which falls back to heap sort to cap the worst case).

Merge sort divides the array in half, recursively sorts both halves, and merges the results. It is O(n log n) in every case, since the worst case matches the average, but it requires O(n) extra space for merging. Use merge sort when you need a stable sort (equal elements keep their original order) or when sorting linked lists, where merging is natural and no random access is needed.

Heap sort builds a max-heap and repeatedly extracts the maximum. It is O(n log n) worst case with O(1) extra space, but it has poor cache performance due to non-sequential memory access. Use heap sort when you need a guaranteed O(n log n) bound with no extra memory.

For non-comparison sorts: counting sort is O(n + k) when the range of keys k is small, and radix sort is O(d · n) for d-digit numbers.

Key Points to Hit in Your Answer

  • Quicksort: fastest in practice (cache-friendly), O(n²) worst case, not stable
  • Merge sort: guaranteed O(n log n), stable, O(n) space, great for linked lists
  • Heap sort: guaranteed O(n log n), O(1) space, poor cache locality
  • Python's sorted() uses Timsort — hybrid merge+insertion sort, stable, O(n log n)
  • Counting/radix sort break the O(n log n) comparison sort lower bound
  • Stable sort: equal elements keep their original relative order
  • Ω(n log n) comparisons is the theoretical lower bound for any comparison-based sort
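The stability point is easy to demonstrate with Python's `sorted()` (backed by Timsort); the record names below are illustrative:

```python
# Stability demo: sorted() keeps elements with equal keys in input order.
records = [("alice", 2), ("bob", 1), ("carol", 2), ("dave", 1)]
by_score = sorted(records, key=lambda r: r[1])
# Equal scores preserve original relative order:
# bob before dave (both 1), alice before carol (both 2).
print(by_score)  # [('bob', 1), ('dave', 1), ('alice', 2), ('carol', 2)]
```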

Code Example

# Quicksort with random pivot (avoids the O(n^2) worst case on sorted input).
# Note: this concise version allocates new lists (O(n) extra space); the
# cache-friendly, in-place behavior described above requires partitioning
# the array in place (e.g., a Lomuto or Hoare partition).
import random

def quicksort(arr):
    if len(arr) <= 1: return arr
    pivot = arr[random.randint(0, len(arr) - 1)]
    left = [x for x in arr if x < pivot]
    middle = [x for x in arr if x == pivot]
    right = [x for x in arr if x > pivot]
    return quicksort(left) + middle + quicksort(right)

# Merge sort
def merge_sort(arr):
    if len(arr) <= 1: return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    return merge(left, right)

def merge(left, right):
    result = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:  # <= for stability
            result.append(left[i]); i += 1
        else:
            result.append(right[j]); j += 1
    result.extend(left[i:])
    result.extend(right[j:])
    return result
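The answer also covers heap sort, so here is a minimal in-place sketch to round out the set (helper names are illustrative): build a max-heap, then repeatedly swap the maximum to the end and restore the heap property. This gives O(n log n) worst case with O(1) extra space, at the cost of the non-sequential access pattern noted above.

```python
# Heap sort: O(n log n) worst case, O(1) extra space, not stable.
def heap_sort(arr):
    def sift_down(root, end):
        # Push arr[root] down until the max-heap property holds on arr[root:end+1].
        while 2 * root + 1 <= end:
            child = 2 * root + 1
            if child + 1 <= end and arr[child] < arr[child + 1]:
                child += 1  # pick the larger child
            if arr[root] < arr[child]:
                arr[root], arr[child] = arr[child], arr[root]
                root = child
            else:
                return

    n = len(arr)
    # Heapify: sift down every internal node, from the last parent to the root.
    for start in range(n // 2 - 1, -1, -1):
        sift_down(start, n - 1)
    # Extract: move the current max to the end, shrink the heap, re-sift.
    for end in range(n - 1, 0, -1):
        arr[0], arr[end] = arr[end], arr[0]
        sift_down(0, end - 1)
    return arr
```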

What Interviewers Are Really Looking For

Know the tradeoffs cold: quicksort for speed, merge sort for stability and guarantees, heap sort for space. A common follow-up is "why is quicksort faster in practice despite the same Big O?"; the answer is cache locality and smaller constant factors. Knowing that Python uses Timsort (and what it is) shows awareness of real implementations. The Ω(n log n) lower bound for comparison sorts is a common theory question.
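To back up the lower-bound point, it helps to show how a non-comparison sort sidesteps it. A minimal counting sort sketch, assuming non-negative integer keys in a known small range [0, k]:

```python
# Counting sort: O(n + k) time. It counts occurrences instead of comparing
# elements, which is why the comparison-sort lower bound does not apply.
def counting_sort(arr, k):
    """Sort non-negative integers in the range [0, k]."""
    counts = [0] * (k + 1)
    for x in arr:
        counts[x] += 1
    result = []
    for value, count in enumerate(counts):
        result.extend([value] * count)  # emit each value `count` times
    return result
```

The catch is the k term: if the key range is huge (say, arbitrary 64-bit integers), the counts array dominates and a comparison sort wins.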

Practice This Question with AI Grading

Reading about interview questions is a start — but practicing with real-time AI feedback is how you actually get better. Goliath Prep grades your answers instantly and tells you exactly what you're missing.

Start Practicing Free →