
Programming Languages and Techniques I

Basic Sorting Algorithms


Why This Matters

Sorting algorithms are the foundation of computational thinking—they're not just about putting numbers in order, but about understanding tradeoffs between time, space, and implementation complexity. When you're tested on sorting, you're really being tested on your ability to analyze algorithmic efficiency, recognize when one approach outperforms another, and understand core paradigms like divide-and-conquer versus incremental construction.

Don't just memorize that Bubble Sort is O(n^2); know why it's slow (nested comparisons), when it might actually be acceptable (nearly sorted data), and how it compares to algorithms that use smarter strategies. The real exam questions will ask you to choose the right algorithm for a scenario, trace through execution, or explain why one approach beats another. Master the concepts, and the facts will stick.


Simple Iterative Sorts

These algorithms use straightforward loops and comparisons, making them easy to implement but inefficient at scale. They compare or move elements one at a time without breaking the problem into smaller subproblems.

Bubble Sort

  • Repeatedly swaps adjacent elements if they're in the wrong order—the largest unsorted value "bubbles" to the end each pass
  • Time complexity: O(n^2) average and worst case, but can exit early if no swaps occur (best case O(n) with the optimization shown in the sketch below)
  • Stable sort—equal elements maintain their original relative order, useful when sorting by multiple keys
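
A minimal Java sketch of the full pass structure, including the early-exit optimization (class and variable names are illustrative):

```java
import java.util.Arrays;

public class BubbleSortDemo {
    // Repeatedly swaps adjacent out-of-order pairs; each pass "bubbles"
    // the largest unsorted value to the end of the array.
    static void bubbleSort(int[] a) {
        for (int pass = 0; pass < a.length - 1; pass++) {
            boolean swapped = false;
            for (int i = 0; i < a.length - 1 - pass; i++) {
                if (a[i] > a[i + 1]) {
                    int tmp = a[i];
                    a[i] = a[i + 1];
                    a[i + 1] = tmp;
                    swapped = true;
                }
            }
            if (!swapped) break; // no swaps: already sorted, best case O(n)
        }
    }

    public static void main(String[] args) {
        int[] data = {5, 1, 4, 2, 8};
        bubbleSort(data);
        System.out.println(Arrays.toString(data)); // [1, 2, 4, 5, 8]
    }
}
```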

Selection Sort

  • Finds the minimum element in the unsorted region and swaps it to the front—divides array into sorted and unsorted sections
  • Time complexity: O(n^2) in every case; it always performs the same number of comparisons regardless of input order (see the sketch after this list)
  • Not stable but in-place—requires no extra memory, but may reorder equal elements during swaps
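
A minimal Java sketch of the select-and-swap loop; the single long-range swap per pass is both why the swap count stays low and why stability is lost (names are illustrative):

```java
import java.util.Arrays;

public class SelectionSortDemo {
    // Grows a sorted prefix: each pass finds the minimum of the
    // unsorted region and swaps it to the boundary.
    static void selectionSort(int[] a) {
        for (int boundary = 0; boundary < a.length - 1; boundary++) {
            int min = boundary;
            for (int i = boundary + 1; i < a.length; i++) {
                if (a[i] < a[min]) min = i;
            }
            // Exactly one swap per pass (n-1 total); this long-range swap
            // can jump over equal elements, which is what breaks stability.
            int tmp = a[boundary];
            a[boundary] = a[min];
            a[min] = tmp;
        }
    }

    public static void main(String[] args) {
        int[] data = {64, 25, 12, 22, 11};
        selectionSort(data);
        System.out.println(Arrays.toString(data)); // [11, 12, 22, 25, 64]
    }
}
```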

Compare: Bubble Sort vs. Selection Sort. Both are O(n^2) and work well only on small datasets, but Bubble Sort is stable while Selection Sort is not. Selection Sort makes fewer swaps (exactly n-1), so if swap cost is high, it may perform better in practice.

Insertion Sort

  • Builds sorted array incrementally—takes each element and inserts it into its correct position among previously sorted elements
  • Adaptive performance: O(n) best case when data is already sorted, O(n^2) worst case; excellent for nearly sorted input (sketched below)
  • Stable and in-place—preserves equal element order while using only constant extra memory
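
A minimal Java sketch of the shift-and-insert loop; the strict > comparison is what keeps equal elements in their original order (names are illustrative):

```java
import java.util.Arrays;

public class InsertionSortDemo {
    // Builds a sorted prefix one element at a time, shifting larger
    // elements right to open a slot for the current value.
    static void insertionSort(int[] a) {
        for (int i = 1; i < a.length; i++) {
            int current = a[i];
            int j = i - 1;
            // On nearly sorted input this loop barely runs: O(n) best case.
            while (j >= 0 && a[j] > current) { // strict > preserves stability
                a[j + 1] = a[j]; // shift right
                j--;
            }
            a[j + 1] = current;
        }
    }

    public static void main(String[] args) {
        int[] data = {2, 1, 3, 4, 5}; // nearly sorted: very few shifts needed
        insertionSort(data);
        System.out.println(Arrays.toString(data)); // [1, 2, 3, 4, 5]
    }
}
```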

Compare: Insertion Sort vs. Bubble Sort. Both are stable O(n^2) algorithms, but Insertion Sort is significantly faster on nearly sorted data and generally preferred. If an FRQ asks which simple sort works best on "almost sorted" input, Insertion Sort is your answer.


Divide-and-Conquer Sorts

These algorithms break the problem into smaller subproblems, solve them recursively, and combine results. This strategy achieves O(n log n) time by reducing the number of comparisons needed.

Merge Sort

  • Splits array in half recursively, sorts each half, then merges sorted halves back together using a two-pointer technique
  • Guaranteed O(n log n) in all cases: no worst-case degradation, making it predictable and reliable
  • Stable but not in-place: requires O(n) auxiliary space for temporary arrays during merging (see the sketch below)
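
A minimal Java sketch using a half-open [lo, hi) index convention; the <= in the merge step is what preserves stability (names and the index convention are illustrative choices):

```java
import java.util.Arrays;

public class MergeSortDemo {
    // Recursively sorts the halves of a[lo..hi), then merges them.
    static void mergeSort(int[] a, int lo, int hi) {
        if (hi - lo <= 1) return; // 0 or 1 elements: already sorted
        int mid = lo + (hi - lo) / 2;
        mergeSort(a, lo, mid);
        mergeSort(a, mid, hi);
        merge(a, lo, mid, hi);
    }

    // Two-pointer merge of the sorted halves a[lo..mid) and a[mid..hi).
    static void merge(int[] a, int lo, int mid, int hi) {
        int[] tmp = new int[hi - lo]; // the O(n) auxiliary space
        int i = lo, j = mid, k = 0;
        while (i < mid && j < hi) {
            // <= takes from the left half on ties, keeping the sort stable.
            tmp[k++] = (a[i] <= a[j]) ? a[i++] : a[j++];
        }
        while (i < mid) tmp[k++] = a[i++];
        while (j < hi) tmp[k++] = a[j++];
        System.arraycopy(tmp, 0, a, lo, tmp.length);
    }

    public static void main(String[] args) {
        int[] data = {38, 27, 43, 3, 9, 82, 10};
        mergeSort(data, 0, data.length);
        System.out.println(Arrays.toString(data)); // [3, 9, 10, 27, 38, 43, 82]
    }
}
```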

Quick Sort

  • Partitions around a pivot—elements smaller than pivot go left, larger go right, then recursively sorts each partition
  • Average O(n log n), worst case O(n^2): poor pivot selection (e.g., already sorted input with a first-element pivot) causes the degradation
  • In-place but not stable: uses O(log n) stack space for recursion and is often the fastest in practice due to cache efficiency (see the sketch below)
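
A minimal Java sketch using the Lomuto partition scheme with a last-element pivot, one common way to implement the generic partitioning described above (names and the pivot rule are illustrative; a fixed pivot like this is exactly the kind that degrades on sorted input):

```java
import java.util.Arrays;

public class QuickSortDemo {
    // Sorts a[lo..hi] in place by partitioning around a pivot,
    // then recursing on the two sides.
    static void quickSort(int[] a, int lo, int hi) {
        if (lo >= hi) return;
        int p = partition(a, lo, hi);
        quickSort(a, lo, p - 1); // elements smaller than the pivot
        quickSort(a, p + 1, hi); // elements at least as large as the pivot
    }

    // Lomuto scheme: pivot = last element. Randomizing the pivot choice
    // is one standard fix for the O(n^2) sorted-input worst case.
    static int partition(int[] a, int lo, int hi) {
        int pivot = a[hi];
        int i = lo; // boundary of the "smaller than pivot" region
        for (int j = lo; j < hi; j++) {
            if (a[j] < pivot) {
                int tmp = a[i]; a[i] = a[j]; a[j] = tmp;
                i++;
            }
        }
        int tmp = a[i]; a[i] = a[hi]; a[hi] = tmp; // place the pivot
        return i;
    }

    public static void main(String[] args) {
        int[] data = {10, 80, 30, 90, 40, 50, 70};
        quickSort(data, 0, data.length - 1);
        System.out.println(Arrays.toString(data)); // [10, 30, 40, 50, 70, 80, 90]
    }
}
```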

Compare: Merge Sort vs. Quick Sort. Both use divide-and-conquer and average O(n log n), but Merge Sort guarantees this performance while Quick Sort can degrade to O(n^2). However, Quick Sort is in-place and typically faster in practice. Choose Merge Sort when stability or guaranteed performance matters; choose Quick Sort when memory is limited.


Quick Reference Table

Concept | Best Examples
O(n^2) time complexity | Bubble Sort, Selection Sort, Insertion Sort
O(n log n) time complexity | Merge Sort, Quick Sort (average)
Stable sorting | Bubble Sort, Insertion Sort, Merge Sort
In-place sorting | Selection Sort, Insertion Sort, Quick Sort
Divide-and-conquer paradigm | Merge Sort, Quick Sort
Best for nearly sorted data | Insertion Sort
Guaranteed worst-case performance | Merge Sort
Best practical performance | Quick Sort

Self-Check Questions

  1. Which two simple sorts are both O(n^2) but differ in stability? What makes one stable and the other not?

  2. You need to sort a large dataset where memory is limited and average-case performance matters most. Which algorithm would you choose and why?

  3. Compare and contrast Merge Sort and Quick Sort: What do they share in strategy, and what are the key tradeoffs between them?

  4. A dataset is already 90% sorted. Which algorithm would perform closest to O(n), and what property makes this possible?

  5. If you needed to sort student records by grade (preserving alphabetical order for students with the same grade), which O(n log n) algorithm would you use and why?