
Heap Operations


Why This Matters

Heaps are the backbone of priority queues and show up constantly in algorithm questions—from implementing Dijkstra's shortest path to understanding why heap sort achieves $O(n \log n)$ complexity. You're being tested on more than just knowing what a heap looks like; you need to understand why certain operations take $O(\log n)$ time while others run in $O(n)$, and when to use each operation in real algorithmic contexts.

The key insight is that heaps exploit tree height to achieve efficiency. Every operation you'll study either moves elements up or down the tree (taking $O(\log n)$ time) or processes nodes level by level (sometimes achieving surprising $O(n)$ efficiency). Don't just memorize the time complexities—know which direction elements move and why that determines the cost.


Building and Restructuring Heaps

These operations create heap structure from scratch or restore it after violations. The core mechanism is comparing parent-child relationships and swapping when the heap property is violated.

Heapify

  • Restores heap property at a single node—assumes subtrees are already valid heaps, then "bubbles down" the node to its correct position
  • Bottom-up traversal starts from the last non-leaf node (index $\lfloor n/2 \rfloor - 1$ in a 0-indexed array) and works toward the root
  • $O(\log n)$ per call since the element may traverse the full height of the tree
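
To make the mechanism concrete, here is a minimal sift-down sketch for a 0-indexed, array-based max-heap; the name sift_down and the (array, index, size) parameter layout are illustrative choices, not a fixed API:

```python
def sift_down(a, i, n):
    """Restore the max-heap property at index i of a[0:n].

    Assumes both subtrees of i are already valid max-heaps.
    """
    while True:
        left, right = 2 * i + 1, 2 * i + 2   # children in a 0-indexed array
        largest = i
        if left < n and a[left] > a[largest]:
            largest = left
        if right < n and a[right] > a[largest]:
            largest = right
        if largest == i:                     # heap property already holds here
            return
        a[i], a[largest] = a[largest], a[i]  # swap down one level
        i = largest
```

Each iteration descends one level of the tree, so the loop runs at most $O(\log n)$ times.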

Build Heap

  • Transforms an unsorted array into a valid heap in $O(n)$ time—surprisingly faster than inserting $n$ elements individually
  • Calls heapify on all non-leaf nodes from bottom to top, exploiting the fact that most nodes are near the leaves
  • Mathematical insight: nodes near the bottom do minimal work, and there are fewer nodes near the top—the total work is bounded by $\sum_{h=0}^{\lfloor \log n \rfloor} \lceil n/2^{h+1} \rceil \cdot O(h)$, a sum that converges to $O(n)$
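
A compact sketch of the bottom-up build, reusing the sift_down helper from the Heapify sketch above:

```python
def build_max_heap(a):
    """Turn an arbitrary list into a max-heap, bottom-up, in O(n)."""
    n = len(a)
    # Leaves are already trivial heaps, so start at the last non-leaf
    # node (index n // 2 - 1) and sift down each internal node in turn.
    for i in range(n // 2 - 1, -1, -1):
        sift_down(a, i, n)   # sift_down as sketched in the Heapify section
```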

Compare: Build Heap vs. repeated Insert—both create a heap, but Build Heap is $O(n)$ while $n$ insertions cost $O(n \log n)$. If an FRQ asks about the most efficient way to heapify an existing array, Build Heap is always your answer.


Single-Element Operations

These operations add, remove, or modify individual elements while preserving heap structure. The key mechanism is "bubbling"—moving elements up or down until the heap property is restored.

Insert

  • Places new element at the end, then bubbles up—comparing with parent and swapping if the heap property is violated
  • $O(\log n)$ worst case since the element might travel from leaf to root
  • Essential for dynamic heaps where elements arrive one at a time, like in streaming data scenarios
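
A minimal bubble-up insert for the same array-based max-heap (the function name is illustrative):

```python
def heap_insert(a, value):
    """Append value, then bubble it up until the heap property holds."""
    a.append(value)                            # new element starts at the end
    i = len(a) - 1
    while i > 0:
        parent = (i - 1) // 2
        if a[parent] >= a[i]:                  # parent at least as large: done
            break
        a[i], a[parent] = a[parent], a[i]      # swap up one level
        i = parent
```

Python's standard library exposes the min-heap equivalent as heapq.heappush.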

Delete (Extract Max/Min)

  • Removes and returns the root, which is always the max (max-heap) or min (min-heap)
  • Replacement strategy: move the last element to root, then bubble down by swapping with the larger (max-heap) or smaller (min-heap) child
  • $O(\log n)$ time makes this the foundation of efficient priority queue dequeue operations
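
A sketch of the replacement strategy, again reusing the sift_down helper from the Heapify sketch:

```python
def extract_max(a):
    """Remove and return the root of a non-empty max-heap."""
    top = a[0]
    last = a.pop()               # detach the last element
    if a:                        # if anything remains...
        a[0] = last              # ...move it to the root
        sift_down(a, 0, len(a))  # and bubble it down to its place
    return top
```

In Python's standard library, heapq.heappop performs the min-heap equivalent.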

Increase/Decrease Key

  • Modifies an element's priority in place—in a max-heap, increasing a key bubbles the element up while decreasing it bubbles the element down (the directions flip in a min-heap)
  • Requires knowing the element's index, which is why priority queues often maintain a separate index map
  • Critical for graph algorithms like Dijkstra's and Prim's, where edge relaxation requires updating priorities
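
A max-heap increase-key sketch; it assumes the caller already knows the element's index i (for instance, from the index map mentioned above), since searching for the element by value would itself cost $O(n)$:

```python
def increase_key(a, i, new_value):
    """Raise a[i] to new_value (must not decrease it), then bubble up."""
    assert new_value >= a[i], "use the decrease-key/bubble-down path instead"
    a[i] = new_value
    while i > 0:
        parent = (i - 1) // 2
        if a[parent] >= a[i]:                # heap property restored
            break
        a[i], a[parent] = a[parent], a[i]    # swap up one level
        i = parent
```

Note that Python's heapq offers no decrease-key operation; Dijkstra implementations built on it typically push a fresh entry and lazily skip stale ones instead.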

Compare: Insert vs. Increase Key—both can bubble up, but Insert always starts at the bottom while Increase Key operates on an existing element at any position. Decrease Key mirrors Extract Max/Min's bubble-down logic.


Multi-Element and Sorting Operations

These operations process the entire heap or combine multiple heaps. They leverage the heap's structure to achieve efficient bulk processing.

Heap Sort

  • Two-phase algorithm: Build Heap ($O(n)$), then repeatedly Extract Max/Min ($O(n \log n)$ total)
  • In-place sorting with $O(1)$ extra space—extracted elements fill the array from the end
  • Not stable, meaning equal elements may not preserve their original relative order—important distinction from merge sort
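
Both phases reduce to the same sift-down primitive; a compact sketch, reusing the sift_down helper from the Heapify section above:

```python
def heap_sort(a):
    """Sort a in place, ascending, using a max-heap."""
    n = len(a)
    for i in range(n // 2 - 1, -1, -1):   # phase 1: Build Heap, O(n)
        sift_down(a, i, n)
    for end in range(n - 1, 0, -1):       # phase 2: n extractions, O(n log n)
        a[0], a[end] = a[end], a[0]       # current max goes to its final slot
        sift_down(a, 0, end)              # re-heapify the shrunken prefix
```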

Merge Heaps

  • Combines two heaps into one by concatenating arrays and calling Build Heap on the result
  • $O(n + m)$ naive approach where $n$ and $m$ are the heap sizes—more sophisticated heap variants (like binomial heaps) can merge in $O(\log n)$
  • Use case: combining priority queues from parallel processes or merging sorted streams
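
The naive approach is short enough to show directly with Python's standard-library heapq (a min-heap), whose heapify performs the linear-time bottom-up build:

```python
import heapq

h1 = [1, 5, 9]            # two existing min-heaps
h2 = [2, 3, 8]

merged = h1 + h2          # concatenate the arrays: O(n + m)
heapq.heapify(merged)     # rebuild bottom-up: O(n + m)
print(merged[0])          # root is the overall minimum -> 1
```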

Compare: Heap Sort vs. other $O(n \log n)$ sorts—heap sort sorts in place where merge sort does not, but merge sort is stable where heap sort is not. Quick sort is typically faster in practice, but heap sort guarantees $O(n \log n)$ worst case.


Quick Reference Table

| Concept | Best Examples |
| --- | --- |
| Bubble Up (toward root) | Insert, Increase Key |
| Bubble Down (toward leaves) | Heapify, Extract Max/Min, Decrease Key |
| $O(\log n)$ operations | Insert, Delete, Increase/Decrease Key, single Heapify |
| $O(n)$ operations | Build Heap |
| $O(n \log n)$ operations | Heap Sort, $n$ insertions |
| Priority queue essentials | Insert, Extract Max/Min, Increase/Decrease Key |
| Graph algorithm support | Decrease Key (Dijkstra's, Prim's) |
| In-place algorithms | Heap Sort, Build Heap |

Self-Check Questions

  1. Why does Build Heap run in $O(n)$ time even though it calls Heapify (an $O(\log n)$ operation) on $O(n)$ nodes?

  2. Which two operations use the "bubble up" mechanism, and in what situation would you choose one over the other?

  3. Compare and contrast Heap Sort with Merge Sort in terms of space complexity, stability, and worst-case time complexity.

  4. If you're implementing Dijkstra's algorithm and need to update a vertex's distance, which heap operation do you use and why does it require $O(\log n)$ time?

  5. You have an unsorted array of 1,000 elements and need to create a min-heap. Explain why calling Build Heap is more efficient than inserting elements one by one, and calculate the time complexity difference.