Heaps are the backbone of priority queues and show up constantly in algorithm questions, from implementing Dijkstra's shortest path to understanding why heap sort achieves O(n log n) complexity. You're being tested on more than just knowing what a heap looks like; you need to understand why certain operations take O(log n) time while others run in O(n), and when to use each operation in real algorithmic contexts.
Heaps exploit their logarithmic tree height to achieve efficiency. Every operation you'll study either moves elements up or down the tree (taking O(log n) time) or processes nodes level by level (sometimes achieving a surprising O(n) bound). Don't just memorize the time complexities. Know which direction elements move and why that determines the cost.
These operations create heap structure from scratch or restore it after violations. The core mechanism is comparing parent-child relationships and swapping when the heap property is violated.
Heapify restores the heap property at a single node. It assumes both subtrees below that node are already valid heaps, then sifts down (bubbles down) the node to its correct position by repeatedly swapping it with its larger child (max-heap) or smaller child (min-heap).
When people say "heapify the whole array," they usually mean calling this sift-down procedure on every non-leaf node in bottom-up order. That process is Build Heap (below).
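As a concrete sketch, here is what sift-down might look like for a max-heap stored in a Python list (the function name and zero-indexed array layout are illustrative assumptions, not from the source):

```python
def sift_down(heap, i, size):
    """Restore the max-heap property at index i, assuming both
    subtrees below i are already valid max-heaps."""
    while True:
        largest = i
        left, right = 2 * i + 1, 2 * i + 2
        if left < size and heap[left] > heap[largest]:
            largest = left
        if right < size and heap[right] > heap[largest]:
            largest = right
        if largest == i:
            break  # heap property holds at i; stop sifting
        # Swap with the larger child and continue one level down
        heap[i], heap[largest] = heap[largest], heap[i]
        i = largest
```

Each iteration moves the node one level down, so the loop runs at most as many times as the tree is tall.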
Build Heap transforms an unsorted array into a valid heap in O(n) time. That's surprisingly faster than inserting elements one at a time.
How it works: start at the last non-leaf node (index floor(n/2) - 1 in a zero-indexed array) and call the sift-down procedure on each node, working backward to the root.
Why O(n) and not O(n log n)? Most nodes live near the bottom of the tree. A node at depth d only sifts down at most h - d levels (where h is the tree height). The leaves (roughly half the nodes) do zero work. The nodes one level above the leaves do at most 1 swap each. Only the root does up to log n work. When you sum all of this across every node, the series converges to O(n).
Compare: Build Heap vs. repeated Insert: both create a heap, but Build Heap is O(n) while n individual insertions cost O(n log n). If a question asks about the most efficient way to heapify an existing array, Build Heap is always your answer.
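A minimal self-contained sketch of the bottom-up construction (names are illustrative):

```python
def build_max_heap(a):
    """Turn an unsorted list into a max-heap in O(n) by sifting
    down every non-leaf node, from the last one back to the root."""
    n = len(a)
    for i in range(n // 2 - 1, -1, -1):  # last non-leaf down to index 0
        # Sift a[i] down to its correct position
        j = i
        while True:
            largest, left, right = j, 2 * j + 1, 2 * j + 2
            if left < n and a[left] > a[largest]:
                largest = left
            if right < n and a[right] > a[largest]:
                largest = right
            if largest == j:
                break
            a[j], a[largest] = a[largest], a[j]
            j = largest
```

Note the loop starts at `n // 2 - 1`: every index past that is a leaf and needs no work, which is exactly where the O(n) savings come from.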
These operations add, remove, or modify individual elements while preserving heap structure. The key mechanism is "bubbling": moving elements up or down until the heap property is restored.
Insert adds a new element to the heap while maintaining the heap property.
This "bubble up" process takes O(log n) worst case since the element might travel from leaf to root. This is the standard way to add elements to a dynamic heap, such as when data arrives one item at a time in a streaming scenario.
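Here is a sketch of insertion into a max-heap stored as a Python list (the function name is an assumption for illustration):

```python
def heap_insert(heap, value):
    """Append value at the end, then bubble it up until its
    parent is at least as large (max-heap property)."""
    heap.append(value)
    i = len(heap) - 1
    while i > 0 and heap[(i - 1) // 2] < heap[i]:
        # Parent is smaller: swap and move up one level
        heap[(i - 1) // 2], heap[i] = heap[i], heap[(i - 1) // 2]
        i = (i - 1) // 2
```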
Extract Max (max-heap) or Extract Min (min-heap) removes and returns the root, which always holds the highest- or lowest-priority element.
Extraction takes O(log n) time because the sift-down may traverse the full height. This is the foundation of efficient priority queue dequeue operations.
You can also delete an arbitrary element (not just the root) if you know its index: replace it with the last element, reduce the size, then either bubble up or sift down depending on whether the replacement is larger or smaller than the original.
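A sketch of Extract Max for a list-based max-heap: move the last element to the root, shrink the heap, and sift down (names are illustrative):

```python
def extract_max(heap):
    """Remove and return the root of a max-heap."""
    top = heap[0]
    last = heap.pop()          # take the last element off the array
    if heap:                   # if anything remains, re-seat it at the root
        heap[0] = last
        i, n = 0, len(heap)
        while True:
            largest, left, right = i, 2 * i + 1, 2 * i + 2
            if left < n and heap[left] > heap[largest]:
                largest = left
            if right < n and heap[right] > heap[largest]:
                largest = right
            if largest == i:
                break
            heap[i], heap[largest] = heap[largest], heap[i]
            i = largest
    return top
```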
Increase Key and Decrease Key modify an element's priority in place.
Both run in O(log n) time. These operations require knowing the element's index in the array, which is why priority queue implementations often maintain a separate index map (a hash map from element to array index).
These are critical for graph algorithms like Dijkstra's and Prim's, where edge relaxation requires updating a vertex's distance (priority) after discovering a shorter path.
Compare: Insert vs. Increase Key: both can bubble up, but Insert always starts at the bottom of the heap, while Increase Key operates on an existing element at any position. Decrease Key (in a max-heap) mirrors Extract Max's sift-down logic.
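A sketch of Increase Key for a max-heap: after raising the value, only an upward pass is needed, since the element can only have become larger than its ancestors (function name is illustrative):

```python
def increase_key(heap, i, new_value):
    """Raise heap[i] to new_value in a max-heap, then bubble up."""
    assert new_value >= heap[i], "new value must not be smaller"
    heap[i] = new_value
    while i > 0 and heap[(i - 1) // 2] < heap[i]:
        heap[(i - 1) // 2], heap[i] = heap[i], heap[(i - 1) // 2]
        i = (i - 1) // 2
```

In a min-heap, the mirror operation Decrease Key uses the same upward loop with the comparison flipped.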
These operations process the entire heap or combine multiple heaps, leveraging the heap's structure for efficient bulk processing.
Heap Sort is a two-phase, comparison-based sorting algorithm.
Phase 1 builds a max-heap in O(n). Phase 2 performs n - 1 extractions, each costing O(log n), so the total is O(n log n).
Heap Sort is in-place (O(1) extra space) because extracted elements fill the array from the back. However, it is not stable: equal elements may not preserve their original relative order.
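The two phases can be sketched as a single in-place routine (a minimal illustration, not the only way to structure it):

```python
def heap_sort(a):
    """In-place heap sort: build a max-heap, then repeatedly swap
    the max to the back and shrink the heap boundary."""
    def sift_down(i, size):
        while True:
            largest, left, right = i, 2 * i + 1, 2 * i + 2
            if left < size and a[left] > a[largest]:
                largest = left
            if right < size and a[right] > a[largest]:
                largest = right
            if largest == i:
                return
            a[i], a[largest] = a[largest], a[i]
            i = largest

    n = len(a)
    for i in range(n // 2 - 1, -1, -1):   # Phase 1: build heap, O(n)
        sift_down(i, n)
    for end in range(n - 1, 0, -1):       # Phase 2: n - 1 extractions
        a[0], a[end] = a[end], a[0]       # current max goes to the back
        sift_down(0, end)                 # restore heap in the prefix
```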
Merge Heaps combines two heaps into one.
This gives O(n + m) time complexity for heaps of sizes n and m. More sophisticated heap variants (binomial heaps, Fibonacci heaps) can merge in O(log n) or even amortized O(1), but for a standard binary heap, the concatenate-and-rebuild approach is what you should know.
A typical use case is combining priority queues from parallel processes or merging sorted streams.
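The concatenate-and-rebuild approach for two max-heaps can be sketched as (the function name is an assumption):

```python
def merge_heaps(h1, h2):
    """Merge two list-based max-heaps by concatenating and
    rebuilding with bottom-up sift-down: O(n + m)."""
    merged = h1 + h2
    n = len(merged)
    for i in range(n // 2 - 1, -1, -1):   # Build Heap on the combined array
        j = i
        while True:
            largest, left, right = j, 2 * j + 1, 2 * j + 2
            if left < n and merged[left] > merged[largest]:
                largest = left
            if right < n and merged[right] > merged[largest]:
                largest = right
            if largest == j:
                break
            merged[j], merged[largest] = merged[largest], merged[j]
            j = largest
    return merged
```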
Compare: Heap Sort vs. other sorts: Heap Sort is in-place (unlike Merge Sort, which needs O(n) extra space) but not stable (unlike Merge Sort). Quick Sort is typically faster in practice due to better cache behavior, but Heap Sort guarantees O(n log n) worst case while Quick Sort degrades to O(n^2) without careful pivot selection.
| Concept | Best Examples |
|---|---|
| Bubble Up (toward root) | Insert, Increase Key (max-heap) / Decrease Key (min-heap) |
| Sift Down (toward leaves) | Heapify, Extract Max/Min, Decrease Key (max-heap) / Increase Key (min-heap) |
| O(log n) operations | Insert, Delete, Increase/Decrease Key, single Heapify call |
| O(n) operations | Build Heap |
| O(n log n) operations | Heap Sort, n individual insertions |
| Priority queue essentials | Insert, Extract Max/Min, Increase/Decrease Key |
| Graph algorithm support | Decrease Key (Dijkstra's, Prim's) |
| In-place algorithms | Heap Sort, Build Heap |
Why does Build Heap run in O(n) time even though it calls Heapify (an O(log n) operation) on roughly n/2 nodes?
Which two operations use the "bubble up" mechanism, and in what situation would you choose one over the other?
Compare Heap Sort with Merge Sort in terms of space complexity, stability, and worst-case time complexity.
If you're implementing Dijkstra's algorithm and need to update a vertex's distance, which heap operation do you use, and why does it require O(log n) time?
You have an unsorted array of 1,000 elements and need to create a min-heap. Explain why calling Build Heap is more efficient than inserting elements one by one, and quantify the time complexity difference.