Heaps are the backbone of priority queues and show up constantly in algorithm questions, from implementing Dijkstra's shortest path to understanding why heap sort achieves O(n log n) complexity. You're being tested on more than just knowing what a heap looks like; you need to understand why certain operations take O(log n) time while others run in O(n), and when to use each operation in real algorithmic contexts.
The key insight is that heaps exploit tree height to achieve efficiency. Every operation you'll study either moves elements up or down the tree (taking O(log n) time, since the height of a heap with n elements is O(log n)) or processes nodes level by level (sometimes achieving surprising efficiency). Don't just memorize the time complexities—know which direction elements move and why that determines the cost.
These operations create heap structure from scratch or restore it after violations. The core mechanism is comparing parent-child relationships and swapping when the heap property is violated.
Compare: Build Heap vs. repeated Insert—both create a heap, but Build Heap is O(n) while n individual insertions cost O(n log n). If an FRQ asks about the most efficient way to heapify an existing array, Build Heap is always your answer.
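The bottom-up Build Heap idea can be sketched as follows. This is a minimal array-based max-heap sketch; the names `sift_down` and `build_heap` are illustrative, not from any particular library.

```python
def sift_down(a, i, n):
    """Bubble a[i] down until the max-heap property holds below it: O(log n)."""
    while True:
        left, right, largest = 2 * i + 1, 2 * i + 2, i
        if left < n and a[left] > a[largest]:
            largest = left
        if right < n and a[right] > a[largest]:
            largest = right
        if largest == i:
            return  # heap property restored
        a[i], a[largest] = a[largest], a[i]
        i = largest  # keep following the swapped element down

def build_heap(a):
    """Heapify every non-leaf node, bottom-up; total work is O(n)."""
    n = len(a)
    for i in range(n // 2 - 1, -1, -1):  # leaves need no work
        sift_down(a, i, n)

data = [3, 9, 2, 1, 4, 5]
build_heap(data)
print(data[0])  # root now holds the maximum: 9
```

The O(n) bound comes from the bottom-up order: most nodes sit near the leaves and can only sift down a short distance, so the per-level costs sum to a constant times n rather than n log n.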
These operations add, remove, or modify individual elements while preserving heap structure. The key mechanism is "bubbling"—moving elements up or down until the heap property is restored.
Compare: Insert vs. Increase Key—both can bubble up, but Insert always starts at the bottom while Increase Key operates on an existing element at any position. Decrease Key mirrors Extract Max/Min's bubble-down logic.
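The bubble-up mechanism behind Insert and Increase Key can be sketched like this for a max-heap; the helper names (`sift_up`, `insert`, `increase_key`) are illustrative, not a standard API.

```python
def sift_up(a, i):
    """Bubble a[i] up toward the root while it exceeds its parent: O(log n)."""
    while i > 0:
        parent = (i - 1) // 2
        if a[parent] >= a[i]:
            return
        a[i], a[parent] = a[parent], a[i]
        i = parent

def insert(a, key):
    """Insert always starts at the bottom: append, then bubble up."""
    a.append(key)
    sift_up(a, len(a) - 1)

def increase_key(a, i, new_key):
    """Increase Key starts wherever the element already lives."""
    assert new_key >= a[i], "new key must not be smaller in a max-heap"
    a[i] = new_key
    sift_up(a, i)

heap = [9, 4, 5, 1, 3, 2]   # a valid max-heap
insert(heap, 7)             # enters at the bottom, bubbles up one level
increase_key(heap, 3, 10)   # existing element's key grows, bubbles to the root
print(heap[0])  # 10
```

Both functions share `sift_up`; the only difference is the starting position, which is exactly the contrast the comparison above draws.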
These operations process the entire heap or combine multiple heaps. They leverage the heap's structure to achieve efficient bulk processing.
Compare: Heap Sort vs. other sorts—heap sort is in-place (unlike merge sort) but not stable (merge sort is). Quick sort is typically faster in practice, but heap sort guarantees an O(n log n) worst case.
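Heap Sort's two phases—Build Heap, then repeated extract-max—fit in one short sketch. This is an illustrative in-place implementation, not tuned for production use.

```python
def heap_sort(a):
    """Sort a list in place: O(n log n) worst case, O(1) extra space."""
    def sift_down(i, n):
        while True:
            left, right, largest = 2 * i + 1, 2 * i + 2, i
            if left < n and a[left] > a[largest]:
                largest = left
            if right < n and a[right] > a[largest]:
                largest = right
            if largest == i:
                return
            a[i], a[largest] = a[largest], a[i]
            i = largest

    n = len(a)
    for i in range(n // 2 - 1, -1, -1):   # phase 1: Build Heap, O(n)
        sift_down(i, n)
    for end in range(n - 1, 0, -1):       # phase 2: n-1 extract-max steps
        a[0], a[end] = a[end], a[0]       # move current max to its final slot
        sift_down(0, end)                 # restore heap over the prefix

nums = [5, 2, 9, 1, 7]
heap_sort(nums)
print(nums)  # [1, 2, 5, 7, 9]
```

Because the sorted suffix grows inside the same array the heap shrinks out of, no auxiliary buffer is needed—this is the in-place property the comparison above highlights.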
| Concept | Best Examples |
|---|---|
| Bubble Up (toward root) | Insert, Increase Key |
| Bubble Down (toward leaves) | Heapify, Extract Max/Min, Decrease Key |
| O(log n) operations | Insert, Delete, Increase/Decrease Key, single Heapify |
| O(n) operations | Build Heap |
| O(n log n) operations | Heap Sort, n repeated insertions |
| Priority queue essentials | Insert, Extract Max/Min, Increase/Decrease Key |
| Graph algorithm support | Decrease Key (Dijkstra's, Prim's) |
| In-place algorithms | Heap Sort, Build Heap |
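For the priority-queue rows above, Python's standard-library `heapq` module is a handy reference implementation: it maintains a binary min-heap on a plain list.

```python
import heapq

pq = []
heapq.heappush(pq, (7, "taskC"))  # (priority, payload) tuples compare by priority
heapq.heappush(pq, (2, "taskA"))
heapq.heappush(pq, (5, "taskB"))
print(heapq.heappop(pq))  # (2, 'taskA') — smallest priority extracted first
```

Note that `heapq` provides no Decrease Key operation. Dijkstra-style code built on it typically pushes a fresh entry with the smaller distance and discards stale entries when they are popped (lazy deletion), trading a larger heap for the missing operation.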
Why does Build Heap run in O(n) time even though it calls Heapify (an O(log n) operation) on roughly n/2 nodes?
Which two operations use the "bubble up" mechanism, and in what situation would you choose one over the other?
Compare and contrast Heap Sort with Merge Sort in terms of space complexity, stability, and worst-case time complexity.
If you're implementing Dijkstra's algorithm and need to update a vertex's distance, which heap operation do you use and why does it require O(log n) time?
You have an unsorted array of 1,000 elements and need to create a min-heap. Explain why calling Build Heap is more efficient than inserting elements one by one, and calculate the time complexity difference.