Greedy algorithms represent one of the most elegant paradigms in algorithm design—they solve complex optimization problems by making locally optimal choices at each step, hoping these lead to a globally optimal solution. You're being tested not just on knowing these algorithms exist, but on understanding when and why the greedy approach works. This means recognizing the greedy choice property (a local optimum leads to a global optimum) and optimal substructure (an optimal solution contains optimal solutions to subproblems).
These applications span critical areas: graph algorithms, data compression, scheduling, and resource allocation. Exam questions frequently ask you to prove greedy correctness, analyze time complexity, or identify when greedy fails. Don't just memorize the algorithms—know what structural property each problem exploits and be ready to contrast greedy solutions with dynamic programming alternatives.
Graph-based greedy algorithms exploit the property that building solutions edge-by-edge or vertex-by-vertex can yield optimal results. The key insight is that committing to locally best edges or paths doesn't preclude finding the global optimum.
Compare: Kruskal's vs. Dijkstra's—both use greedy edge selection with priority structures, but MST algorithms consider global edge weights while Dijkstra's tracks cumulative path distances. If an FRQ asks about greedy graph algorithms, clarify whether you're optimizing total weight (MST) or path length (shortest path).
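To make the "cumulative path distances" idea concrete, here is a minimal sketch of Dijkstra's algorithm using a binary heap. The function name `dijkstra` and the adjacency-list representation (`dict` of vertex to `(neighbor, weight)` pairs) are illustrative choices, not from the source; the greedy step is popping the closest unsettled vertex.

```python
import heapq

def dijkstra(graph, source):
    """Greedy shortest paths: repeatedly settle the unsettled vertex with the
    smallest known distance. graph maps vertex -> list of (neighbor, weight);
    all weights must be non-negative for the greedy choice to be safe."""
    dist = {source: 0}
    heap = [(0, source)]  # (distance-so-far, vertex)
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry; u was already settled at a shorter distance
        for v, w in graph[u]:
            nd = d + w  # cumulative path distance, not just the edge weight
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist
```

Note the contrast with Kruskal's: an MST algorithm would compare raw edge weights, while the heap here is keyed on the cumulative distance from the source.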
These problems share a common structure: given a set of tasks with constraints, select or order them optimally. The greedy insight is that sorting by the right criterion—usually finish time or deadline—enables optimal sequential selection.
Compare: Activity Selection vs. Job Sequencing—both are scheduling problems, but activity selection maximizes count (sort by finish time) while job sequencing maximizes profit (sort by profit). Know which sorting criterion matches which objective.
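A minimal sketch of the activity-selection greedy, assuming activities are given as `(start, finish)` pairs (the function name and tuple layout are illustrative):

```python
def select_activities(intervals):
    """Greedy activity selection: sort by finish time, then take every
    activity whose start is no earlier than the last selected finish."""
    selected = []
    last_finish = float("-inf")
    for start, finish in sorted(intervals, key=lambda iv: iv[1]):
        if start >= last_finish:
            selected.append((start, finish))
            last_finish = finish
    return selected
```

Sorting by finish time is the whole trick: the earliest-finishing compatible activity leaves the most room for the rest, which is exactly the exchange argument used in optimality proofs.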
Greedy compression algorithms build optimal encodings by prioritizing frequent elements. The mechanism relies on constructing prefix-free codes where no codeword is a prefix of another, enabling unambiguous decoding.
Compare: Huffman Coding vs. Decoding—encoding requires building the tree (O(n log n) for n symbols, using a min-heap) while decoding only traverses it (O(b) for a b-bit message). FRQs may ask you to construct a Huffman tree from frequencies or trace a decoding sequence.
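The tree-building step can be sketched with a min-heap: repeatedly merge the two least frequent nodes. The function name `huffman_codes`, the tuple-based tree nodes, and the tie-breaking counter are all illustrative choices, not from the source.

```python
import heapq
from itertools import count

def huffman_codes(freq):
    """Greedy Huffman coding: repeatedly merge the two least frequent nodes.
    freq maps symbol -> frequency; returns symbol -> prefix-free bit string."""
    tiebreak = count()  # keeps heap entries comparable when frequencies tie
    heap = [(f, next(tiebreak), sym) for sym, f in freq.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)   # two cheapest subtrees...
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, next(tiebreak), (left, right)))  # ...merged
    _, _, root = heap[0]
    codes = {}
    def walk(node, prefix):
        if isinstance(node, tuple):  # internal node: (left, right)
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:
            codes[node] = prefix
    walk(root, "")
    return codes
```

Because no symbol sits at an internal node, no codeword can be a prefix of another, which is what makes decoding a simple root-to-leaf walk.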
These problems involve distributing limited resources optimally. Greedy works when items can be ranked by a single efficiency metric and selected accordingly.
Compare: Fractional vs. 0/1 Knapsack—fractional allows greedy (sort by ratio), but 0/1 requires dynamic programming because you can't "undo" taking a whole item. This is a classic exam question on greedy limitations.
Some greedy algorithms—such as Egyptian fraction decomposition and coin change—solve number-theoretic problems by iteratively selecting the largest valid component (the largest unit fraction or coin denomination that still fits).
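Coin change illustrates the pattern, and also its limits: the sketch below (function name illustrative) is optimal for canonical systems like US denominations but not in general.

```python
def greedy_coin_change(coins, amount):
    """Greedy change-making: repeatedly take the largest coin that still fits.
    Optimal only for canonical coin systems; can overshoot the minimum count
    otherwise. Returns the list of coins used, or None if amount is unreachable."""
    used = []
    for coin in sorted(coins, reverse=True):
        while amount >= coin:
            used.append(coin)
            amount -= coin
    return used if amount == 0 else None
```

For the system {1, 3, 4} and amount 6, this greedy takes 4 + 1 + 1 (three coins) even though 3 + 3 (two coins) is optimal, which is why non-canonical coin change belongs in the "greedy fails" column and calls for dynamic programming instead.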
| Concept | Best Examples |
|---|---|
| Graph optimization (edges/paths) | MST (Kruskal's, Prim's), Dijkstra's |
| Scheduling by finish time | Activity Selection, Interval Scheduling |
| Scheduling by profit | Job Sequencing with Deadlines |
| Data compression | Huffman Coding, Huffman Decoding |
| Resource allocation with divisibility | Fractional Knapsack |
| Greedy works conditionally | Coin Change (canonical systems only) |
| Greedy not optimal | Egyptian Fraction, 0/1 Knapsack, Coin Change (non-canonical) |
| Requires non-negative weights | Dijkstra's Algorithm |
Both Activity Selection and Interval Scheduling use the same greedy strategy. What sorting criterion do they share, and why does it guarantee optimality?
Compare Kruskal's and Prim's algorithms: what data structures does each use, and how do their greedy selection strategies differ?
Why does the greedy approach work for Fractional Knapsack but fail for 0/1 Knapsack? What property does the fractional version have that the 0/1 version lacks?
Given a coin system {1, 3, 4}, explain why greedy fails to find the minimum coins for amount 6. What algorithm would you use instead?
(FRQ-style) Prove that Dijkstra's algorithm fails on graphs with negative edge weights by constructing a small counterexample. Then explain what property of greedy algorithms is violated.