
🧩Intro to Algorithms

Greedy Algorithm Applications


Why This Matters

Greedy algorithms represent one of the most elegant paradigms in algorithm design—they solve complex optimization problems by making locally optimal choices at each step, hoping these lead to a globally optimal solution. You're being tested not just on knowing these algorithms exist, but on understanding when and why the greedy approach works. This means recognizing the greedy choice property (a local optimum leads to a global optimum) and optimal substructure (an optimal solution contains optimal solutions to subproblems).

These applications span critical areas: graph algorithms, data compression, scheduling, and resource allocation. Exam questions frequently ask you to prove greedy correctness, analyze time complexity, or identify when greedy fails. Don't just memorize the algorithms—know what structural property each problem exploits and be ready to contrast greedy solutions with dynamic programming alternatives.


Graph Optimization Problems

Graph-based greedy algorithms exploit the property that building solutions edge-by-edge or vertex-by-vertex can yield optimal results. The key insight is that committing to locally best edges or paths doesn't preclude finding the global optimum.

Minimum Spanning Tree (Kruskal's and Prim's Algorithms)

  • MST connects all vertices with minimum total edge weight—both algorithms guarantee optimality but use different greedy strategies
  • Kruskal's sorts edges by weight and adds them greedily while avoiding cycles using a Union-Find data structure; runs in O(E log E)
  • Prim's grows the tree from a starting vertex, always adding the minimum-weight edge crossing the cut; uses a priority queue for O(E log V) complexity
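
Kruskal's strategy can be sketched in a few lines. This is a minimal illustration, not a production implementation: edges are `(weight, u, v)` tuples, vertices are integers `0..n-1`, and the union-find uses path compression but omits union by rank for brevity.

```python
# Minimal sketch of Kruskal's MST algorithm.
# Edges: list of (weight, u, v); vertices numbered 0..num_vertices-1.

def kruskal(num_vertices, edges):
    parent = list(range(num_vertices))     # union-find forest

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path compression
            x = parent[x]
        return x

    mst, total = [], 0
    for w, u, v in sorted(edges):          # greedy: cheapest edge first
        ru, rv = find(u), find(v)
        if ru != rv:                       # skip edges that would close a cycle
            parent[ru] = rv
            mst.append((u, v, w))
            total += w
    return mst, total
```

On a triangle with weights 1, 2, 3, the algorithm keeps the two cheapest edges and rejects the third as a cycle, giving total weight 3.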

Dijkstra's Shortest Path Algorithm

  • Finds shortest paths from a source to all vertices in graphs with non-negative edge weights only
  • Greedy selection via priority queue—repeatedly extracts the vertex with smallest tentative distance and relaxes its neighbors
  • Time complexity is O((V + E) log V) with a binary heap; fails on negative weights because the greedy choice becomes invalid
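
The extract-and-relax loop above can be sketched as follows. This is an illustrative version, assuming an adjacency-list dict `{u: [(v, w), ...]}` and non-negative weights; stale priority-queue entries are skipped rather than decreased in place.

```python
import heapq

# Sketch of Dijkstra's algorithm; returns shortest distances from src.
# Assumes all edge weights w are non-negative.

def dijkstra(graph, src):
    dist = {src: 0}
    pq = [(0, src)]                        # (tentative distance, vertex)
    while pq:
        d, u = heapq.heappop(pq)           # greedy: closest unsettled vertex
        if d > dist.get(u, float('inf')):
            continue                       # stale entry; u already settled
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float('inf')):
                dist[v] = nd               # relax edge (u, v)
                heapq.heappush(pq, (nd, v))
    return dist
```

Note how the algorithm commits to a vertex's distance the moment it is popped; a negative edge discovered later could undercut that committed distance, which is exactly why the greedy choice breaks down.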

Compare: Kruskal's vs. Dijkstra's—both use greedy edge selection with priority structures, but MST algorithms consider global edge weights while Dijkstra's tracks cumulative path distances. If an FRQ asks about greedy graph algorithms, clarify whether you're optimizing total weight (MST) or path length (shortest path).


Scheduling and Selection Problems

These problems share a common structure: given a set of tasks with constraints, select or order them optimally. The greedy insight is that sorting by the right criterion—usually finish time or deadline—enables optimal sequential selection.

Activity Selection Problem

  • Select maximum non-overlapping activities—the classic greedy scheduling problem with a clean optimality proof
  • Sort by finish time, then greedily select the next activity that starts after the previous one ends
  • Correctness is proven via an exchange argument—any optimal solution can be transformed to match the greedy choice without losing optimality
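
The sort-then-scan strategy is short enough to write out directly. A minimal sketch, assuming activities are `(start, finish)` pairs and that an activity may start exactly when the previous one ends:

```python
# Sketch of greedy activity selection: sort by finish time, then take each
# activity whose start is no earlier than the last selected finish.

def select_activities(activities):
    chosen, last_finish = [], float('-inf')
    for start, finish in sorted(activities, key=lambda a: a[1]):
        if start >= last_finish:           # compatible with the last choice
            chosen.append((start, finish))
            last_finish = finish
    return chosen
```

The same function solves interval scheduling in the next subsection, since the two problems are equivalent.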

Interval Scheduling

  • Equivalent to activity selection—select maximum non-overlapping intervals from a set defined by start and end times
  • Greedy by earliest finish time guarantees leaving maximum room for subsequent intervals
  • Runs in O(n log n) due to sorting; the selection phase is linear

Job Sequencing with Deadlines

  • Maximize profit by scheduling jobs before their deadlines—each job has a profit and a deadline constraint
  • Sort jobs by decreasing profit, then assign each to the latest available slot before its deadline
  • Uses a greedy slot-filling strategy—prioritizing high-profit jobs ensures maximum total profit when slots are limited
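
The slot-filling strategy can be sketched as below. This illustrative version assumes unit-length jobs given as `(profit, deadline)` pairs and scans slots linearly; a union-find over slots would make slot lookup near-constant.

```python
# Sketch of job sequencing with deadlines: jobs as (profit, deadline),
# one unit-length slot per time step 1..max_deadline.

def job_sequencing(jobs):
    max_deadline = max(d for _, d in jobs)
    slots = [None] * (max_deadline + 1)    # slots[1..max_deadline]
    total = 0
    for profit, deadline in sorted(jobs, reverse=True):  # highest profit first
        for t in range(min(deadline, max_deadline), 0, -1):
            if slots[t] is None:           # latest free slot before deadline
                slots[t] = profit
                total += profit
                break                      # job scheduled (or dropped if none)
    return total
```

Placing each job as late as possible keeps earlier slots open for jobs with tighter deadlines, which is what makes the greedy choice safe.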

Compare: Activity Selection vs. Job Sequencing—both are scheduling problems, but activity selection maximizes count (sort by finish time) while job sequencing maximizes profit (sort by profit). Know which sorting criterion matches which objective.


Data Compression

Greedy compression algorithms build optimal encodings by prioritizing frequent elements. The mechanism relies on constructing prefix-free codes where no codeword is a prefix of another, enabling unambiguous decoding.

Huffman Coding

  • Assigns variable-length codes based on character frequency—more frequent characters get shorter codes, minimizing expected message length
  • Builds a binary tree bottom-up by repeatedly merging the two lowest-frequency nodes; runs in O(n log n)
  • Produces optimal prefix codes—proven optimal among all prefix-free encoding schemes for known frequency distributions
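
The bottom-up merge can be sketched with a heap. In this illustrative version, nodes are tuples `(freq, tiebreak, symbol, left, right)`; the tiebreak counter is an implementation convenience so heap comparisons never recurse into subtrees.

```python
import heapq
from collections import Counter

# Sketch of Huffman tree construction and code assignment.
# Leaf nodes carry a symbol; internal nodes carry symbol=None.

def huffman_codes(text):
    counter = Counter(text)
    heap = [(freq, i, sym, None, None)
            for i, (sym, freq) in enumerate(counter.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        a = heapq.heappop(heap)            # two lowest-frequency nodes
        b = heapq.heappop(heap)
        heapq.heappush(heap, (a[0] + b[0], tiebreak, None, a, b))
        tiebreak += 1
    codes = {}
    def walk(node, prefix):
        _, _, sym, left, right = node
        if sym is not None:
            codes[sym] = prefix or "0"     # single-symbol edge case
        else:
            walk(left, prefix + "0")
            walk(right, prefix + "1")
    walk(heap[0], "")
    return codes
```

For the input "aaabbc", the most frequent symbol 'a' ends up with a 1-bit code and the rarer 'b' and 'c' with 2-bit codes, exactly the frequency-to-length tradeoff described above.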

Huffman Decoding

  • Reconstructs original data by traversing the Huffman tree—read bits and follow left (0) or right (1) branches until reaching a leaf
  • Prefix-free property guarantees unique decodability—no ambiguity in parsing the compressed bitstream
  • Linear time in encoded message length—each bit requires constant work to traverse one tree edge

Compare: Huffman Coding vs. Decoding—encoding requires building the tree (O(n log n)) while decoding only traverses it (O(m) for m bits). FRQs may ask you to construct a Huffman tree from frequencies or trace a decoding sequence.


Resource Allocation Problems

These problems involve distributing limited resources optimally. Greedy works when items can be ranked by a single efficiency metric and selected accordingly.

Fractional Knapsack Problem

  • Maximize value in a weight-limited knapsack when items can be divided—unlike 0/1 knapsack, fractions are allowed
  • Sort by value-to-weight ratio v_i / w_i and greedily add items until capacity is reached
  • Greedy is optimal here but fails for 0/1 knapsack—the ability to take fractions is what enables the greedy choice property
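
The ratio-sorting strategy can be sketched directly. This illustrative version takes items as `(value, weight)` pairs and assumes all weights are positive:

```python
# Sketch of fractional knapsack: take whole items in decreasing
# value-to-weight ratio, then a fraction of the first item that won't fit.

def fractional_knapsack(items, capacity):
    total = 0.0
    for value, weight in sorted(items, key=lambda it: it[0] / it[1],
                                reverse=True):
        if capacity <= 0:
            break
        take = min(weight, capacity)       # whole item, or the fitting fraction
        total += value * (take / weight)
        capacity -= take
    return total
```

On the classic instance—items (60, 10), (100, 20), (120, 30) with capacity 50—the first two items fit whole and 2/3 of the third is taken, for a total value of 240.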

Coin Change Problem (Canonical Coin Systems)

  • Find minimum coins to make a target amount—greedy works for canonical systems like US coins (1, 5, 10, 25)
  • Always select the largest denomination that fits—repeat until the amount is reached
  • Greedy fails for non-canonical systems—for coins {1, 3, 4}, making 6 greedily gives {4, 1, 1} (3 coins) instead of optimal {3, 3} (2 coins)
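
Both behaviors—success on canonical systems, failure on {1, 3, 4}—are easy to demonstrate with a minimal sketch:

```python
# Greedy coin change sketch: always take the largest denomination that fits.
# Optimal for canonical systems (e.g., US coins); suboptimal for {1, 3, 4}.

def greedy_coin_change(coins, amount):
    result = []
    for coin in sorted(coins, reverse=True):   # largest denomination first
        while amount >= coin:
            result.append(coin)
            amount -= coin
    return result if amount == 0 else None     # None if amount unreachable
```

For US coins, `greedy_coin_change([1, 5, 10, 25], 63)` yields six coins, which is optimal; for `greedy_coin_change([1, 3, 4], 6)` it yields the three-coin answer {4, 1, 1}, missing the optimal {3, 3}—a dynamic-programming solution is needed there.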

Compare: Fractional vs. 0/1 Knapsack—fractional allows greedy (sort by ratio), but 0/1 requires dynamic programming because you can't "undo" taking a whole item. This is a classic exam question on greedy limitations.


Mathematical Applications

Some greedy algorithms solve number-theoretic problems by iteratively selecting the largest valid component.

Egyptian Fraction

  • Represents any fraction as a sum of distinct unit fractions—fractions of the form 1/n
  • Greedy algorithm repeatedly subtracts the largest unit fraction that doesn't exceed the remainder: for a/b, use 1/⌈b/a⌉
  • Not always optimal in terms of fewest fractions—greedy may produce more terms than the minimum possible representation
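
The subtract-the-largest-unit-fraction loop can be sketched with exact integer arithmetic (integer ceiling avoids floating-point error):

```python
# Sketch of the greedy Egyptian fraction expansion of a/b (0 < a < b).
# Returns the list of unit-fraction denominators.

def egyptian_fraction(a, b):
    denominators = []
    while a > 0:
        n = -(-b // a)                     # integer ceiling of b/a
        denominators.append(n)             # subtract the unit fraction 1/n
        a, b = a * n - b, b * n            # a/b - 1/n = (a*n - b) / (b*n)
    return denominators
```

For example, 6/14 expands to 1/3 + 1/11 + 1/231; each step strictly reduces the numerator, which is why the loop terminates.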

Quick Reference Table

| Concept | Best Examples |
|---|---|
| Graph optimization (edges/paths) | MST (Kruskal's, Prim's), Dijkstra's |
| Scheduling by finish time | Activity Selection, Interval Scheduling |
| Scheduling by profit | Job Sequencing with Deadlines |
| Data compression | Huffman Coding, Huffman Decoding |
| Resource allocation with divisibility | Fractional Knapsack |
| Greedy works conditionally | Coin Change (canonical systems only) |
| Greedy not optimal | Egyptian Fraction, 0/1 Knapsack, Coin Change (non-canonical) |
| Requires non-negative weights | Dijkstra's Algorithm |

Self-Check Questions

  1. Both Activity Selection and Interval Scheduling use the same greedy strategy. What sorting criterion do they share, and why does it guarantee optimality?

  2. Compare Kruskal's and Prim's algorithms: what data structures does each use, and how do their greedy selection strategies differ?

  3. Why does the greedy approach work for Fractional Knapsack but fail for 0/1 Knapsack? What property does the fractional version have that the 0/1 version lacks?

  4. Given a coin system {1, 3, 4}, explain why greedy fails to find the minimum coins for amount 6. What algorithm would you use instead?

  5. (FRQ-style) Prove that Dijkstra's algorithm fails on graphs with negative edge weights by constructing a small counterexample. Then explain what property of greedy algorithms is violated.