Dynamic programming (DP) is one of the most heavily tested algorithm design paradigms in computer science courses—and for good reason. You're being tested on your ability to recognize when a problem exhibits optimal substructure and overlapping subproblems, the two hallmarks that signal DP is the right approach. These problems appear constantly in technical interviews, competitive programming, and real-world optimization scenarios from resource allocation to bioinformatics to natural language processing.
Understanding these classic DP problems isn't about memorizing solutions—it's about recognizing problem patterns. When you see a new problem, you should immediately ask: "Does this look like a knapsack variant? A sequence alignment problem? An optimization over intervals?" Each problem below represents a template that generalizes to dozens of related challenges. Don't just memorize the recurrence relations—know why each problem structure demands dynamic programming and how to identify similar problems on sight.
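The smallest illustration of those two hallmarks is Fibonacci: naive recursion recomputes the same subproblems exponentially often, while caching each result once makes the identical recurrence linear. A minimal Python sketch:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n: int) -> int:
    """fib(n) depends on fib(n-1) and fib(n-2): optimal substructure.
    Without the cache, fib(n-2) is recomputed inside both branches:
    overlapping subproblems. Memoizing turns O(2^n) calls into O(n)."""
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(50))  # 12586269025
```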
These problems operate on one or two sequences (strings, arrays) and build solutions by examining prefixes or suffixes. The key insight is that the optimal solution for a sequence depends on optimal solutions for shorter subsequences.
Compare: Longest Increasing Subsequence vs. Longest Common Subsequence—both find non-contiguous subsequences, but LIS operates on one sequence with an ordering constraint while LCS operates on two sequences finding shared elements. If asked to find similarities between two strings, think LCS; if asked about monotonic patterns in one sequence, think LIS.
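A side-by-side Python sketch of both recurrences makes the one-sequence vs. two-sequence distinction concrete (function names are illustrative):

```python
def lis_length(nums):
    # dp[i] = length of the longest increasing subsequence ending at index i
    dp = [1] * len(nums)
    for i in range(len(nums)):
        for j in range(i):
            if nums[j] < nums[i]:          # ordering constraint on ONE sequence
                dp[i] = max(dp[i], dp[j] + 1)
    return max(dp, default=0)

def lcs_length(a, b):
    # dp[i][j] = LCS length of prefixes a[:i] and b[:j] -- TWO sequences
    dp = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            if a[i - 1] == b[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1   # shared element: extend
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    return dp[len(a)][len(b)]

print(lis_length([10, 9, 2, 5, 3, 7, 101, 18]))  # 4 (e.g. 2, 3, 7, 18)
print(lcs_length("ABCBDAB", "BDCABA"))           # 4 (e.g. "BCBA")
```

Note the table shapes mirror the problem structure: LIS uses a 1D table indexed by one sequence, LCS a 2D table indexed by both.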
These problems measure the "distance" or "cost" to transform one string into another. The DP table typically has dimensions matching the two string lengths, with each cell representing the cost for corresponding prefixes.
Compare: Edit Distance vs. Longest Common Subsequence—both use similar 2D DP tables on string pairs, but LCS maximizes similarity while Edit Distance minimizes transformation cost. Edit Distance is the go-to when you need a numeric "difference score" between strings.
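A minimal Python sketch of Edit Distance shows the same 2D prefix table, now minimizing cost instead of maximizing matches:

```python
def edit_distance(a, b):
    # dp[i][j] = minimum operations to transform a[:i] into b[:j]
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i              # delete all i characters of a[:i]
    for j in range(n + 1):
        dp[0][j] = j              # insert all j characters of b[:j]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if a[i - 1] == b[j - 1]:
                dp[i][j] = dp[i - 1][j - 1]           # match: free
            else:
                dp[i][j] = 1 + min(dp[i - 1][j],      # delete from a
                                   dp[i][j - 1],      # insert into a
                                   dp[i - 1][j - 1])  # substitute
    return dp[m][n]

print(edit_distance("kitten", "sitting"))  # 3
```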
These problems require maximizing or minimizing an objective while respecting resource constraints. The key pattern is building optimal solutions by considering whether to include each item.
Compare: Knapsack vs. Subset Sum vs. Coin Change—all involve selecting items to hit a target. Knapsack maximizes value under a weight constraint; Subset Sum asks whether an exact target is achievable; Coin Change minimizes the number of coins needed to reach an exact target. Knapsack has two attributes per item (weight and value), while the others have one.
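The three recurrences can be sketched side by side in Python; note how the "two attributes vs. one" difference shows up directly in the inner update (function names are illustrative):

```python
def knapsack(weights, values, capacity):
    # dp[w] = best total value achievable with total weight <= w
    dp = [0] * (capacity + 1)
    for wt, val in zip(weights, values):
        for w in range(capacity, wt - 1, -1):   # reverse: each item used once
            dp[w] = max(dp[w], dp[w - wt] + val)  # two attributes: weight AND value
    return dp[capacity]

def subset_sum(nums, target):
    # reachable[t] = True if some subset sums to exactly t (decision problem)
    reachable = [False] * (target + 1)
    reachable[0] = True
    for x in nums:
        for t in range(target, x - 1, -1):
            reachable[t] = reachable[t] or reachable[t - x]
    return reachable[target]

def min_coins(coins, target):
    # dp[t] = fewest coins summing to exactly t (unbounded reuse of coins)
    INF = float("inf")
    dp = [0] + [INF] * target
    for t in range(1, target + 1):
        for c in coins:
            if c <= t and dp[t - c] + 1 < dp[t]:
                dp[t] = dp[t - c] + 1
    return dp[target] if dp[target] != INF else -1

print(knapsack([2, 3, 4], [3, 4, 5], 5))     # 7 (take weights 2 and 3)
print(subset_sum([3, 34, 4, 12, 5, 2], 9))   # True (4 + 5)
print(min_coins([1, 2, 5], 11))              # 3 (5 + 5 + 1)
```

All three run in time proportional to (number of items) × (target), which is pseudo-polynomial: polynomial in the numeric value of the target, not in its bit length.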
These problems optimize over ways to partition or combine elements. The insight is that optimal solutions for larger intervals depend on optimal solutions for smaller sub-intervals.
Compare: Matrix Chain Multiplication vs. Rod Cutting—both optimize over ways to partition a structure, but Matrix Chain considers all possible split points within every interval (O(n³)) while Rod Cutting makes one cut at a time from the end (O(n²)). Matrix Chain is the template for "interval DP" problems.
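A Python sketch of both makes the cost difference visible: Matrix Chain has three nested loops (interval length, interval start, split point), while Rod Cutting has only two (length, first cut):

```python
def matrix_chain(dims):
    # Matrix i has shape dims[i-1] x dims[i].
    # dp[i][j] = min scalar multiplications to compute the product of
    # matrices i..j -- classic interval DP.
    n = len(dims) - 1
    dp = [[0] * (n + 1) for _ in range(n + 1)]
    for length in range(2, n + 1):          # loop 1: interval length
        for i in range(1, n - length + 2):  # loop 2: interval start
            j = i + length - 1
            dp[i][j] = min(
                dp[i][k] + dp[k + 1][j] + dims[i - 1] * dims[k] * dims[j]
                for k in range(i, j)        # loop 3: every split point
            )
    return dp[1][n]

def rod_cutting(prices, n):
    # dp[i] = best revenue for a rod of length i;
    # prices[k-1] is the price of a piece of length k.
    dp = [0] * (n + 1)
    for i in range(1, n + 1):                   # loop 1: rod length
        dp[i] = max(prices[k - 1] + dp[i - k]   # loop 2: first cut only
                    for k in range(1, i + 1))
    return dp[n]

print(matrix_chain([10, 30, 5, 60]))  # 4500: multiply (10x30)(30x5) first
print(rod_cutting([1, 5, 8, 9, 10, 17, 17, 20], 8))  # 22 (cut into 2 + 6)
```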
These problems apply DP principles to graph structures, where optimal paths are built from optimal sub-paths.
Compare: Floyd-Warshall vs. Dijkstra—Floyd-Warshall computes all-pairs shortest paths in O(V³) and handles negative edge weights (as long as there are no negative cycles); Dijkstra computes single-source shortest paths in O((V + E) log V) with a binary heap but requires non-negative weights. Choose based on whether you need all pairs and whether negative weights exist.
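Floyd-Warshall's DP structure is especially clean: the subproblem is "shortest i-to-j path using only the first k vertices as intermediates." A minimal Python sketch on an adjacency matrix:

```python
def floyd_warshall(dist):
    # dist[i][j] = weight of edge i -> j (inf if absent, 0 on the diagonal).
    # After iteration k, dist[i][j] is the shortest i -> j path whose
    # intermediate vertices all come from {0, ..., k}.
    n = len(dist)
    dist = [row[:] for row in dist]   # copy: don't mutate the caller's matrix
    for k in range(n):                # allow vertex k as an intermediate
        for i in range(n):
            for j in range(n):
                if dist[i][k] + dist[k][j] < dist[i][j]:
                    dist[i][j] = dist[i][k] + dist[k][j]
    return dist

INF = float("inf")
graph = [
    [0,   3,   INF, 7],
    [8,   0,   2,   INF],
    [5,   INF, 0,   1],
    [2,   INF, INF, 0],
]
result = floyd_warshall(graph)
print(result[0][2])  # 5: path 0 -> 1 -> 2
```

The optimal-substructure claim is exactly the DP hallmark: any shortest path through vertex k splits into a shortest i-to-k path and a shortest k-to-j path.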
| Concept | Best Examples |
|---|---|
| Overlapping subproblems | Fibonacci, Rod Cutting, Coin Change |
| Sequence alignment | LCS, Edit Distance |
| Single-sequence optimization | LIS, Subset Sum |
| Constrained optimization | 0/1 Knapsack, Coin Change, Subset Sum |
| Interval DP | Matrix Chain Multiplication, Rod Cutting |
| Graph DP | Floyd-Warshall |
| Decision vs. optimization | Subset Sum (decision), Knapsack (optimization) |
| Pseudo-polynomial time | Knapsack, Subset Sum, Coin Change |
Both the Knapsack Problem and Coin Change Problem involve selecting items to reach a target—what is the key structural difference that changes how you formulate the DP recurrence?
You're given two strings and asked to find how "similar" they are. Which two problems from this guide could apply, and when would you choose one over the other?
Matrix Chain Multiplication and Rod Cutting both optimize over partitions. Why does Matrix Chain require O(n³) time while Rod Cutting only needs O(n²)?
Compare and contrast: How do the DP tables for Longest Common Subsequence and Edit Distance differ in what each cell represents, even though both operate on two strings?
A problem asks whether you can partition an array into two subsets with equal sums. Which classic DP problem is this a direct application of, and what would be your target value?