Why This Matters
Dynamic programming (DP) is one of the most heavily tested algorithm design paradigms in intro algorithms courses. You're being tested on your ability to recognize when a problem exhibits optimal substructure and overlapping subproblems, the two hallmarks that signal DP is the right approach.
Understanding these classic DP problems isn't about memorizing solutions. It's about recognizing problem patterns. When you see a new problem, you should ask: "Does this look like a knapsack variant? A sequence alignment problem? An optimization over intervals?" Each problem below represents a template that generalizes to dozens of related challenges. Don't just memorize the recurrence relations. Know why each problem structure demands dynamic programming and how to spot similar problems on sight.
Sequence-Based Problems
These problems operate on one or two sequences (strings, arrays) and build solutions by examining prefixes or suffixes. The core insight: the optimal solution for a sequence depends on optimal solutions for shorter subsequences.
Fibonacci Sequence
The Fibonacci sequence is the canonical DP example because it makes overlapping subproblems obvious.
- Recurrence: F(n) = F(n-1) + F(n-2) with base cases F(0) = 0, F(1) = 1
- Why DP matters here: Naive recursion recomputes the same values over and over, ballooning to O(2^n) time. Storing results (either via memoization or a bottom-up table) cuts this to O(n) time and O(n) space. You can even reduce space to O(1) since you only need the two most recent values.
- Foundational pattern: Whenever you see a recursive solution doing redundant work, that's your signal to apply DP.
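As a concrete illustration, here is a minimal Python sketch of the bottom-up, O(1)-space version described above:

```python
def fib(n):
    # Bottom-up DP with O(1) space: only the two most recent
    # values F(k-1) and F(k) are kept at any time.
    prev, curr = 0, 1  # F(0), F(1)
    for _ in range(n):
        prev, curr = curr, prev + curr
    return prev  # fib(10) -> 55
```

The same recurrence can be written top-down with memoization; the bottom-up loop just makes the O(n) work explicit.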
Longest Increasing Subsequence
Given an array, find the longest subsequence where each element is strictly greater than the previous. The elements do not need to be contiguous.
- DP approach: Build an array where dp[i] stores the length of the longest increasing subsequence that ends at index i. For each i, check all earlier indices j < i: if a[j] < a[i], then dp[i] = max(dp[i], dp[j] + 1). This gives an O(n^2) solution.
- Faster approach: There's an O(n log n) version using binary search with a "tails" array, but the O(n^2) DP is what you'll typically need to know for this course.
- Applications: Trend detection in time series data, patience sorting, and as a subroutine in more complex sequence problems.
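A sketch of the O(n^2) approach in Python (the function name is illustrative):

```python
def lis_length(a):
    if not a:
        return 0
    # dp[i] = length of the longest strictly increasing
    # subsequence that ends exactly at index i
    dp = [1] * len(a)
    for i in range(len(a)):
        for j in range(i):
            if a[j] < a[i]:  # a[i] can extend a subsequence ending at j
                dp[i] = max(dp[i], dp[j] + 1)
    return max(dp)  # the best subsequence can end at any index
```

Note the final `max(dp)`: unlike many DP problems, the answer is not in the last cell, because the optimal subsequence can end anywhere.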
Longest Common Subsequence
Given two sequences, find the longest subsequence present in both. Unlike substrings, the characters need not be adjacent.
- DP table: dp[i][j] stores the LCS length for the first i characters of sequence A and the first j characters of sequence B. If the characters match (A[i] = B[j]), take dp[i-1][j-1] + 1. Otherwise, take max(dp[i-1][j], dp[i][j-1]). Time and space are both O(mn), where m and n are the sequence lengths.
- Real-world uses: Diff utilities in version control, DNA sequence comparison, plagiarism detection.
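The table described above translates directly into code; a minimal Python sketch:

```python
def lcs_length(a, b):
    m, n = len(a), len(b)
    # dp[i][j] = LCS length of the first i chars of a and first j chars of b
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if a[i - 1] == b[j - 1]:           # characters match
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:                               # drop a char from a or from b
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    return dp[m][n]
```

Row 0 and column 0 stay zero because an empty prefix shares nothing with any string, which is why the table is sized (m+1) by (n+1).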
Compare: LIS vs. LCS: Both find non-contiguous subsequences, but LIS operates on one sequence with an ordering constraint, while LCS operates on two sequences finding shared elements. If asked to find similarities between two strings, think LCS. If asked about monotonic patterns in one sequence, think LIS.
String Transformation Problems
These problems measure the "distance" or "cost" to transform one string into another. The DP table typically has dimensions matching the two string lengths, with each cell representing the cost for corresponding prefixes.
Edit Distance
Edit distance (also called Levenshtein distance) is the minimum number of single-character insertions, deletions, or substitutions needed to transform string A into string B.
- Recurrence: dp[i][j] = min(dp[i-1][j] + 1, dp[i][j-1] + 1, dp[i-1][j-1] + cost), where cost is 0 if A[i] = B[j] and 1 otherwise. The three terms correspond to a deletion, an insertion, and a substitution (or match), respectively.
- Base cases: dp[i][0]=i (delete all of A) and dp[0][j]=j (insert all of B).
- Applications: Spell checkers, DNA mutation analysis, fuzzy string matching in search engines.
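Putting the recurrence and base cases together, a minimal Python sketch:

```python
def edit_distance(a, b):
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i  # transform a[:i] to "" by deleting all i chars
    for j in range(n + 1):
        dp[0][j] = j  # transform "" to b[:j] by inserting all j chars
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,        # deletion
                           dp[i][j - 1] + 1,        # insertion
                           dp[i - 1][j - 1] + cost)  # substitution or match
    return dp[m][n]
```

The classic sanity check: "kitten" to "sitting" takes 3 edits (substitute k→s, substitute e→i, insert g).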
Compare: Edit Distance vs. LCS: Both use similar 2D DP tables on string pairs, but LCS maximizes similarity while Edit Distance minimizes transformation cost. Edit Distance is the go-to when you need a numeric "difference score" between strings.
Optimization with Constraints
These problems require maximizing or minimizing an objective while respecting resource constraints. The key pattern: build optimal solutions by considering whether to include each item.
Knapsack Problem
You have n items, each with a weight w_i and value v_i, and a knapsack with capacity W. Pick items to maximize total value without exceeding the weight limit. In the 0/1 variant, each item is either taken or left; you can't split items.
- Recurrence: dp[i][w] = max(dp[i-1][w], dp[i-1][w - w_i] + v_i) if w_i ≤ w; otherwise dp[i][w] = dp[i-1][w]. The first option skips item i; the second takes it.
- Complexity: O(nW), which is pseudo-polynomial. It looks polynomial, but W is a numeric value, not the input size: the running time is exponential in the number of bits needed to represent W, so large capacities blow up.
- Fractional variant: When items can be divided, a greedy approach (sort by value-to-weight ratio, take greedily) works in O(n log n). No DP needed.
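The 0/1 recurrence above maps directly to a 2D table; a minimal Python sketch:

```python
def knapsack(weights, values, W):
    n = len(weights)
    # dp[i][w] = best value achievable with the first i items, capacity w
    dp = [[0] * (W + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        wi, vi = weights[i - 1], values[i - 1]
        for w in range(W + 1):
            dp[i][w] = dp[i - 1][w]          # option 1: skip item i
            if wi <= w:                       # option 2: take it, if it fits
                dp[i][w] = max(dp[i][w], dp[i - 1][w - wi] + vi)
    return dp[n][W]
```

Since row i depends only on row i-1, the table can be collapsed to a single 1D array (iterating capacities in reverse) to cut space from O(nW) to O(W).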
Coin Change Problem
Given a set of coin denominations and a target amount A, find the fewest coins needed to make exactly A. This is essentially an unbounded knapsack since you can use each denomination multiple times.
- Recurrence: dp[a] = min(dp[a - c_i] + 1) over all coin values c_i ≤ a, with base case dp[0] = 0. Initialize all other entries to infinity (meaning "not yet reachable").
- Time complexity: O(A · k), where k is the number of denominations.
- Practical relevance: Currency systems, making change algorithms, resource allocation with discrete units.
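A minimal Python sketch of the recurrence; returning -1 for unreachable amounts is a design choice here, not part of the recurrence itself:

```python
def min_coins(coins, amount):
    INF = float("inf")
    # dp[a] = fewest coins summing exactly to a; INF = not yet reachable
    dp = [0] + [INF] * amount
    for a in range(1, amount + 1):
        for c in coins:  # unbounded: each denomination reusable
            if c <= a and dp[a - c] + 1 < dp[a]:
                dp[a] = dp[a - c] + 1
    return dp[amount] if dp[amount] != INF else -1
```

Note that greedy fails here in general: with coins {1, 5, 11} and amount 15, greedy takes 11+1+1+1+1 (5 coins), while the DP finds 5+5+5 (3 coins).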
Subset Sum Problem
Given a set of integers and a target T, determine whether any subset sums exactly to T. This is a decision problem: it returns true or false, not an optimal value.
- DP table: dp[i][s] is true if sum s is achievable using the first i elements. The recurrence is dp[i][s] = dp[i-1][s] OR dp[i-1][s - a_i] (either skip or include element i). Runs in O(nT) time.
- NP-completeness: Subset Sum is weakly NP-complete. The O(nT) solution is pseudo-polynomial, just like 0/1 Knapsack.
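A minimal Python sketch using the space-optimized 1D form of the table; iterating sums in reverse preserves the dp[i-1] dependence, so each element is used at most once:

```python
def subset_sum(nums, target):
    # reachable[s] = True if some subset of the items seen so far sums to s
    reachable = [False] * (target + 1)
    reachable[0] = True  # the empty subset sums to 0
    for x in nums:
        for s in range(target, x - 1, -1):  # reverse keeps items single-use
            if reachable[s - x]:
                reachable[s] = True
    return reachable[target]
```

This is exactly the 0/1 Knapsack table with the value dimension dropped: we only track reachability, not an objective.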
Compare: Knapsack vs. Subset Sum vs. Coin Change: All involve selecting items to hit a target. Knapsack maximizes value under a weight constraint (two attributes per item: weight and value). Subset Sum asks if an exact target is achievable (one attribute per item). Coin Change minimizes count to reach an exact target (with unlimited copies of each item).
Interval and Structure Optimization
These problems optimize over ways to partition or combine elements. The insight: optimal solutions for larger intervals depend on optimal solutions for smaller sub-intervals.
Matrix Chain Multiplication
Given matrices A_1, A_2, ..., A_n to multiply, find the parenthesization that minimizes the total number of scalar multiplications. The matrix dimensions determine the cost of each pairwise multiplication.
- Interval DP pattern: dp[i][j] = minimum cost to multiply matrices i through j. You try every possible split point k with i ≤ k < j, computing dp[i][j] = min over k of (dp[i][k] + dp[k+1][j] + p_{i-1} · p_k · p_j), where p is the dimension array.
- Why it's O(n^3): There are O(n^2) subproblems (all pairs i, j), and each considers up to O(n) split points.
- Brute force comparison: The number of possible parenthesizations is the Catalan number, which grows exponentially. DP collapses this to O(n^3).
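A minimal Python sketch of the interval DP, using the convention that matrix i has dimensions p[i-1] × p[i]:

```python
def matrix_chain(p):
    # p: dimension array; matrix i (1-indexed) is p[i-1] x p[i]
    n = len(p) - 1  # number of matrices
    # dp[i][j] = min scalar multiplications to compute A_i ... A_j
    dp = [[0] * (n + 1) for _ in range(n + 1)]
    for length in range(2, n + 1):            # interval length
        for i in range(1, n - length + 2):    # interval start
            j = i + length - 1                # interval end
            dp[i][j] = min(
                dp[i][k] + dp[k + 1][j] + p[i - 1] * p[k] * p[j]
                for k in range(i, j)          # try every split point
            )
    return dp[1][n]
```

Note the loop order: intervals are filled shortest-first, so every dp[i][k] and dp[k+1][j] a cell needs is already computed.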
Rod Cutting Problem
Given a rod of length n and a price table where a piece of length i sells for p_i, find the cuts that maximize total revenue.
- Recurrence: dp[j] = max over 1 ≤ i ≤ j of (p_i + dp[j-i]). You try every possible first cut of length i, then solve the remaining subproblem dp[j-i].
- Complexity: O(n^2), since for each length j you consider up to j cut options.
- Classic intro example: Rod Cutting is frequently the first DP problem taught because the recurrence is intuitive and easy to visualize.
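A minimal Python sketch; prices is assumed 0-indexed, so prices[i-1] is the price of a piece of length i:

```python
def rod_cutting(prices, n):
    # dp[j] = maximum revenue obtainable from a rod of length j
    dp = [0] * (n + 1)
    for j in range(1, n + 1):
        # try every possible first-cut length i, then the rest is dp[j-i]
        dp[j] = max(prices[i - 1] + dp[j - i] for i in range(1, j + 1))
    return dp[n]
```

With the classic price table (1, 5, 8, 9, 10, 17, 17, 20) and n = 8, the optimum is 22, achieved by cutting into lengths 2 and 6.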
Compare: Matrix Chain Multiplication vs. Rod Cutting: Both optimize over partitions, but they differ structurally. Matrix Chain considers all possible split points within an interval (two endpoints to track, giving O(n^2) subproblems and O(n^3) total). Rod Cutting makes one cut from the end and solves the remainder (one endpoint, giving O(n) subproblems and O(n^2) total). Matrix Chain is the template for "interval DP" problems.
Graph-Based Dynamic Programming
These problems apply DP to graph structures, where optimal paths are built from optimal sub-paths.
Shortest Path Problems (Floyd-Warshall)
Floyd-Warshall computes the shortest path between every pair of vertices in a weighted graph.
- DP formulation: dp[i][j][k] = shortest path from vertex i to vertex j using only vertices {1, ..., k} as intermediates. The recurrence is dp[i][j][k] = min(dp[i][j][k-1], dp[i][k][k-1] + dp[k][j][k-1]). In practice, you overwrite in place and drop the k dimension, using a single 2D table.
- Complexity: O(V^3) time, O(V^2) space (with the in-place optimization).
- Handles negative weights: Unlike Dijkstra's algorithm, Floyd-Warshall works with negative edge weights. It only fails if the graph contains a negative-weight cycle (which makes shortest paths undefined).
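A minimal Python sketch of the in-place 2D version described above; dist is assumed to be an adjacency matrix with 0 on the diagonal and float("inf") where no edge exists:

```python
def floyd_warshall(dist):
    # dist[i][j] = current best-known distance from i to j.
    # After iteration k, paths may route through vertices 0..k.
    n = len(dist)
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if dist[i][k] + dist[k][j] < dist[i][j]:
                    dist[i][j] = dist[i][k] + dist[k][j]
    return dist  # mutated in place; returned for convenience
```

After running, a negative value on the diagonal (dist[v][v] < 0 for some v) signals a negative-weight cycle, the one case where shortest paths are undefined.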
Compare: Floyd-Warshall vs. Dijkstra: Floyd-Warshall computes all-pairs shortest paths in O(V^3) and handles negative weights. Dijkstra computes single-source shortest paths in O(E log V) but requires non-negative weights. If you need shortest paths from just one source and weights are non-negative, Dijkstra is faster. If you need all pairs or have negative weights, use Floyd-Warshall.
Quick Reference Table
| Pattern | Example Problems |
| --- | --- |
| Overlapping subproblems | Fibonacci, Rod Cutting, Coin Change |
| Sequence alignment | LCS, Edit Distance |
| Single-sequence optimization | LIS, Subset Sum |
| Constrained optimization | 0/1 Knapsack, Coin Change, Subset Sum |
| Interval DP | Matrix Chain Multiplication, Rod Cutting |
| Graph DP | Floyd-Warshall |
| Decision vs. optimization | Subset Sum (decision), Knapsack (optimization) |
| Pseudo-polynomial time | Knapsack, Subset Sum, Coin Change |
Self-Check Questions
- Both the Knapsack Problem and Coin Change Problem involve selecting items to reach a target. What is the key structural difference that changes how you formulate the DP recurrence?
- You're given two strings and asked to find how "similar" they are. Which two problems from this guide could apply, and when would you choose one over the other?
- Matrix Chain Multiplication and Rod Cutting both optimize over partitions. Why does Matrix Chain require O(n^3) time while Rod Cutting only needs O(n^2)?
- How do the DP tables for Longest Common Subsequence and Edit Distance differ in what each cell represents, even though both operate on two strings?
- A problem asks whether you can partition an array into two subsets with equal sums. Which classic DP problem is this a direct application of, and what would be your target value?