Dynamic programming (DP) is one of the most heavily tested algorithmic techniques in technical interviews and computer science courses. You're not just being asked to solve problems; you're being evaluated on your ability to recognize optimal substructure and overlapping subproblems, then choose the right approach to exploit them. Understanding these patterns means the difference between an exponential brute-force solution that times out and a polynomial-time solution that passes all test cases.
The patterns here fall into distinct categories: foundational approaches (how you structure your DP solution), sequence problems (finding patterns within ordered data), optimization problems (maximizing or minimizing under constraints), and string/comparison problems (measuring similarity or transformation costs). Don't just memorize the recurrence relations; know which pattern each problem represents and why that pattern applies. When you see a new problem, your first question should be: "Which DP pattern does this resemble?"
These two techniques, top-down memoization and bottom-up tabulation, form the backbone of every dynamic programming solution. The choice between them affects your code structure, space usage, and how you think about the problem.
Compare: Top-down vs. Bottom-up. Both achieve the same time complexity, but top-down only solves necessary subproblems (better for sparse dependency graphs), while bottom-up avoids recursion limits and enables space optimization. If an interview asks you to optimize space, bottom-up is usually your answer.
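To make the contrast concrete, here is a minimal Python sketch using Fibonacci as the running example (the function names `fib_top_down` and `fib_bottom_up` are illustrative, not from the original guide): the top-down version caches recursive calls, while the bottom-up version iterates from the base cases and collapses the table to two variables.

```python
from functools import lru_cache

# Top-down: recurse on the natural definition and cache results.
# Only the subproblems actually reached by the recursion get computed.
@lru_cache(maxsize=None)
def fib_top_down(n: int) -> int:
    if n < 2:
        return n
    return fib_top_down(n - 1) + fib_top_down(n - 2)

# Bottom-up: iterate from the base cases upward. No recursion depth to
# worry about, and because each state depends only on the previous two,
# the full table collapses to two variables (the space optimization).
def fib_bottom_up(n: int) -> int:
    prev, curr = 0, 1
    for _ in range(n):
        prev, curr = curr, prev + curr
    return prev

print(fib_top_down(10), fib_bottom_up(10))  # 55 55
```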
These patterns identify meaningful subsequences within ordered data. The key insight is that the optimal solution for a sequence ending at position i depends on optimal solutions for earlier positions.
- **Longest Increasing Subsequence (LIS):** dp[i] stores the LIS length ending at index i, checking all previous elements.
- **Longest Common Subsequence (LCS):** dp[i][j] represents the LCS length for the first i characters of string A and the first j characters of string B.

Compare: LIS vs. LCS. LIS operates on a single sequence, finding internal order, while LCS compares two sequences for shared structure. Both have O(n²) basic solutions, but LIS has an O(n log n) optimization (binary search over subsequence tails) that LCS doesn't naturally support.
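As a concrete illustration of both recurrences, here is a short Python sketch (the helper names `lis_length` and `lcs_length` are illustrative): the first function fills the one-dimensional O(n²) LIS table described above, and the second fills the standard (m+1) × (n+1) LCS table.

```python
# LIS: dp[i] = length of the longest increasing subsequence ending at i.
# O(n^2) time, since each position looks back at every earlier position.
def lis_length(nums: list[int]) -> int:
    if not nums:
        return 0
    dp = [1] * len(nums)
    for i in range(1, len(nums)):
        for j in range(i):
            if nums[j] < nums[i]:
                dp[i] = max(dp[i], dp[j] + 1)
    return max(dp)

# LCS: dp[i][j] = LCS length of a[:i] and b[:j].
# O(m*n) time and space for the full table.
def lcs_length(a: str, b: str) -> int:
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if a[i - 1] == b[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    return dp[m][n]

print(lis_length([10, 9, 2, 5, 3, 7, 101, 18]))  # 4 (e.g., 2, 3, 7, 101)
print(lcs_length("abcde", "ace"))                # 3 ("ace")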
These problems require maximizing or minimizing a value while respecting resource limits. The critical distinction is whether items can be reused.
- **Coin Change:** dp[amount] stores the fewest coins needed to make that amount.

Compare: 0/1 Knapsack vs. Unbounded Knapsack. The only difference is item reusability, but this changes the iteration order entirely. 0/1 requires reverse capacity iteration (or a 2D table) to prevent using an item multiple times; unbounded iterates forward. Interviewers love asking about this distinction.
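The sketch below (Python, with illustrative function names) shows coin change with a forward capacity loop, plus 1D knapsack variants where the only code difference between 0/1 and unbounded is the direction of the inner loop.

```python
# Coin Change: dp[amount] = fewest coins needed to make that amount.
# Coins can be reused (unbounded), so the capacity loop runs forward.
def min_coins(coins: list[int], target: int) -> int:
    INF = float("inf")
    dp = [0] + [INF] * target
    for amt in range(1, target + 1):
        for c in coins:
            if c <= amt and dp[amt - c] + 1 < dp[amt]:
                dp[amt] = dp[amt - c] + 1
    return dp[target] if dp[target] != INF else -1

# 0/1 Knapsack with a 1D table: iterate capacity in REVERSE so that
# dp[c - w] still reflects "this item not used yet" for the current row.
def knapsack_01(weights, values, capacity):
    dp = [0] * (capacity + 1)
    for w, v in zip(weights, values):
        for c in range(capacity, w - 1, -1):
            dp[c] = max(dp[c], dp[c - w] + v)
    return dp[capacity]

# Unbounded Knapsack: iterate capacity FORWARD so that dp[c - w] may
# already include the current item, allowing reuse.
def knapsack_unbounded(weights, values, capacity):
    dp = [0] * (capacity + 1)
    for w, v in zip(weights, values):
        for c in range(w, capacity + 1):
            dp[c] = max(dp[c], dp[c - w] + v)
    return dp[capacity]

print(min_coins([1, 2, 5], 11))                        # 3  (5 + 5 + 1)
print(knapsack_01([1, 3, 4], [15, 20, 30], 4))         # 35 (weights 1 and 3)
print(knapsack_unbounded([1, 3, 4], [15, 20, 30], 4))  # 60 (four copies of weight 1)
```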
These patterns measure the cost or possibility of transforming one structure into another. They typically build 2D tables tracking transformation states.
- **Edit Distance:** dp[i][j] represents the minimum edits to transform the first i characters of the source into the first j characters of the target.
- **Matrix Chain Multiplication:** dp[i][j] represents the minimum cost to multiply the chain of matrices from index i to j.

Compare: Edit Distance vs. LCS. Both use 2D tables on two strings, but edit distance tracks transformation cost while LCS tracks shared structure. You can actually derive one from the other: when only insertions and deletions are allowed, EditDistance(A, B) = |A| + |B| - 2 * LCS(A, B).
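As an illustration of the transformation table, here is a minimal Python sketch of edit distance (the name `edit_distance` is illustrative): the first row and column encode pure insertions and deletions, and each inner cell takes the cheapest of delete, insert, or substitute.

```python
# Edit Distance (Levenshtein): dp[i][j] = minimum edits to turn src[:i]
# into tgt[:j] using insertions, deletions, and substitutions.
def edit_distance(src: str, tgt: str) -> int:
    m, n = len(src), len(tgt)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i                  # delete all i remaining source characters
    for j in range(n + 1):
        dp[0][j] = j                  # insert all j remaining target characters
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if src[i - 1] == tgt[j - 1]:
                dp[i][j] = dp[i - 1][j - 1]           # characters match: no cost
            else:
                dp[i][j] = 1 + min(dp[i - 1][j],      # delete from source
                                   dp[i][j - 1],      # insert into source
                                   dp[i - 1][j - 1])  # substitute
    return dp[m][n]

print(edit_distance("horse", "ros"))  # 3
```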
| Concept | Best Examples |
|---|---|
| Foundational Approaches | Top-down Memoization, Bottom-up Tabulation |
| Linear Recurrence | Fibonacci Sequence Pattern |
| Subsequence Finding | Longest Increasing Subsequence, Longest Common Subsequence |
| Bounded Resource Optimization | 0/1 Knapsack Problem |
| Unbounded Resource Optimization | Unbounded Knapsack, Coin Change Problem |
| String Transformation | Edit Distance |
| Interval DP | Matrix Chain Multiplication |
| 2D Table Patterns | LCS, Edit Distance, 0/1 Knapsack |
Pattern Recognition: Given a problem asking for the minimum cost to transform sequence A into sequence B using insertions, deletions, and substitutions, which DP pattern applies and what would your table dimensions be?
Compare and Contrast: What is the key difference between 0/1 Knapsack and Unbounded Knapsack in terms of table iteration order, and why does this matter?
Optimization Insight: The Fibonacci sequence can be solved in O(1) space. Which other problems in this guide can have their space complexity reduced from O(n²) or O(n·W) to O(n) or O(W), and how?
Complexity Analysis: Both LIS and LCS have O(n²) basic solutions. Why can LIS be optimized to O(n log n) while LCS cannot easily achieve the same improvement?
Application Mapping: If you're building a spell-checker that suggests corrections for misspelled words, which DP pattern would you use, and how would you modify it to also return the actual corrections (not just the distance)?