Dynamic Programming Patterns

Why This Matters

Dynamic programming (DP) is one of the most heavily tested algorithmic techniques in technical interviews and computer science courses. You're not just being asked to solve problems; you're being evaluated on your ability to recognize optimal substructure and overlapping subproblems, then choose the right approach to exploit them. Understanding these patterns means the difference between an O(2^n) brute-force solution that times out and an O(n^2) or O(n) solution that passes all test cases.

The patterns here fall into distinct categories: foundational approaches (how you structure your DP solution), sequence problems (finding patterns within ordered data), optimization problems (maximizing or minimizing under constraints), and string/comparison problems (measuring similarity or transformation costs). Don't just memorize the recurrence relations; know which pattern each problem represents and why that pattern applies. When you see a new problem, your first question should be: "Which DP pattern does this resemble?"


Foundational Approaches

These two techniques form the backbone of every dynamic programming solution. The choice between them affects your code structure, space usage, and how you think about the problem.

Top-down Memoization

  • Recursive structure with caching: breaks the problem into subproblems, naturally following the problem's recursive definition
  • Hash table or array storage prevents redundant calculations by checking if a subproblem has already been solved before computing it
  • Reduces exponential to polynomial time in problems with overlapping subproblems; easier to implement when the recurrence relation is complex
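
A minimal sketch of this style in Python, using Fibonacci as the running example (fib is an illustrative name, not from the guide):

    from functools import lru_cache

    @lru_cache(maxsize=None)              # cache stores every solved subproblem once
    def fib(n: int) -> int:
        if n < 2:                         # base cases: F(0) = 0, F(1) = 1
            return n
        return fib(n - 1) + fib(n - 2)    # recurrence; repeated calls are served from the cache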

Bottom-up Tabulation

  • Iterative table-building starts from base cases and systematically solves all subproblems in dependency order
  • No recursion overhead eliminates stack space and function call costs, making it faster in practice for large inputs
  • Space optimization possible since you can often reduce from O(n^2) to O(n) by only keeping the rows/columns you need
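
The same example in tabulated form, as a sketch (fib_tab is an illustrative name):

    def fib_tab(n: int) -> int:
        if n < 2:
            return n
        dp = [0] * (n + 1)                   # dp[i] holds F(i); dp[0] = 0 is a base case
        dp[1] = 1
        for i in range(2, n + 1):            # dependency order: smaller indices first
            dp[i] = dp[i - 1] + dp[i - 2]
        return dp[n]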

Compare: Top-down vs. Bottom-up: both achieve the same time complexity, but top-down only solves necessary subproblems (better for sparse dependency graphs), while bottom-up avoids recursion limits and enables space optimization. If an interviewer asks you to optimize space, bottom-up is usually your answer.


Sequence Problems

These patterns identify meaningful subsequences within ordered data. The key insight is that the optimal solution for a sequence ending at position i depends on optimal solutions for earlier positions.

Fibonacci Sequence Pattern

  • Each state depends on fixed previous states: the recurrence F(n) = F(n-1) + F(n-2) is the simplest DP pattern
  • Constant space achievable by tracking only the last two values instead of the entire table
  • Foundation for recognizing DP problems; if you can express the answer in terms of smaller instances, you likely have a DP problem
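
A sketch of the constant-space idea: because the recurrence looks back exactly two states, two variables can replace the whole table (fib_o1 is an illustrative name):

    def fib_o1(n: int) -> int:
        prev, curr = 0, 1                    # F(0) and F(1)
        for _ in range(n):
            prev, curr = curr, prev + curr   # slide the two-value window forward
        return prev                          # after n steps, prev holds F(n)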

Longest Increasing Subsequence (LIS)

  • O(n^2) basic DP where dp[i] stores the LIS length ending at index i, checking all previous elements
  • O(n log n) optimization uses binary search with a patience-sorting approach to maintain candidate subsequences
  • Pattern recognition key: any problem asking for "longest subsequence with property X" likely follows this structure
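
A sketch of the O(n log n) approach using Python's bisect module; it returns only the length, not the subsequence itself (lis_length is an illustrative name):

    import bisect

    def lis_length(nums: list[int]) -> int:
        tails = []  # tails[k] = smallest possible tail of an increasing subsequence of length k + 1
        for x in nums:
            i = bisect.bisect_left(tails, x)   # first position whose tail is >= x
            if i == len(tails):
                tails.append(x)                # x extends the longest subsequence seen so far
            else:
                tails[i] = x                   # x becomes a smaller tail for length i + 1
        return len(tails)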

Longest Common Subsequence (LCS)

  • 2D table where dp[i][j] represents LCS length for first i characters of string A and first j characters of string B
  • Recurrence based on character match: if characters match, dp[i][j] = dp[i-1][j-1] + 1; otherwise, dp[i][j] = max(dp[i-1][j], dp[i][j-1])
  • Bioinformatics foundation for DNA sequence alignment; also appears in diff tools and version control
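
A sketch that transcribes the recurrence above directly into a 2D table (lcs_length is an illustrative name):

    def lcs_length(a: str, b: str) -> int:
        m, n = len(a), len(b)
        # dp[i][j] = LCS length of a[:i] and b[:j]; row and column 0 are empty-string base cases
        dp = [[0] * (n + 1) for _ in range(m + 1)]
        for i in range(1, m + 1):
            for j in range(1, n + 1):
                if a[i - 1] == b[j - 1]:
                    dp[i][j] = dp[i - 1][j - 1] + 1
                else:
                    dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
        return dp[m][n]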

Compare: LIS vs. LCS: LIS operates on a single sequence finding internal order, while LCS compares two sequences for shared structure. Both use O(n^2) basic solutions, but LIS has an O(n log n) optimization that LCS doesn't naturally support.


Optimization Under Constraints

These problems require maximizing or minimizing a value while respecting resource limits. The critical distinction is whether items can be reused.

0/1 Knapsack Problem

  • Binary choice per item: each item is either included once or excluded, creating a 2D state space of (item index, remaining capacity)
  • Recurrence: dp[i][w] = max(dp[i-1][w], dp[i-1][w - weight[i]] + value[i])
  • Space reducible to O(W) by iterating capacity in reverse order, as shown in the sketch below; a classic example of how iteration order enables space optimization
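
A sketch of the O(W)-space version; the reverse capacity loop is what enforces the use-at-most-once rule (knapsack_01 is an illustrative name):

    def knapsack_01(weights: list[int], values: list[int], W: int) -> int:
        dp = [0] * (W + 1)                       # dp[w] = best value achievable with capacity w
        for wt, val in zip(weights, values):
            for w in range(W, wt - 1, -1):       # reverse: dp[w - wt] still excludes this item
                dp[w] = max(dp[w], dp[w - wt] + val)
        return dp[W]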

Unbounded Knapsack Problem

  • Unlimited item copies change the recurrence to reference the same row: dp[w] = max(dp[w], dp[w - weight[i]] + value[i])
  • 1D table suffices; since items can be reused, iterate capacity forward (the opposite of 0/1)
  • Coin change is the variant where all "values" equal 1 and you're counting combinations or minimizing the coin count
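
The unbounded version as a sketch; only the loop direction changes, and the forward scan is exactly what allows reuse (knapsack_unbounded is an illustrative name):

    def knapsack_unbounded(weights: list[int], values: list[int], W: int) -> int:
        dp = [0] * (W + 1)
        for wt, val in zip(weights, values):
            for w in range(wt, W + 1):           # forward: dp[w - wt] may already include this item
                dp[w] = max(dp[w], dp[w - wt] + val)
        return dp[W]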

Coin Change Problem

  • Minimum coins to reach target: dp[amount] stores the fewest coins needed to make that amount
  • Recurrence: dp[a] = min(dp[a], dp[a - coin] + 1) for each valid coin denomination
  • Demonstrates optimal substructure clearly; if you can make amount a optimally, you can extend to a + coin
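
A sketch of the minimum-coin table; returning -1 for unreachable amounts is an assumed convention, not something this guide specifies (min_coins is an illustrative name):

    def min_coins(coins: list[int], target: int) -> int:
        INF = float("inf")
        dp = [0] + [INF] * target                # dp[a] = fewest coins that sum to amount a
        for a in range(1, target + 1):
            for c in coins:
                if c <= a and dp[a - c] + 1 < dp[a]:
                    dp[a] = dp[a - c] + 1        # extend an optimal smaller amount by one coin
        return dp[target] if dp[target] != INF else -1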

Compare: 0/1 Knapsack vs. Unbounded Knapsack: the only difference is item reusability, but this changes the iteration order entirely. 0/1 requires reverse capacity iteration (or a 2D table) to prevent using an item multiple times; unbounded iterates forward. Interviewers love asking about this distinction.


String Transformation Problems

These patterns measure the cost or possibility of transforming one structure into another. They typically build 2D tables tracking transformation states.

Edit Distance

  • Three operations: insertion, deletion, substitution, each with cost 1 (or weighted in variants)
  • 2D table dp[i][j] represents minimum edits to transform first i characters of source to first j of target
  • Applications everywhere: spell checkers, DNA alignment, plagiarism detection; also called Levenshtein distance
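
A sketch of the unit-cost (Levenshtein) table described above (edit_distance is an illustrative name):

    def edit_distance(src: str, tgt: str) -> int:
        m, n = len(src), len(tgt)
        dp = [[0] * (n + 1) for _ in range(m + 1)]
        for i in range(m + 1):
            dp[i][0] = i                         # delete all of src[:i]
        for j in range(n + 1):
            dp[0][j] = j                         # insert all of tgt[:j]
        for i in range(1, m + 1):
            for j in range(1, n + 1):
                if src[i - 1] == tgt[j - 1]:
                    dp[i][j] = dp[i - 1][j - 1]             # characters match: no edit
                else:
                    dp[i][j] = 1 + min(dp[i - 1][j],        # deletion
                                       dp[i][j - 1],        # insertion
                                       dp[i - 1][j - 1])    # substitution
        return dp[m][n]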

Matrix Chain Multiplication

  • Optimal parenthesization minimizes total scalar multiplications when multiplying a chain of matrices
  • Interval DP pattern where dp[i][j] represents minimum cost to multiply matrices from index i to j
  • O(n^3) complexity from trying all split points; demonstrates that DP isn't always about sequences; sometimes it's about intervals
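
A sketch of the interval DP; dims is assumed to hold the chain's dimensions, so matrix i has shape dims[i] x dims[i + 1] (matrix_chain is an illustrative name):

    def matrix_chain(dims: list[int]) -> int:
        n = len(dims) - 1                        # number of matrices in the chain
        # dp[i][j] = minimum scalar multiplications to compute the product of matrices i..j
        dp = [[0] * n for _ in range(n)]
        for length in range(2, n + 1):           # solve shorter intervals first
            for i in range(n - length + 1):
                j = i + length - 1
                dp[i][j] = min(
                    dp[i][k] + dp[k + 1][j] + dims[i] * dims[k + 1] * dims[j + 1]
                    for k in range(i, j)         # try every split point
                )
        return dp[0][n - 1]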

Compare: Edit Distance vs. LCS: both use 2D tables on two strings, but edit distance tracks transformation cost while LCS tracks shared structure. The two are directly related: when only insertions and deletions are allowed, edit distance = |A| + |B| - 2 * LCS(A, B); allowing substitutions can only lower the cost, so Levenshtein distance <= |A| + |B| - 2 * LCS(A, B).


Quick Reference Table

Concept                         | Best Examples
Foundational Approaches         | Top-down Memoization, Bottom-up Tabulation
Linear Recurrence               | Fibonacci Sequence Pattern
Subsequence Finding             | Longest Increasing Subsequence, Longest Common Subsequence
Bounded Resource Optimization   | 0/1 Knapsack Problem
Unbounded Resource Optimization | Unbounded Knapsack, Coin Change Problem
String Transformation           | Edit Distance
Interval DP                     | Matrix Chain Multiplication
2D Table Patterns               | LCS, Edit Distance, 0/1 Knapsack

Self-Check Questions

  1. Pattern Recognition: Given a problem asking for the minimum cost to transform sequence A into sequence B using insertions, deletions, and swaps, which DP pattern applies and what would your table dimensions be?

  2. Compare and Contrast: What is the key difference between 0/1 Knapsack and Unbounded Knapsack in terms of table iteration order, and why does this matter?

  3. Optimization Insight: The Fibonacci sequence can be solved in O(1) space. Which other problems in this guide can have their space complexity reduced from O(n^2) or O(n) to O(1) or O(n), and how?

  4. Complexity Analysis: Both LIS and LCS have O(n^2) basic solutions. Why can LIS be optimized to O(n log n) while LCS cannot easily achieve the same improvement?

  5. Application Mapping: If you're building a spell-checker that suggests corrections for misspelled words, which DP pattern would you use, and how would you modify it to also return the actual corrections (not just the distance)?