Data Structures Unit 4 Review

4.2 Recursive Problem-Solving Techniques

Written by the Fiveable Content Team • Last updated August 2025

Recursive Problem-Solving Techniques

Recursion solves complex problems by breaking them into smaller subproblems and calling the same function with reduced input until hitting a base case. Mastering these techniques is essential because they underpin major algorithms (merge sort, quick sort, binary search) and are the natural way to traverse trees and graphs.

Recursive Solutions for Classic Problems

Each classic recursive problem follows the same pattern: define a base case that stops the recursion, and a recursive case that moves toward it.

Factorial

The factorial of $n$ is defined as $n! = n \times (n-1)!$, with the base case $0! = 1$. Each call multiplies $n$ by the result of calling itself with $n - 1$.

  • factorial(5) = 5 × factorial(4) = 5 × 4 × factorial(3) = ... = 120

The recursion "unwinds" once it hits the base case, and the multiplications resolve on the way back up the call stack.
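A minimal Python sketch of this definition, with the base case and recursive case marked:

```python
def factorial(n: int) -> int:
    """Compute n! recursively."""
    if n == 0:                       # base case: 0! = 1
        return 1
    return n * factorial(n - 1)      # recursive case: n * (n-1)!

print(factorial(5))  # → 120
```

Note that the multiplication happens *after* the recursive call returns, which is exactly the "unwinding" described above.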

Fibonacci Sequence

The Fibonacci sequence is defined as $F_n = F_{n-1} + F_{n-2}$, with base cases $F_0 = 0$ and $F_1 = 1$. Each call branches into two recursive calls, which is what makes naive Fibonacci so expensive (more on that in the complexity section).

  • fib(5) = fib(4) + fib(3) = (fib(3) + fib(2)) + (fib(2) + fib(1)) = ... = 5

Notice that fib(2) and fib(3) get computed multiple times. This redundancy is why the naive recursive Fibonacci has exponential time complexity, and why memoization or dynamic programming is often used to fix it.
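A sketch of both versions in Python: the naive definition, and the same function memoized with the standard-library `functools.lru_cache` so each value is computed only once:

```python
from functools import lru_cache

def fib(n: int) -> int:
    """Naive recursive Fibonacci: exponential time."""
    if n < 2:                        # base cases: F(0)=0, F(1)=1
        return n
    return fib(n - 1) + fib(n - 2)   # two branches per call

@lru_cache(maxsize=None)
def fib_memo(n: int) -> int:
    """Memoized Fibonacci: each n computed once, O(n) time."""
    if n < 2:
        return n
    return fib_memo(n - 1) + fib_memo(n - 2)

print(fib(5))        # → 5
print(fib_memo(50))  # fast; the naive version would take far too long
```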

Binary Search

Binary search works on a sorted array by repeatedly cutting the search space in half.

  1. Compare the target to the middle element.
  2. If they match, return the index (base case: element found).
  3. If the target is smaller, recurse on the left half. If larger, recurse on the right half.
  4. If low > high, the element isn't in the array (base case: search space exhausted).

Example: Searching for 42 in [1, 5, 17, 42, 98]. The middle element is 17, so you recurse on [42, 98]. The middle is now 42, and you've found the target.
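The four steps above can be sketched as a recursive Python function (a minimal version that returns the index, or -1 if the target is absent):

```python
def binary_search(arr, target, low=0, high=None):
    """Recursive binary search on a sorted list."""
    if high is None:
        high = len(arr) - 1
    if low > high:                   # base case: search space exhausted
        return -1
    mid = (low + high) // 2
    if arr[mid] == target:           # base case: element found
        return mid
    if target < arr[mid]:            # recurse on the left half
        return binary_search(arr, target, low, mid - 1)
    return binary_search(arr, target, mid + 1, high)

print(binary_search([1, 5, 17, 42, 98], 42))  # → 3
```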


Divide-and-Conquer in Recursion

Divide-and-conquer is a general strategy with three steps:

  1. Divide the problem into smaller subproblems.
  2. Conquer each subproblem recursively (base cases handle the smallest subproblems directly).
  3. Combine the subproblem solutions into a solution for the original problem.

The "combine" step is what distinguishes different divide-and-conquer algorithms from each other.

Merge Sort

  1. Divide the array into two halves.
  2. Recursively sort each half (base case: a subarray of size 1 is already sorted).
  3. Merge the two sorted halves by comparing their elements one by one and placing them in order.

The key work happens in the merge step. You walk through both sorted halves with two pointers, always picking the smaller element, which produces a fully sorted result.
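The divide, recurse, and merge steps can be sketched in Python (a simple out-of-place version; production implementations often merge in place to save memory):

```python
def merge_sort(arr):
    """Divide-and-conquer sort returning a new sorted list."""
    if len(arr) <= 1:                # base case: size-1 subarray is sorted
        return arr
    mid = len(arr) // 2              # divide
    left = merge_sort(arr[:mid])     # conquer each half recursively
    right = merge_sort(arr[mid:])
    # combine: merge with two pointers, always taking the smaller element
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])          # append whatever remains
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 9, 1, 5, 6]))  # → [1, 2, 5, 5, 6, 9]
```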

Quick Sort

  1. Choose a pivot element (common choices: last element, first element, or a random element).
  2. Partition the array so that all elements smaller than the pivot come before it, and all larger elements come after it.
  3. Recursively sort the subarrays on either side of the pivot (base case: subarrays of size 0 or 1).

Unlike merge sort, quick sort does its heavy lifting in the partition step rather than a merge step. The pivot ends up in its final sorted position after partitioning.
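A compact sketch using the last element as the pivot (this list-comprehension style trades the in-place partition for clarity; real implementations usually partition in place):

```python
def quick_sort(arr):
    """Recursive quick sort with a last-element pivot."""
    if len(arr) <= 1:                # base case: size 0 or 1
        return arr
    pivot = arr[-1]                  # choose pivot
    smaller = [x for x in arr[:-1] if x <= pivot]   # partition
    larger = [x for x in arr[:-1] if x > pivot]
    # pivot lands in its final sorted position between the halves
    return quick_sort(smaller) + [pivot] + quick_sort(larger)

print(quick_sort([5, 2, 9, 1, 5, 6]))  # → [1, 2, 5, 5, 6, 9]
```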


Recursion for Data Structure Traversal

Trees and graphs have a naturally recursive structure: each node connects to other nodes, which connect to more nodes. Recursion maps directly onto this.

Tree Traversal

For a binary tree, there are three standard recursive traversals. They differ only in when you process the current node relative to its children:

  • Preorder: Visit root → recurse left → recurse right
  • Inorder: Recurse left → visit root → recurse right
  • Postorder: Recurse left → recurse right → visit root

For a tree with root 1, left subtree [2, 4, 5], and right subtree [3, 6, 7]:

  • Preorder yields: [1, 2, 4, 5, 3, 6, 7]
  • Inorder yields: [4, 2, 5, 1, 6, 3, 7]
  • Postorder yields: [4, 5, 2, 6, 7, 3, 1]

Inorder traversal of a binary search tree gives you the elements in sorted order, which is a useful property to remember.
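The three traversals can be sketched in Python with a minimal `Node` class (the tree built here is the seven-node example above):

```python
class Node:
    def __init__(self, val, left=None, right=None):
        self.val, self.left, self.right = val, left, right

def preorder(node):
    if node is None:                 # base case: empty subtree
        return []
    return [node.val] + preorder(node.left) + preorder(node.right)

def inorder(node):
    if node is None:
        return []
    return inorder(node.left) + [node.val] + inorder(node.right)

def postorder(node):
    if node is None:
        return []
    return postorder(node.left) + postorder(node.right) + [node.val]

# Root 1, left subtree [2, 4, 5], right subtree [3, 6, 7]
root = Node(1, Node(2, Node(4), Node(5)), Node(3, Node(6), Node(7)))
print(preorder(root))   # → [1, 2, 4, 5, 3, 6, 7]
print(inorder(root))    # → [4, 2, 5, 1, 6, 3, 7]
print(postorder(root))  # → [4, 5, 2, 6, 7, 3, 1]
```

The three functions are identical except for where `[node.val]` sits relative to the recursive calls, which is exactly the distinction the list above describes.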

Graph Traversal

  • Depth-first search (DFS) recursively explores as far as possible along one path before backtracking. The recursive function takes the graph, the current node, and a set of visited nodes to prevent infinite loops in graphs with cycles.
  • Breadth-first search (BFS) explores all neighbors at the current depth before moving deeper. BFS is typically implemented with a queue and iteration rather than recursion, though a recursive adaptation is possible.

Example: In a graph where A connects to B and C, B connects to D, DFS starting from A might visit [A, B, D, C] (going deep along Aโ†’Bโ†’D before backtracking to C).
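A sketch of recursive DFS on that example graph, represented as an adjacency dictionary (a common convention; the `visited` set prevents infinite loops on cyclic graphs):

```python
def dfs(graph, node, visited=None):
    """Recursive depth-first search; returns nodes in visit order."""
    if visited is None:
        visited = set()
    visited.add(node)
    order = [node]
    for neighbor in graph.get(node, []):
        if neighbor not in visited:          # skip already-visited nodes
            order.extend(dfs(graph, neighbor, visited))
    return order

graph = {"A": ["B", "C"], "B": ["D"], "C": [], "D": []}
print(dfs(graph, "A"))  # → ['A', 'B', 'D', 'C']
```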

Complexity Analysis of Recursive Algorithms

Time Complexity

To analyze a recursive algorithm, count how many recursive calls are made and how much work each call does.

  • Factorial: $n$ recursive calls, each doing $O(1)$ work → $O(n)$
  • Naive Fibonacci: The call tree branches twice at each level, leading to $O(2^n)$ time. This is why the naive version is impractical for large $n$.
  • Binary search: Each call halves the input → $O(\log n)$
  • Merge sort: Splits into two halves at each level, with $O(n)$ merge work per level → $O(n \log n)$

The Master Theorem is a formal tool for solving these kinds of recurrence relations, which you'll likely encounter if your course covers recurrences in depth.

Space Complexity

Recursive algorithms use stack space proportional to the maximum depth of the call stack.

  • Factorial: The call stack goes $n$ levels deep → $O(n)$ space
  • Binary search: The stack goes $\log n$ levels deep → $O(\log n)$ space
  • Naive Fibonacci: The maximum call depth is $n$ → $O(n)$ space (not $O(2^n)$, because branches resolve before new ones open)

Tail recursion is a special case where the recursive call is the very last operation in the function. Some compilers and languages can optimize tail-recursive functions to reuse the same stack frame, effectively turning the recursion into a loop and reducing space to $O(1)$. Standard factorial is not tail-recursive (it multiplies after the recursive call returns), but it can be rewritten with an accumulator parameter to become tail-recursive.
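The accumulator rewrite looks like this in Python (note the caveat in the comments: CPython does not actually perform tail-call optimization, so this version still uses $O(n)$ stack; the form matters in languages like Scheme or Scala that do optimize tail calls):

```python
def factorial_tail(n: int, acc: int = 1) -> int:
    """Tail-recursive factorial: the product so far rides along in `acc`.

    The recursive call is the last operation, so a language with
    tail-call optimization can reuse the stack frame.  CPython does
    not, so this still consumes O(n) stack in practice.
    """
    if n == 0:                       # base case: accumulator holds n!
        return acc
    return factorial_tail(n - 1, acc * n)  # nothing happens after the call

print(factorial_tail(5))  # → 120
```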