🧮 Combinatorial Optimization

Key Bin Packing Algorithms

Why This Matters

Bin packing sits at the heart of combinatorial optimization—it's the classic example of an NP-hard problem that you'll encounter repeatedly in algorithm analysis. When you're tested on approximation algorithms, competitive analysis, or the gap between online and offline computation, bin packing provides the concrete examples you need. Understanding these algorithms means understanding greedy heuristics, sorting-based improvements, approximation ratios, and the fundamental tradeoff between computational efficiency and solution quality.

Don't just memorize which algorithm puts items where—know why each strategy exists and what problem it solves. Can you explain why sorting items first improves performance? Can you compare the competitive ratios of online versus offline approaches? These conceptual connections are what exam questions actually test, especially in FRQ scenarios asking you to justify algorithm selection or analyze worst-case behavior.


Simple Greedy Heuristics

These algorithms make immediate, local decisions without preprocessing. They prioritize speed over optimality, processing each item exactly once in arrival order.

First Fit Algorithm

  • Places each item in the first bin with sufficient capacity—scans bins sequentially until finding a fit
  • Time complexity of $O(n \log n)$ with efficient data structures (a naive scan, like the sketch below, runs in $O(n^2)$)
  • Approximation ratio of $\frac{17}{10} \cdot OPT + 2$—guarantees a solution within about 1.7× optimal in the worst case
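
Here is a minimal Python sketch of First Fit; the function name and the list-of-loads representation are illustrative choices, not from any particular library:

```python
def first_fit(items, capacity):
    """First Fit sketch: place each item in the first open bin that has room."""
    loads = []                              # current load of each open bin
    for item in items:
        for i, load in enumerate(loads):
            if load + item <= capacity:     # first bin with enough space wins
                loads[i] += item
                break
        else:                               # no open bin had room
            loads.append(item)              # open a new bin
    return len(loads)
```

The nested scan is the naive $O(n^2)$ version; the $O(n \log n)$ variant typically swaps the inner loop for a tree structure that finds the leftmost bin with enough room in logarithmic time.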

Next Fit Algorithm

  • Only considers the current bin—opens a new bin when the current one can't fit the next item, never returns to previous bins
  • Fastest option at $O(n)$ time with constant space, making it ideal for streaming scenarios (see the sketch below)
  • Worst approximation ratio of $2 \cdot OPT$—can use up to twice the optimal number of bins
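
A sketch of Next Fit under the same conventions, assuming no single item exceeds the bin capacity:

```python
def next_fit(items, capacity):
    """Next Fit sketch: keep a single open bin and never look back."""
    bins_used = 0
    room_left = 0                           # no bin open yet
    for item in items:
        if item > room_left:                # current bin can't take the item
            bins_used += 1                  # abandon it and open a fresh bin
            room_left = capacity
        room_left -= item
    return bins_used
```

Only two counters survive between iterations, which is why Next Fit runs in constant extra space on a stream.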

Best Fit Algorithm

  • Selects the bin with minimum remaining space that can still accommodate the item—minimizes immediate waste
  • Time complexity of $O(n \log n)$ using a balanced tree to track bin capacities (the sketch below uses a simpler linear scan)
  • Same worst-case ratio as First Fit ($\frac{17}{10} \cdot OPT$) but often performs better in practice
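
A Best Fit sketch; the linear scan keeps it short, while the $O(n \log n)$ version noted above would track remaining capacities in a balanced search tree instead:

```python
def best_fit(items, capacity):
    """Best Fit sketch: pick the feasible bin with the least leftover space."""
    loads = []                              # current load of each open bin
    for item in items:
        best_i, best_leftover = None, None
        for i, load in enumerate(loads):
            leftover = capacity - load - item
            if leftover >= 0 and (best_leftover is None or leftover < best_leftover):
                best_i, best_leftover = i, leftover
        if best_i is None:
            loads.append(item)              # nothing fits: open a new bin
        else:
            loads[best_i] += item           # tightest feasible fit found
    return len(loads)
```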

Worst Fit Algorithm

  • Places items in the bin with maximum remaining space—keeps large gaps available for future large items (sketch below)
  • Counterintuitive strategy that aims to reduce fragmentation by avoiding tight fits
  • Performance is inconsistent—can match or underperform First Fit depending on input distribution
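
Worst Fit is Best Fit with the comparison flipped; a sketch under the same assumptions:

```python
def worst_fit(items, capacity):
    """Worst Fit sketch: pick the feasible bin with the MOST remaining space."""
    loads = []
    for item in items:
        fits = [i for i, load in enumerate(loads) if load + item <= capacity]
        if not fits:
            loads.append(item)                            # open a new bin
        else:
            emptiest = min(fits, key=lambda i: loads[i])  # lowest load = most room
            loads[emptiest] += item
    return len(loads)
```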

Compare: First Fit vs. Best Fit—both scan existing bins before opening new ones, but First Fit stops at the first valid bin while Best Fit finds the tightest fit. In practice, Best Fit often wastes less space, but both share the same theoretical worst-case bound. If an FRQ asks about greedy bin packing, First Fit is your simplest correct example.
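
To see the difference concretely, here is a small instance (assuming the first_fit and best_fit sketches above are in scope) where the tightest-fit rule saves a bin:

```python
items, capacity = [5, 6, 4, 5], 10
print(first_fit(items, capacity))   # 3 bins: {5, 4}, {6}, {5}
print(best_fit(items, capacity))    # 2 bins: {5, 5}, {6, 4}
```

First Fit tucks the 4 next to the first 5 and strands the last item; Best Fit completes the bin holding the 6 instead, leaving room for the final 5.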


Sorting-Based Improvements

Preprocessing items by size dramatically improves packing quality. Sorting creates a "large items first" strategy that reduces fragmentation—large items are harder to fit later, so placing them early prevents awkward gaps.

First Fit Decreasing Algorithm

  • Sorts items in decreasing order, then applies First Fit—large items get placed first when bins are empty
  • Time complexity of $O(n \log n)$, dominated by the sorting step (sketch below)
  • Approximation ratio of $\frac{11}{9} \cdot OPT + \frac{6}{9}$—significantly tighter than unsorted First Fit
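
FFD is one line on top of the First Fit sketch above; the delegation is my own structuring, but the algorithm really is just sort-then-First-Fit:

```python
def first_fit_decreasing(items, capacity):
    """FFD sketch: sort largest-first, then run plain First Fit."""
    return first_fit(sorted(items, reverse=True), capacity)

# Example: with capacity 10, the items [2, 2, 2, 8, 8, 8] cost plain First Fit
# 4 bins (the three 2s clog the first bin), while FFD packs the optimal 3 bins.
```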

Best Fit Decreasing Algorithm

  • Combines decreasing sort with Best Fit placement—the gold standard for practical bin packing
  • Achieves the same $\frac{11}{9} \cdot OPT + \frac{6}{9}$ bound as FFD, with slightly better average performance (see the sketch below)
  • Most commonly recommended heuristic when you have access to all items upfront
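
BFD follows the same pattern, reusing the best_fit sketch from the greedy section above:

```python
def best_fit_decreasing(items, capacity):
    """BFD sketch: sort largest-first, then place each item by Best Fit."""
    return best_fit(sorted(items, reverse=True), capacity)
```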

Compare: First Fit vs. First Fit Decreasing—same placement logic, but FFD's preprocessing step improves the approximation ratio from 1.7 to about 1.22. This is your go-to example when explaining why sorting matters in approximation algorithms.


Online vs. Offline Paradigms

The distinction between online and offline algorithms is fundamental to competitive analysis. Online algorithms must make irrevocable decisions as items arrive; offline algorithms see the entire input before deciding.

Online Bin Packing Algorithms

  • Make placement decisions without knowledge of future items—each choice is final when made
  • Evaluated using the competitive ratio $\frac{\text{Online Cost}}{\text{Optimal Offline Cost}}$—measures the worst-case performance gap (a rough empirical check appears in the sketch below)
  • Lower bound of $\frac{5}{3}$ for any deterministic online algorithm—no online strategy can guarantee better than about 1.67× optimal
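
Since computing the true optimum is NP-hard, a quick empirical check usually compares an online heuristic against the volume lower bound $\lceil \sum \text{sizes} / \text{capacity} \rceil$ instead; a sketch (the helper name is my own):

```python
import math

def empirical_ratio(pack, items, capacity):
    """Upper-estimate how far an online heuristic lands from optimal.

    The volume bound ceil(sum / capacity) never exceeds OPT, so the ratio on
    this instance is at most the value returned here.
    """
    return pack(items, capacity) / math.ceil(sum(items) / capacity)

# Assuming the next_fit sketch above is in scope, alternating half-full and
# tiny items are the classic bad case for Next Fit:
# empirical_ratio(next_fit, [5, 1] * 10, 10)   # 10 bins vs. a bound of 6 (~1.67)
```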

Offline Bin Packing Algorithms

  • Access complete item list before packing begins—enables sorting, optimization, and global planning
  • FFD and BFD are standard offline heuristics achieving near-optimal results in polynomial time
  • Allow arbitrarily close approximations on large inputs given enough computation time (an asymptotic PTAS exists for bin packing)

Compare: Online vs. Offline approaches—offline algorithms like FFD achieve a $\frac{11}{9}$ approximation, while the best possible online ratio is $\frac{5}{3}$. This gap quantifies the value of foresight. If asked to analyze a real-time packing system, emphasize that online constraints fundamentally limit achievable performance.


Advanced Theoretical Algorithms

These algorithms provide stronger theoretical guarantees but trade practical simplicity for mathematical elegance. They're essential for understanding approximation theory even if rarely implemented directly.

Harmonic Algorithm

  • Classifies items into size categories based on harmonic intervals $(\frac{1}{k+1}, \frac{1}{k}]$—each class gets dedicated bins (sketch below)
  • Achieves competitive ratio of approximately 1.69 for online bin packing—near the theoretical lower bound
  • Effective for analyzing asymptotic behavior on large inputs where constant factors dominate
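
A compact Python sketch of the Harmonic_k idea, assuming every item size lies in (0, capacity]; the class bookkeeping and names are my own simplification:

```python
import math

def harmonic_k(items, capacity=1.0, k=5):
    """Harmonic_k sketch: class i < k takes sizes in (capacity/(i+1), capacity/i]
    and packs i items per bin; class k collects the smallest items and is packed
    Next-Fit style. Assumes every size is in (0, capacity]."""
    bins_used = 0
    open_count = {}              # class i (< k) -> items sitting in its open bin
    small_room = 0.0             # free space in the open class-k bin (0 = none open)

    for size in items:
        i = min(k, math.floor(capacity / size))   # class index from the harmonic interval
        if i < k:
            open_count[i] = open_count.get(i, 0) + 1
            if open_count[i] == i:                # a class-i bin holds exactly i items
                bins_used += 1
                open_count[i] = 0
        else:                                     # smallest items: plain Next Fit
            if size > small_room:
                bins_used += 1                    # count the bin as soon as it opens
                small_room = capacity
            small_room -= size

    # Partially filled class-i bins still count toward the total.
    return bins_used + sum(1 for c in open_count.values() if c > 0)
```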

Sum of Squares Algorithm

  • Places each item to minimize $\sum_h N(h)^2$, where $N(h)$ counts the partially filled bins at load level $h$, rather than minimizing bin count directly—this penalizes leaving many bins stuck at the same partial level (sketch below)
  • Provides provable approximation guarantees through potential function analysis
  • Primarily theoretical significance—demonstrates alternative objective functions for packing problems
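
A sketch of the Sum of Squares rule as described above, for integer item sizes and capacity; the candidate-scan structure is my own and favors clarity over speed:

```python
from collections import Counter

def sum_of_squares(items, capacity):
    """Sum-of-Squares sketch: after each placement, N(h) is the number of bins
    whose load is exactly h (0 < h < capacity); the item goes wherever the
    resulting sum of N(h)^2 is smallest, possibly into a brand-new bin."""
    loads = []                                   # current load of every open bin

    def objective(level_counts):
        return sum(c * c for h, c in level_counts.items() if 0 < h < capacity)

    for s in items:
        options = [i for i, load in enumerate(loads) if load + s <= capacity]
        options.append(None)                     # None = open a new bin
        best, best_val = None, None
        for choice in options:
            trial = Counter(loads)               # hypothetical level counts
            if choice is None:
                trial[s] += 1
            else:
                trial[loads[choice]] -= 1
                trial[loads[choice] + s] += 1
            val = objective(trial)
            if best_val is None or val < best_val:
                best, best_val = choice, val
        if best is None:
            loads.append(s)
        else:
            loads[best] += s

    return len(loads)
```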

Compare: Harmonic vs. simple greedy algorithms—Harmonic achieves a better competitive ratio (1.69 vs. 1.7 for First Fit) by using item classification, but requires more complex implementation. Use this example when discussing how algorithmic sophistication can tighten approximation bounds.


Quick Reference Table

Concept | Best Examples
Simple greedy heuristics | First Fit, Next Fit, Best Fit
Sorting-based improvement | First Fit Decreasing, Best Fit Decreasing
Online algorithms | First Fit, Next Fit, Harmonic
Offline algorithms | FFD, BFD, Sum of Squares
Best practical performance | Best Fit Decreasing
Fastest runtime | Next Fit ($O(n)$)
Tightest approximation ratio | FFD/BFD ($\frac{11}{9} \cdot OPT$)
Competitive analysis examples | Online vs. Offline paradigm

Self-Check Questions

  1. Which two algorithms share the same placement logic but differ in preprocessing? Explain why the sorted version achieves a better approximation ratio.

  2. Compare First Fit and Best Fit: What do they have in common, and when might Best Fit outperform First Fit despite having the same worst-case bound?

  3. If items arrive in a streaming fashion and you cannot store them, which algorithm must you use, and what approximation ratio can you guarantee?

  4. FRQ-style: A warehouse management system must pack shipments into containers. You have access to tomorrow's complete order list tonight. Which algorithm would you recommend, and how would your answer change if orders arrived unpredictably throughout the day?

  5. Explain why no deterministic online bin packing algorithm can achieve an approximation ratio better than $\frac{5}{3}$, and identify which algorithm comes closest to this bound.