🎛️Optimization of Systems Unit 8 Review

8.2 One-dimensional search methods

Written by the Fiveable Content Team • Last updated August 2025

One-dimensional search methods are crucial tools for finding minima in optimization problems. These techniques, including bisection, golden section search, and quadratic interpolation, offer different trade-offs between simplicity, efficiency, and robustness.

Understanding the convergence rates and stopping criteria of these methods is essential for effective implementation. Line search concepts extend these techniques to multi-dimensional problems, while efficiency analysis helps choose the best method for specific applications.

One-Dimensional Search Methods

Comparison of one-dimensional search techniques

  • Bisection method
    • Repeatedly halves the interval, efficiently narrowing the search range
    • Implementation steps: 1. Choose an initial bracketing interval 2. Check the midpoint (e.g., the sign of the derivative there) 3. Keep the subinterval that still contains the minimum 4. Repeat until convergence
    • Advantages: simple, robust, guaranteed convergence. Disadvantages: slower than more advanced methods; requires a continuous function
  • Golden section search
    • Uses the golden ratio ($\phi = \frac{1+\sqrt{5}}{2} \approx 1.618$) to optimize interval reduction while maintaining a constant proportion
    • Algorithm: 1. Initialize interval 2. Place interior points using golden ratio 3. Evaluate function 4. Eliminate larger subinterval 5. Repeat
    • Compared to bisection: more efficient (fewer function evaluations) and does not require the function to be differentiable
  • Quadratic interpolation
    • Fits a quadratic function through three points to approximate the location of the minimum
    • The next estimate is the vertex of the interpolating parabola, computed from the quadratic's coefficients
    • Iterative process: 1. Choose initial points 2. Fit quadratic 3. Find its minimum 4. Update points 5. Repeat until convergence
    • Faster convergence near the minimum, but assumes a smooth, well-behaved function
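The bisection steps above can be sketched in Python. Since a single midpoint function value cannot tell which half contains the minimum, this sketch assumes access to the derivative `df` and bisects on its sign (function names and defaults are illustrative):

```python
def bisection_minimize(df, a, b, tol=1e-8, max_iter=100):
    """Minimize a unimodal function on [a, b] by bisection on the
    sign of its derivative df."""
    for _ in range(max_iter):
        if b - a < tol:
            break
        mid = 0.5 * (a + b)
        if df(mid) > 0:      # function increasing: minimum lies to the left
            b = mid
        else:                # function decreasing: minimum lies to the right
            a = mid
    return 0.5 * (a + b)
```

Each iteration halves the bracketing interval, so the error shrinks by a factor of 2 per derivative evaluation.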
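A minimal golden section search following the algorithm above; each iteration discards the larger subinterval and reuses one interior evaluation, so only one new function call is needed per step:

```python
import math

def golden_section_minimize(f, a, b, tol=1e-8):
    """Golden section search for the minimum of a unimodal f on [a, b]."""
    invphi = (math.sqrt(5) - 1) / 2          # 1/phi ~ 0.618
    c = b - invphi * (b - a)                 # left interior point
    d = a + invphi * (b - a)                 # right interior point
    fc, fd = f(c), f(d)
    while b - a > tol:
        if fc < fd:                          # minimum is in [a, d]
            b, d, fd = d, c, fc
            c = b - invphi * (b - a)         # one new evaluation
            fc = f(c)
        else:                                # minimum is in [c, b]
            a, c, fc = c, d, fd
            d = a + invphi * (b - a)         # one new evaluation
            fd = f(d)
    return 0.5 * (a + b)
```

The golden ratio is what makes the reuse work: since $1/\phi^2 = 1 - 1/\phi$, the surviving interior point is already in the golden position of the new interval.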
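The quadratic interpolation iteration can be sketched as follows; the vertex formula for the parabola through three points is standard, while the point-update rule here is a simple illustrative choice:

```python
def quadratic_interp_minimize(f, x0, x1, x2, tol=1e-8, max_iter=50):
    """Successive parabolic interpolation: fit a parabola through the
    three current points and jump to its vertex."""
    pts = sorted([x0, x1, x2])
    for _ in range(max_iter):
        a, b, c = pts
        fa, fb, fc = f(a), f(b), f(c)
        # Vertex of the parabola through (a,fa), (b,fb), (c,fc)
        num = (b - a) ** 2 * (fb - fc) - (b - c) ** 2 * (fb - fa)
        den = (b - a) * (fb - fc) - (b - c) * (fb - fa)
        if den == 0:
            break                            # points are collinear; give up
        x_new = b - 0.5 * num / den
        if abs(x_new - b) < tol:
            return x_new
        # Keep the three points closest to the new estimate (illustrative rule)
        pts = sorted(sorted(pts + [x_new], key=lambda x: abs(x - x_new))[:3])
    return pts[1]
```

For an exactly quadratic function the vertex lands on the true minimum in one step; robust implementations (e.g., Brent's method) combine this step with golden section as a safeguard.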

Convergence of one-dimensional search methods

  • Convergence rates
    • Linear convergence: Error shrinks by a constant factor each iteration (bisection, golden section)
    • Superlinear convergence: Error reduction factor improves each iteration (successive quadratic interpolation)
    • Quadratic convergence: Error is roughly squared each iteration (Newton's method)
  • Stopping criteria
    • Absolute tolerance: Stop when $|f(x_k) - f(x_{k-1})| < \epsilon$
    • Relative tolerance: Stop when $\frac{|f(x_k) - f(x_{k-1})|}{|f(x_k)|} < \epsilon$
    • Interval width: Stop when $|x_k - x_{k-1}| < \epsilon$
    • Maximum iterations: Cap the iteration count to prevent infinite loops and guarantee termination
  • Factors affecting convergence
    • Initial interval selection impacts speed and accuracy of convergence
    • Function characteristics: Unimodality ensures single minimum, continuity affects method choice
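The stopping criteria above can be combined into a single check; the tolerance names and defaults here are illustrative:

```python
def should_stop(x_prev, x_curr, f_prev, f_curr, k,
                abs_tol=1e-8, rel_tol=1e-8, step_tol=1e-8, max_iter=100):
    """Return True when any standard stopping criterion fires."""
    if k >= max_iter:                                   # iteration cap
        return True
    if abs(f_curr - f_prev) < abs_tol:                  # absolute tolerance
        return True
    if f_curr != 0 and abs(f_curr - f_prev) / abs(f_curr) < rel_tol:
        return True                                     # relative tolerance
    if abs(x_curr - x_prev) < step_tol:                 # step / interval width
        return True
    return False
```

In practice solvers test several criteria together, since any one of them can misfire (e.g., the relative test near $f(x_k) = 0$).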

Application to directional optimization problems

  • Line search concept
    • Optimizes along a single search direction, reducing a multi-dimensional problem to a sequence of one-dimensional ones
    • Integrates with methods like gradient descent, Newton's method
  • Bracketing the minimum
    • Initial interval finding: Exponential search, parabolic extrapolation
    • Expanding: Double interval size until minimum bracketed
    • Contracting: Narrow interval using chosen search method
  • Unimodality assumption
    • Unimodal function: Single minimum in given interval
    • Importance: Ensures convergence to global minimum within interval
    • Violation consequences: May converge to local minimum, fail to find global optimum
  • Handling constraints
    • Transformation: Convert constrained to unconstrained problem (logarithmic barrier)
    • Penalty methods: Add penalty term for constraint violations (quadratic penalty)
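The expanding-bracket idea above can be sketched as follows; the initial step and growth factor are illustrative defaults:

```python
def bracket_minimum(f, x0, step=1.0, grow=2.0, max_iter=60):
    """Expand an interval by repeated doubling until it brackets a minimum:
    returns (a, b, c) with a < b < c and f(b) < f(a), f(b) < f(c)."""
    a, b = x0, x0 + step
    fa, fb = f(a), f(b)
    if fb > fa:                       # ensure we walk downhill from a to b
        a, b, fa, fb = b, a, fb, fa
    for _ in range(max_iter):
        c = b + grow * (b - a)        # stride doubles each iteration
        fc = f(c)
        if fc > fb:                   # f turned back up: minimum bracketed
            return (a, b, c) if a < c else (c, b, a)
        a, b, fa, fb = b, c, fb, fc
    raise RuntimeError("no bracket found (function may be monotone)")
```

Once `(a, b, c)` is found, any of the contracting methods above (golden section, quadratic interpolation) can take over inside the bracket.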

Efficiency of one-dimensional search algorithms

  • Time complexity analysis
    • Bisection: $O(\log(\frac{1}{\epsilon}))$ iterations, where $\epsilon$ is the desired accuracy
    • Golden section: also $O(\log(\frac{1}{\epsilon}))$, but cheaper per iteration (only one new function evaluation)
    • Quadratic interpolation: $O(\log(\log(\frac{1}{\epsilon})))$ under ideal conditions
  • Space complexity considerations
    • Bisection and golden section: $O(1)$ constant memory
    • Quadratic interpolation: $O(1)$, storing only a few points each iteration
  • Trade-offs between accuracy and speed
    • Faster methods (quadratic) may sacrifice robustness
    • Slower methods (bisection) offer guaranteed convergence
  • Numerical stability
    • Rounding errors can accumulate and degrade accuracy (notably in quadratic interpolation)
    • Improve stability: Use higher precision arithmetic, regularization techniques
  • Performance metrics
    • Function evaluations: Measure computational cost (golden section typically fewer)
    • Iterations: Indicate convergence speed (quadratic often fewer)
    • Achieved accuracy: Final error magnitude relative to true minimum
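As a rough comparison, one can compute how many interval reductions each method needs for a given accuracy. Bisection halves the interval per iteration while golden section shrinks it by only ≈0.618, but golden section pays for just one new function evaluation per iteration:

```python
import math

def iterations_needed(width, eps, reduction):
    """Iterations for an interval method with per-iteration reduction
    factor `reduction` to shrink `width` below `eps`."""
    return math.ceil(math.log(eps / width) / math.log(reduction))

w, eps = 1.0, 1e-6
bisect_iters = iterations_needed(w, eps, 0.5)                     # halves each step
golden_iters = iterations_needed(w, eps, (math.sqrt(5) - 1) / 2)  # ~0.618 each step
```

Golden section needs more iterations here, so the comparison favors it only when counting function evaluations rather than iterations, and bisection additionally requires derivative (or paired-evaluation) information at each midpoint.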