
🔢 Lower Division Math Foundations

Common Mathematical Inequalities


Why This Matters

Mathematical inequalities aren't just abstract formulas to memorize—they're the workhorses that let mathematicians establish bounds, prove theorems, and solve optimization problems. When you're working through calculus, linear algebra, or probability, you'll constantly need to show that one quantity is larger or smaller than another. These inequalities give you the tools to do exactly that. You're being tested on your ability to recognize when to apply each inequality, understand the conditions required for each to hold, and connect them to broader mathematical structures like norms, convexity, and expected values.

Think of inequalities as a toolkit: some handle geometric relationships, others tame products and sums, and still others give you control over random variables. The key is understanding which tool fits which job. Don't just memorize the formulas—know what mathematical principle each inequality captures and when its conditions are satisfied. That conceptual understanding will serve you far better on exams than rote recall.


Geometric and Metric Inequalities

These inequalities establish fundamental properties of distance and magnitude. They're the foundation for understanding how distances behave in mathematical spaces.

Triangle Inequality

  • Absolute values satisfy $|a + b| \leq |a| + |b|$—the "length" of a sum never exceeds the sum of lengths (see the quick check below)
  • Geometric interpretation: in any triangle, the sum of any two side lengths must exceed the third side
  • Foundation for metrics—any valid distance function must satisfy this property, making it essential for analysis
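
A minimal sanity check in Python (the sampling range and seed are arbitrary choices): it spot-checks the inequality on random reals and shows that equality holds exactly when the two numbers share a sign.

```python
import random

# Spot-check |a + b| <= |a| + |b| on randomly chosen reals.
random.seed(0)
for _ in range(10_000):
    a = random.uniform(-100, 100)
    b = random.uniform(-100, 100)
    assert abs(a + b) <= abs(a) + abs(b)

# Equality holds exactly when a and b share a sign (or one is zero).
print(abs(3 + 5) == abs(3) + abs(5))    # True: same sign
print(abs(-3 + 5) == abs(-3) + abs(5))  # False: opposite signs
```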

Minkowski's Inequality

  • Generalizes the triangle inequality to $L^p$ spaces: $\|x + y\|_p \leq \|x\|_p + \|y\|_p$
  • Works for any $p \geq 1$, where the $p$-norm is defined as $\|x\|_p = \left(\sum |x_i|^p\right)^{1/p}$
  • Proves that $L^p$ spaces are normed vector spaces—without this inequality, the $p$-norm couldn't define a proper distance in these function spaces (numeric check below)
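
As a numeric sanity check (not a proof), this sketch samples random vectors with numpy and verifies Minkowski's inequality for several exponents; the dimension, exponents, and seed are arbitrary choices.

```python
import numpy as np

# Verify ||x + y||_p <= ||x||_p + ||y||_p for a few exponents p >= 1.
rng = np.random.default_rng(0)
for p in (1, 1.5, 2, 3, 10):
    for _ in range(1_000):
        x = rng.normal(size=8)
        y = rng.normal(size=8)
        lhs = np.linalg.norm(x + y, ord=p)
        rhs = np.linalg.norm(x, ord=p) + np.linalg.norm(y, ord=p)
        assert lhs <= rhs + 1e-9  # tolerance for floating-point rounding
print("Minkowski held for every sampled pair and exponent")
```

Note that $p = 1$ on one-dimensional vectors reduces to the ordinary triangle inequality for absolute values.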

Compare: Triangle Inequality vs. Minkowski's Inequality—both establish that "distance to a sum ≤ sum of distances," but Minkowski generalizes this to arbitrary $p$-norms. If a problem involves $L^p$ spaces or weighted norms, reach for Minkowski; for basic absolute values, the triangle inequality suffices.


Product and Sum Bounding Inequalities

These inequalities help you control products by relating them to sums—crucial when you need to bound a complicated product using simpler terms.

Cauchy-Schwarz Inequality

  • The workhorse of linear algebra: $(a_1b_1 + \cdots + a_nb_n)^2 \leq (a_1^2 + \cdots + a_n^2)(b_1^2 + \cdots + b_n^2)$
  • Equivalently written using inner products as $|\langle u, v \rangle|^2 \leq \langle u, u \rangle \cdot \langle v, v \rangle$
  • Equality holds exactly when the vectors are linearly dependent (one is a scalar multiple of the other)—use this condition to identify when bounds are tight, as in the sketch below
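
The sketch below (numpy; the dimensions and seed are arbitrary choices) spot-checks the inequality on random vectors and confirms that a scalar multiple attains equality.

```python
import numpy as np

# Spot-check (u . v)^2 <= (u . u)(v . v) on random vectors.
rng = np.random.default_rng(1)
for _ in range(1_000):
    u = rng.normal(size=5)
    v = rng.normal(size=5)
    assert np.dot(u, v) ** 2 <= np.dot(u, u) * np.dot(v, v) + 1e-9

# Tightness: v = 3u is linearly dependent on u, so the bound is attained.
u = np.array([1.0, 2.0, -1.0])
v = 3 * u
print(np.isclose(np.dot(u, v) ** 2, np.dot(u, u) * np.dot(v, v)))  # True
```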

Hölder's Inequality

  • Generalizes Cauchy-Schwarz to conjugate exponents: $\sum |a_i b_i| \leq \left(\sum |a_i|^p\right)^{1/p} \left(\sum |b_i|^q\right)^{1/q}$ where $\frac{1}{p} + \frac{1}{q} = 1$
  • Cauchy-Schwarz is the special case $p = q = 2$—recognizing this connection helps you choose the right tool
  • Essential for functional analysis and proving integrability results in $L^p$ spaces (see the check below)
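
Here is a hedged numeric check (random data; the exponent choices are arbitrary) that pairs each $p$ with its conjugate $q = \frac{p}{p-1}$ and verifies the bound.

```python
import numpy as np

# Verify sum|a_i b_i| <= (sum|a_i|^p)^(1/p) * (sum|b_i|^q)^(1/q).
rng = np.random.default_rng(2)
for p in (1.5, 2, 3, 4):
    q = p / (p - 1)  # conjugate exponent: 1/p + 1/q = 1
    for _ in range(1_000):
        a = rng.normal(size=6)
        b = rng.normal(size=6)
        lhs = np.sum(np.abs(a * b))
        rhs = np.sum(np.abs(a) ** p) ** (1 / p) * np.sum(np.abs(b) ** q) ** (1 / q)
        assert lhs <= rhs + 1e-9
print("Hölder held; p = q = 2 reproduces Cauchy-Schwarz")
```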

Young's Inequality

  • Bounds products by sums of powers: $ab \leq \frac{a^p}{p} + \frac{b^q}{q}$ for non-negative $a, b$ and conjugate exponents $\frac{1}{p} + \frac{1}{q} = 1$
  • Often used to prove Hölder's inequality—it sits one level lower in the hierarchy of tools
  • Key technique: when you see a product you need to bound, try splitting it using Young's with strategic exponent choices (demonstrated below)
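
A minimal sketch of that splitting technique (non-negative samples and exponents chosen arbitrarily):

```python
import numpy as np

# Check ab <= a^p/p + b^q/q for non-negative a, b and conjugate exponents.
rng = np.random.default_rng(3)
for p in (1.5, 2, 3):
    q = p / (p - 1)  # conjugate exponent
    a = rng.uniform(0, 10, size=10_000)
    b = rng.uniform(0, 10, size=10_000)
    assert np.all(a * b <= a**p / p + b**q / q + 1e-9)
print("Young's inequality held on all samples")
```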

Compare: Cauchy-Schwarz vs. Hölder's—Cauchy-Schwarz uses the $L^2$ norm (sum of squares), while Hölder's works for any conjugate pair $(p, q)$. Default to Cauchy-Schwarz unless the problem specifically involves other $L^p$ norms.


Mean and Averaging Inequalities

These inequalities relate different types of averages and reveal how convexity shapes the behavior of functions applied to means.

Arithmetic Mean-Geometric Mean (AM-GM) Inequality

  • For non-negative reals: $\frac{x_1 + x_2 + \cdots + x_n}{n} \geq (x_1 x_2 \cdots x_n)^{1/n}$—the arithmetic mean beats the geometric mean
  • Equality holds exactly when all values are equal—this is your key to optimization problems
  • Go-to tool for optimization: when maximizing a product subject to a sum constraint (or vice versa), AM-GM often gives the answer directly, as in the sketch below
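
For instance, among positive $x, y, z$ with $x + y + z = 12$, AM-GM gives $xyz \leq (12/3)^3 = 64$, with equality at $x = y = z = 4$. The sketch below (a random search; the constraint value 12 is an arbitrary choice) illustrates that no sampled triple beats the bound.

```python
import numpy as np

# Random search over x + y + z = 12 never exceeds the AM-GM bound (12/3)^3.
rng = np.random.default_rng(4)
best = 0.0
for _ in range(100_000):
    x, y = rng.uniform(0, 12, size=2)
    z = 12 - x - y
    if z > 0:
        best = max(best, x * y * z)
print(best)            # approaches but never exceeds 64
print((12 / 3) ** 3)   # AM-GM bound: 64.0
```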

Jensen's Inequality

  • For convex functions $f$: $f\left(\frac{x_1 + \cdots + x_n}{n}\right) \leq \frac{f(x_1) + \cdots + f(x_n)}{n}$
  • Inequality reverses for concave functions—always check the curvature before applying
  • Explains why AM-GM works: since $-\ln(x)$ is convex, Jensen's applied to logarithms yields AM-GM (see the check below)
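
The sketch below checks Jensen's for the convex function $e^x$ and then replays the AM-GM derivation through logarithms (the sample range and seed are arbitrary choices).

```python
import numpy as np

# Jensen's for convex f(x) = e^x: f(mean(x)) <= mean(f(x)).
rng = np.random.default_rng(5)
x = rng.uniform(0.1, 5.0, size=1_000)
print(np.exp(x.mean()) <= np.exp(x).mean())  # True

# -ln is convex, so mean(ln x) <= ln(mean(x)); exponentiating
# gives geometric mean <= arithmetic mean, i.e. AM-GM.
geo = np.exp(np.log(x).mean())
print(geo <= x.mean())  # True
```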

Compare: AM-GM vs. Jensen's—AM-GM is actually a special case of Jensen's (applied to the convex function $-\ln x$). Use AM-GM for quick product/sum problems; use Jensen's when you're working with a specific convex or concave function and need to relate $f(\text{average})$ to the average of $f$.


Growth and Approximation Inequalities

This inequality captures how exponential-type growth behaves; it's particularly useful for approximations and induction proofs.

Bernoulli's Inequality

  • For $x \geq -1$ and integer $n \geq 0$: $(1 + x)^n \geq 1 + nx$
  • Provides a linear lower bound on exponential growth—the bound is tight when $x$ is small
  • Perfect for induction proofs and establishing limits; often appears when bounding $(1 + \frac{1}{n})^n$-type expressions (quick check below)
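
A quick check of the bound and of its tightness near $x = 0$ (the sampling ranges are arbitrary choices):

```python
import random

# Verify (1 + x)^n >= 1 + n*x for x >= -1 and integer n >= 0.
random.seed(6)
for _ in range(10_000):
    x = random.uniform(-1, 5)
    n = random.randint(0, 20)
    assert (1 + x) ** n >= 1 + n * x - 1e-9

# Near x = 0 the linear lower bound is nearly exact.
x, n = 0.001, 10
print((1 + x) ** n, 1 + n * x)  # 1.01004... vs 1.01
```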

Probabilistic Bounding Inequalities

These inequalities let you control probabilities without knowing the full distribution—essential for worst-case analysis in statistics and probability.

Markov's Inequality

  • For a non-negative random variable $X$ and $a > 0$: $P(X \geq a) \leq \frac{E[X]}{a}$
  • Requires only the expected value—no variance or distribution shape needed
  • Often gives weak bounds but works universally; it's your baseline tool when you know almost nothing about $X$ (compare bound vs. reality in the simulation below)
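
The simulation below compares Markov's bound with the actual tail; the exponential distribution is an arbitrary choice of non-negative $X$, made only for illustration.

```python
import numpy as np

# Markov's bound E[X]/a vs. the empirical tail P(X >= a).
rng = np.random.default_rng(7)
x = rng.exponential(scale=1.0, size=1_000_000)  # E[X] = 1
for a in (2, 4, 8):
    empirical = np.mean(x >= a)
    bound = x.mean() / a
    print(f"a={a}: P(X>=a) ~ {empirical:.4f}, Markov bound = {bound:.4f}")
# The bound always holds but is typically far from tight.
```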

Chebyshev's Inequality

  • Bounds deviation from the mean: $P(|X - \mu| \geq k\sigma) \leq \frac{1}{k^2}$
  • Uses variance information to give tighter bounds than Markov—the tradeoff is you need to know $\sigma^2$
  • Distribution-free guarantee: no matter the shape, at least $1 - \frac{1}{k^2}$ of the probability lies within $k$ standard deviations (simulation below)
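
Continuing the same simulated example (exponential data, again an arbitrary choice), this sketch shows Chebyshev's bound against the empirical two-sided tail.

```python
import numpy as np

# Chebyshev's bound 1/k^2 vs. the empirical tail P(|X - mu| >= k*sigma).
rng = np.random.default_rng(8)
x = rng.exponential(scale=1.0, size=1_000_000)
mu, sigma = x.mean(), x.std()
for k in (2, 3, 4):
    empirical = np.mean(np.abs(x - mu) >= k * sigma)
    print(f"k={k}: empirical ~ {empirical:.4f}, Chebyshev bound = {1/k**2:.4f}")
```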

Compare: Markov's vs. Chebyshev's—Markov requires only $E[X]$ and works for any non-negative variable; Chebyshev requires $E[X]$ and $\text{Var}(X)$ but gives tighter bounds on deviation from the mean. If a problem gives you variance, use Chebyshev; if you only have the expected value, Markov is your only option.


Quick Reference Table

| Concept | Best Examples |
| --- | --- |
| Distance/metric properties | Triangle Inequality, Minkowski's Inequality |
| Bounding products by sums | Cauchy-Schwarz, Hölder's, Young's Inequality |
| Comparing means | AM-GM Inequality, Jensen's Inequality |
| Convexity applications | Jensen's Inequality |
| Exponential growth bounds | Bernoulli's Inequality |
| Probability tail bounds | Markov's Inequality, Chebyshev's Inequality |
| $L^p$ space structure | Hölder's Inequality, Minkowski's Inequality |
| Optimization | AM-GM Inequality, Cauchy-Schwarz |

Self-Check Questions

  1. Both Cauchy-Schwarz and Hölder's inequality bound sums of products. What condition on the exponents makes Cauchy-Schwarz a special case of Hölder's?

  2. You're given a non-negative random variable with known mean but unknown variance. Which inequality can you use to bound $P(X \geq a)$, and what would you need to know to get a tighter bound?

  3. Compare and contrast AM-GM and Jensen's inequality: How is AM-GM derived from Jensen's, and when would you prefer one over the other?

  4. The triangle inequality and Minkowski's inequality both establish that "the norm of a sum is at most the sum of norms." What distinguishes their domains of application?

  5. If an exam problem asks you to prove that a product $ab$ is bounded above by a sum of powers of $a$ and $b$, which inequality should you reach for first, and what condition must the exponents satisfy?