🎲Intro to Statistics Unit 5 Review


5.4 Continuous Distribution

Written by the Fiveable Content Team • Last updated August 2025

Continuous Probability Distributions

Continuous probability distributions model variables that can take any value within a range, like time, height, or weight. Unlike discrete distributions (where you count outcomes), continuous distributions use curves and areas to represent probabilities. This section covers the core tools for working with them: probability density functions, cumulative distribution functions, and the statistics that summarize their shape.

Continuous Probability Distributions

A continuous random variable can take on infinitely many values within some interval. Because of this, you can't assign a probability to a single exact value. Instead, you work with ranges.

A probability density function (PDF) is the mathematical function that defines the shape of a continuous distribution. Two rules govern every PDF:

  • The area under the curve between two points gives the probability of the variable falling in that range.
  • The total area under the entire curve equals 1, since the variable has to take some value.

Continuous distributions show up constantly in practice:

  • Time until an event occurs: how long before a light bulb burns out, or how long between customer arrivals
  • Physical measurements: heights of adults, weights of packages, daily temperatures
  • Measurement errors: how far off a forecast or instrument reading is from the true value

Cumulative and Density Functions

These are the two main functions you'll use to describe and calculate probabilities for continuous distributions.

Probability Density Function (PDF), written f(x), describes the relative likelihood of each value. A few properties to remember:

  • f(x) \geq 0 for all x (the curve never dips below zero)
  • \int_{-\infty}^{\infty} f(x) \, dx = 1 (total area under the curve is 1)
  • The support of a PDF is the set of x values where f(x) > 0. Outside the support, the function equals zero.

One thing that trips students up: f(x) itself is not a probability. It's a density. Only the area under the curve over an interval gives you an actual probability.
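To make this concrete, here's a quick numeric check. It's a sketch assuming a hypothetical exponential PDF f(x) = λe^{-λx} with λ = 2: the density at 0 is 2 (greater than 1, so clearly not a probability by itself), yet the total area under the curve is still 1.

```python
import math

# Hypothetical example: exponential PDF f(x) = lam * e^(-lam * x) for x >= 0
lam = 2.0
f = lambda x: lam * math.exp(-lam * x)

def trapezoid_area(g, a, b, n=100_000):
    """Approximate the area under g between a and b with the trapezoid rule."""
    h = (b - a) / n
    total = 0.5 * (g(a) + g(b)) + sum(g(a + i * h) for i in range(1, n))
    return total * h

density_at_zero = f(0.0)                   # 2.0 -- a density can exceed 1
total_area = trapezoid_area(f, 0.0, 20.0)  # tail beyond x = 20 is negligible

print(density_at_zero)        # 2.0
print(round(total_area, 4))   # ≈ 1.0
```

The same helper can integrate f over any interval [a, b] to get P(a < X < b), which is all a PDF is ever used for.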

Cumulative Distribution Function (CDF), written F(x), gives the probability that the variable is less than or equal to x:

F(x) = P(X \leq x) = \int_{-\infty}^{x} f(t) \, dt

The CDF starts at 0 (far left) and climbs to 1 (far right). It's always non-decreasing.

Calculating probabilities using these functions:

  1. Probability between two values: P(a < X \leq b) = F(b) - F(a) = \int_{a}^{b} f(x) \, dx

  2. Probability at a single point: P(X = x) = 0 for any exact value. This is because a single point has no width, so the area under the curve there is zero. That's why continuous probabilities are always about intervals, not exact values.
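Both rules can be checked directly. A minimal sketch, assuming a hypothetical exponential distribution whose CDF has the closed form F(x) = 1 - e^{-λx}:

```python
import math

lam = 0.5  # hypothetical rate parameter for an exponential distribution

def F(x):
    """CDF of the exponential distribution: F(x) = 1 - e^(-lam*x) for x >= 0."""
    return 1 - math.exp(-lam * x) if x >= 0 else 0.0

# Rule 1: probability over an interval, P(1 < X <= 3) = F(3) - F(1)
p_between = F(3) - F(1)
print(round(p_between, 4))   # ≈ 0.3834

# Rule 2: as the interval around a single point shrinks, the probability -> 0
for width in (1.0, 0.01, 0.0001):
    print(F(2 + width / 2) - F(2 - width / 2))
```

The shrinking-interval loop prints probabilities that head toward zero, which is the numerical face of P(X = x) = 0.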

[Figure: a continuous probability distribution (the normal curve), via Boundless Statistics]

Interpreting Distribution Statistics

Three statistics summarize the center and spread of a continuous distribution.

Mean (μ) is the expected value, calculated by:

\mu = E(X) = \int_{-\infty}^{\infty} x \, f(x) \, dx

Think of it as the balancing point of the distribution. If you cut the PDF curve out of cardboard, μ is where it would balance on your finger.

Variance (σ²) measures how spread out the values are around the mean:

\sigma^2 = E((X - \mu)^2) = \int_{-\infty}^{\infty} (x - \mu)^2 \, f(x) \, dx

A larger variance means the distribution is more spread out. A smaller variance means values cluster tightly around the mean.

Standard deviation (σ) is the square root of the variance:

\sigma = \sqrt{\sigma^2}

Standard deviation is more intuitive than variance because it's in the same units as the original data. If you're measuring heights in centimeters, σ is also in centimeters, while σ² is in "square centimeters."
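The three integrals above can be evaluated numerically. A sketch assuming the same hypothetical exponential PDF, for which the exact answers are known (μ = 1/λ, σ² = 1/λ²), so the approximations are easy to check:

```python
import math

lam = 0.5  # hypothetical rate; exact results are mu = 1/lam = 2, var = 1/lam^2 = 4
f = lambda x: lam * math.exp(-lam * x)

def integrate(g, a, b, n=200_000):
    """Trapezoid-rule approximation of the integral of g from a to b."""
    h = (b - a) / n
    return h * (0.5 * (g(a) + g(b)) + sum(g(a + i * h) for i in range(1, n)))

mu = integrate(lambda x: x * f(x), 0, 100)               # E(X)
var = integrate(lambda x: (x - mu) ** 2 * f(x), 0, 100)  # E((X - mu)^2)
sigma = math.sqrt(var)

print(round(mu, 3), round(var, 3), round(sigma, 3))  # ≈ 2.0, 4.0, 2.0
```

Note that σ comes out in the same units as x, while the variance is in squared units, exactly as described above.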

Practical uses of these statistics:

  • In most distributions, the bulk of the values fall within one to two standard deviations of the mean, so μ and σ together give you a quick sense of the "typical" range.
  • Lower variance means more precise measurements or more consistent outcomes.
  • Comparing μ and σ across groups tells you whether those groups behave similarly or differently.

Advanced Concepts in Continuous Distributions

These topics go slightly beyond the basics but are worth being aware of:

  • Moment-generating functions provide a systematic way to calculate the mean, variance, and higher moments of a distribution. They also uniquely identify a distribution, meaning two distributions with the same moment-generating function are identical.
  • Transformation of random variables deals with what happens when you apply a function to a random variable. For example, if X represents a length, what's the distribution of X² (an area)?
  • Continuity correction applies when you use a continuous distribution to approximate a discrete one. Since discrete distributions assign probability to individual points but continuous distributions don't, you adjust by adding or subtracting 0.5 to bridge the gap. For instance, to approximate P(X = 5) using a continuous distribution, you'd calculate P(4.5 < X < 5.5).
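The continuity correction can be illustrated numerically. A sketch assuming a hypothetical Binomial(n = 20, p = 0.25), so μ = np = 5 and σ = √(np(1-p)) ≈ 1.94, approximated by a normal curve:

```python
import math

# Hypothetical example: approximate Binomial(n=20, p=0.25) with a normal curve
n, p = 20, 0.25
mu = n * p                            # 5.0
sigma = math.sqrt(n * p * (1 - p))    # ≈ 1.94

def normal_cdf(x, mu, sigma):
    """Normal CDF expressed via the error function."""
    return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

# Exact discrete probability P(X = 5)
exact = math.comb(n, 5) * p**5 * (1 - p)**(n - 5)

# Continuous approximation with the 0.5 correction: P(4.5 < X < 5.5)
approx = normal_cdf(5.5, mu, sigma) - normal_cdf(4.5, mu, sigma)

print(round(exact, 4))   # ≈ 0.2023
print(round(approx, 4))  # ≈ 0.2037
```

Without the correction, P(5 < X < 5) under the normal curve would be exactly zero, which is why the half-unit widening is needed to stand in for the discrete point mass at 5.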