Probability and expected values connect calculus with real-world modeling of uncertainty. By extending integration to two dimensions, you can compute probabilities, averages, and measures of spread for pairs of continuous random variables.
This topic covers joint and marginal probability density functions, conditional probability, and expected values/variances, all computed through double integrals.
Joint and Marginal Probability Density Functions
Defining Joint and Marginal Probability Density Functions
A joint probability density function $f(x, y)$ describes how probability is distributed across all possible pairs of values for two continuous random variables $X$ and $Y$. Think of it as a surface over the $xy$-plane whose "volume" under any region gives the probability of landing in that region.
Every valid joint pdf must satisfy two properties:
- Non-negativity: $f(x, y) \ge 0$ for all $x$ and $y$
- Total probability equals 1: $\int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f(x, y)\, dy\, dx = 1$
A marginal probability density function isolates one variable by "integrating out" the other. If you only care about $X$, you sum up (integrate) all the contributions from every possible $y$:

$$f_X(x) = \int_{-\infty}^{\infty} f(x, y)\, dy$$

Similarly, the marginal pdf of $Y$ is:

$$f_Y(y) = \int_{-\infty}^{\infty} f(x, y)\, dx$$
The marginal pdf reduces a two-variable problem back to a single-variable one, which is useful when you need the distribution of just one of the two variables.
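As a sketch of how "integrating out" works numerically, the snippet below approximates a marginal with a midpoint Riemann sum. The joint density is an assumption chosen for illustration: $f(x, y) = 2$ on the triangle $0 \le y \le x \le 1$, whose exact marginal is $f_X(x) = 2x$.

```python
def f(x, y):
    # Assumed example density: f(x, y) = 2 on the triangle
    # 0 <= y <= x <= 1, and 0 elsewhere.
    return 2.0 if 0.0 <= y <= x <= 1.0 else 0.0

def marginal_x(x, n=10_000):
    # "Integrate out" y with a midpoint Riemann sum over [0, 1],
    # which contains the support of f for every fixed x.
    h = 1.0 / n
    return sum(f(x, (j + 0.5) * h) for j in range(n)) * h

print(marginal_x(0.5))  # ~1.0, matching the exact marginal f_X(x) = 2x
```

Swapping the roles of the arguments inside the sum gives the marginal of $Y$ in the same way.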
Calculating Probabilities Using Double Integrals
To find the probability that $(X, Y)$ lands in some region $R$ of the $xy$-plane, integrate the joint pdf over that region:

$$P((X, Y) \in R) = \iint_R f(x, y)\, dA$$

Here $dA$ is the area element ($dx\, dy$ or $dy\, dx$, depending on your order of integration), and $R$ is whatever region the problem specifies.

Example: Suppose $f(x, y) = 2$ on the triangular region where $0 \le x \le 1$ and $0 \le y \le x$ (and $f(x, y) = 0$ elsewhere). To verify this is a valid pdf, check that the total integral equals 1:

$$\int_0^1 \int_0^x 2\, dy\, dx = \int_0^1 2x\, dx = 1$$

To find the probability that, say, $X \le a$ and $Y \le b$, you'd integrate over the appropriate sub-region of the triangle.
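Both the validity check and a region probability can be sketched numerically. The triangle bounds $0 \le y \le x \le 1$ below are an assumption consistent with the density integrating to 1; for $P(X \le 1/2)$ the exact answer is $\int_0^{1/2} 2x\, dx = 1/4$.

```python
def f(x, y):
    # Assumed example density: f(x, y) = 2 on the triangle
    # 0 <= y <= x <= 1, and 0 elsewhere.
    return 2.0 if 0.0 <= y <= x <= 1.0 else 0.0

def double_integral(g, n=400):
    # Midpoint-rule approximation of the double integral of g
    # over the unit square, which contains the triangle.
    h = 1.0 / n
    return sum(g((i + 0.5) * h, (j + 0.5) * h)
               for i in range(n) for j in range(n)) * h * h

total = double_integral(f)  # total probability, should be ~1
# P(X <= 1/2): integrate f only over the part of the plane with x <= 1/2
p = double_integral(lambda x, y: f(x, y) if x <= 0.5 else 0.0)
print(total, p)  # ~1.0 and ~0.25
```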
Conditional Probability

Definition and Formula
Conditional probability answers the question: given that $X$ takes a particular value $x$, how is $Y$ distributed?

The conditional pdf of $Y$ given $X = x$ is:

$$f_{Y|X}(y \mid x) = \frac{f(x, y)}{f_X(x)}$$

You're dividing the joint density by the marginal density of the known variable. This "slices" the joint distribution at a fixed $x$ and rescales it so it integrates to 1 over $y$.

The symmetric version gives the conditional pdf of $X$ given $Y = y$:

$$f_{X|Y}(x \mid y) = \frac{f(x, y)}{f_Y(y)}$$
Calculating Conditional Probabilities
Once you have the conditional pdf, computing a conditional probability is a single-variable integral.
To find $P(a \le Y \le b \mid X = x)$:
- Compute the marginal $f_X(x)$ by integrating $f(x, y)$ over all $y$.
- Form the conditional pdf: $f_{Y|X}(y \mid x) = f(x, y) / f_X(x)$.
- Integrate over the desired range of $y$:

$$P(a \le Y \le b \mid X = x) = \int_a^b f_{Y|X}(y \mid x)\, dy$$

The same steps apply (with the roles of $X$ and $Y$ swapped) for $P(c \le X \le d \mid Y = y)$.
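The steps above can be sketched numerically. The triangular density (with assumed bounds $0 \le y \le x \le 1$) is carried over from the earlier example; at $x = 0.5$ the conditional distribution of $Y$ is uniform on $[0, 0.5]$, so $P(Y \le 0.25 \mid X = 0.5) = 0.5$.

```python
def f(x, y):
    # Assumed example density: f(x, y) = 2 on the triangle
    # 0 <= y <= x <= 1, and 0 elsewhere.
    return 2.0 if 0.0 <= y <= x <= 1.0 else 0.0

def marginal_x(x, n=10_000):
    # Step 1: marginal of X, integrating out y with a midpoint sum.
    h = 1.0 / n
    return sum(f(x, (j + 0.5) * h) for j in range(n)) * h

def cond_prob(a, b, x, n=10_000):
    # Steps 2-3: P(a <= Y <= b | X = x) = (integral of f(x, y) dy
    # from a to b) divided by the marginal f_X(x).
    h = (b - a) / n
    numerator = sum(f(x, a + (j + 0.5) * h) for j in range(n)) * h
    return numerator / marginal_x(x)

print(cond_prob(0.0, 0.25, 0.5))  # ~0.5
```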
Expected Value and Variance

Expected Value
The expected value (mean) of a random variable tells you the "center of mass" of its distribution.
For a single variable $X$ with pdf $f(x)$:

$$E[X] = \int_{-\infty}^{\infty} x f(x)\, dx$$

When you're working with a joint pdf and want the expected value of $X$, you integrate $x$ weighted by the joint density over the entire plane:

$$E[X] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x f(x, y)\, dy\, dx$$

This is equivalent to computing $E[X] = \int_{-\infty}^{\infty} x f_X(x)\, dx$ using the marginal, but sometimes it's easier to work directly with the joint pdf.

More generally, for any function $g(X, Y)$:

$$E[g(X, Y)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x, y) f(x, y)\, dy\, dx$$
This formula is the workhorse for computing variances, covariances, and other quantities.
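As a quick numerical check of the $E[g(X, Y)]$ formula, the sketch below plugs $g(x, y) = x$ and $g(x, y) = y$ into a midpoint-rule double integral, again assuming the triangular density $f(x, y) = 2$ on $0 \le y \le x \le 1$; the exact answers are $E[X] = 2/3$ and $E[Y] = 1/3$.

```python
def f(x, y):
    # Assumed example density: f(x, y) = 2 on the triangle
    # 0 <= y <= x <= 1, and 0 elsewhere.
    return 2.0 if 0.0 <= y <= x <= 1.0 else 0.0

def double_integral(g, n=400):
    # Midpoint-rule approximation over the unit square.
    h = 1.0 / n
    return sum(g((i + 0.5) * h, (j + 0.5) * h)
               for i in range(n) for j in range(n)) * h * h

# E[g(X, Y)] = double integral of g(x, y) * f(x, y)
E_X = double_integral(lambda x, y: x * f(x, y))  # exact: 2/3
E_Y = double_integral(lambda x, y: y * f(x, y))  # exact: 1/3
print(E_X, E_Y)
```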
Variance
Variance measures how spread out a distribution is around its mean $\mu = E[X]$:

$$\text{Var}(X) = E[(X - \mu)^2]$$

The computational shortcut is often easier to use in practice:

$$\text{Var}(X) = E[X^2] - (E[X])^2$$

where $E[X^2] = \int_{-\infty}^{\infty} x^2 f(x)\, dx$ (or the corresponding double integral against a joint pdf).
Steps to compute variance:
- Find $E[X]$.
- Find $E[X^2]$ using the same integration technique but with $x^2$ in the integrand.
- Subtract: $\text{Var}(X) = E[X^2] - (E[X])^2$.
The standard deviation $\sigma = \sqrt{\text{Var}(X)}$ puts the spread back in the same units as $X$.
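The three steps can be sketched for the assumed triangular density ($f(x, y) = 2$ on $0 \le y \le x \le 1$), where $E[X] = 2/3$, $E[X^2] = 1/2$, and so $\text{Var}(X) = 1/2 - 4/9 = 1/18$.

```python
from math import sqrt

def f(x, y):
    # Assumed example density: f(x, y) = 2 on the triangle
    # 0 <= y <= x <= 1, and 0 elsewhere.
    return 2.0 if 0.0 <= y <= x <= 1.0 else 0.0

def double_integral(g, n=400):
    # Midpoint-rule approximation over the unit square.
    h = 1.0 / n
    return sum(g((i + 0.5) * h, (j + 0.5) * h)
               for i in range(n) for j in range(n)) * h * h

E_X = double_integral(lambda x, y: x * f(x, y))       # step 1, exact: 2/3
E_X2 = double_integral(lambda x, y: x * x * f(x, y))  # step 2, exact: 1/2
var_x = E_X2 - E_X ** 2                               # step 3, exact: 1/18
sigma_x = sqrt(var_x)                                 # standard deviation
print(var_x)  # close to 1/18 ≈ 0.0556
```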
Covariance and Correlation
Covariance
Covariance quantifies how two random variables move together. If large values of $X$ tend to occur with large values of $Y$, the covariance is positive; if large $X$ pairs with small $Y$, it's negative. The definition:

$$\text{Cov}(X, Y) = E[(X - \mu_X)(Y - \mu_Y)]$$

The computational shortcut:

$$\text{Cov}(X, Y) = E[XY] - E[X]E[Y]$$

where $E[XY] = \iint xy\, f(x, y)\, dA$.
- Positive covariance: $X$ and $Y$ tend to increase together.
- Negative covariance: one tends to increase as the other decreases.
- Zero covariance: no linear relationship (but a nonlinear relationship could still exist).
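The shortcut can be checked numerically on the assumed triangular density ($f(x, y) = 2$ on $0 \le y \le x \le 1$): there $E[XY] = 1/4$, $E[X] = 2/3$, $E[Y] = 1/3$, so $\text{Cov}(X, Y) = 1/4 - 2/9 = 1/36$, positive because the constraint $Y \le X$ makes the variables move together.

```python
def f(x, y):
    # Assumed example density: f(x, y) = 2 on the triangle
    # 0 <= y <= x <= 1, and 0 elsewhere.
    return 2.0 if 0.0 <= y <= x <= 1.0 else 0.0

def double_integral(g, n=400):
    # Midpoint-rule approximation over the unit square.
    h = 1.0 / n
    return sum(g((i + 0.5) * h, (j + 0.5) * h)
               for i in range(n) for j in range(n)) * h * h

E_X = double_integral(lambda x, y: x * f(x, y))       # exact: 2/3
E_Y = double_integral(lambda x, y: y * f(x, y))       # exact: 1/3
E_XY = double_integral(lambda x, y: x * y * f(x, y))  # exact: 1/4
cov = E_XY - E_X * E_Y  # exact: 1/4 - 2/9 = 1/36 > 0
print(cov)
```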
Correlation Coefficient
Covariance depends on the scale of $X$ and $Y$, which makes its magnitude hard to interpret. The correlation coefficient normalizes covariance to a dimensionless number:

$$\rho = \frac{\text{Cov}(X, Y)}{\sigma_X \sigma_Y}$$
Properties of $\rho$:
- Always satisfies $-1 \le \rho \le 1$
- $\rho = 1$: perfect positive linear relationship
- $\rho = -1$: perfect negative linear relationship
- $\rho = 0$: no linear relationship (same caveat as covariance about nonlinear dependence)
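Putting everything together for the assumed triangular density ($f(x, y) = 2$ on $0 \le y \le x \le 1$): $\text{Var}(X) = \text{Var}(Y) = 1/18$ and $\text{Cov}(X, Y) = 1/36$, so $\rho = (1/36)/(1/18) = 1/2$, a moderate positive linear relationship.

```python
from math import sqrt

def f(x, y):
    # Assumed example density: f(x, y) = 2 on the triangle
    # 0 <= y <= x <= 1, and 0 elsewhere.
    return 2.0 if 0.0 <= y <= x <= 1.0 else 0.0

def double_integral(g, n=400):
    # Midpoint-rule approximation over the unit square.
    h = 1.0 / n
    return sum(g((i + 0.5) * h, (j + 0.5) * h)
               for i in range(n) for j in range(n)) * h * h

E_X = double_integral(lambda x, y: x * f(x, y))
E_Y = double_integral(lambda x, y: y * f(x, y))
var_x = double_integral(lambda x, y: x * x * f(x, y)) - E_X ** 2
var_y = double_integral(lambda x, y: y * y * f(x, y)) - E_Y ** 2
cov = double_integral(lambda x, y: x * y * f(x, y)) - E_X * E_Y
rho = cov / sqrt(var_x * var_y)  # dimensionless, between -1 and 1
print(rho)  # close to the exact value 1/2
```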