📊 Probability and Statistics Unit 5 Review

5.2 Joint probability density functions

Written by the Fiveable Content Team • Last updated August 2025

Joint probability density functions describe the likelihood of multiple continuous random variables occurring together. They're essential for analyzing relationships between variables in fields like finance, physics, and engineering.

These functions allow us to calculate probabilities, find marginal and conditional distributions, and determine independence between variables. Understanding joint PDFs is crucial for modeling complex systems and making predictions based on multiple interrelated factors.

Definition of joint probability density functions

  • Joint probability density functions (PDFs) are used to describe the probability distribution of two or more continuous random variables
  • Denoted f(x,y) for two random variables X and Y, the joint PDF gives the probability density at any point (x,y) in the two-dimensional space
  • The probability of the random variables falling within a specific region is given by the double integral of the joint PDF over that region

Properties of joint probability density functions

Non-negative values

  • Joint PDFs are always non-negative for all values of the random variables
    • f(x,y) \geq 0 for all x and y
  • This property ensures that probabilities calculated using joint PDFs are always non-negative and meaningful

Integration to one

  • The double integral of a joint PDF over the entire domain of the random variables must equal one
    • \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f(x,y) \, dy \, dx = 1
  • This property ensures that the total probability of all possible outcomes is one, as required by the axioms of probability
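Both properties can be checked numerically for a concrete density. A minimal sketch, using the assumed example f(x, y) = e^{-(x+y)} for x, y ≥ 0 (two independent Exp(1) variables):

```python
import numpy as np

# Hypothetical example density: f(x, y) = exp(-(x + y)) for x, y >= 0.
def f(x, y):
    return np.exp(-(x + y))

# Midpoint Riemann sum over [0, 20]^2; the tail beyond 20 is negligible.
n = 400
mids = np.linspace(0, 20, n, endpoint=False) + 20 / (2 * n)
X, Y = np.meshgrid(mids, mids)
cell = (20 / n) ** 2
vals = f(X, Y)
total = vals.sum() * cell

print(vals.min() >= 0, round(total, 3))  # non-negative, integrates to ~1
```

The grid bound of 20 and the resolution are arbitrary choices; any density satisfying the two properties would pass the same check.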

Marginal probability density functions

Definition of marginal probability density functions

  • Marginal PDFs describe the probability distribution of a single random variable, ignoring the others
  • For a joint PDF f(x,y), the marginal PDFs are denoted f_X(x) and f_Y(y)
  • Marginal PDFs are obtained by integrating the joint PDF over the domain of the other variable(s)

Relationship to joint probability density functions

  • The marginal PDF of X is given by f_X(x) = \int_{-\infty}^{\infty} f(x,y) \, dy
  • The marginal PDF of Y is given by f_Y(y) = \int_{-\infty}^{\infty} f(x,y) \, dx
  • Marginal PDFs can be used to calculate probabilities and moments (mean, variance) for individual random variables
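Integrating out the other variable can be done numerically. A sketch with the assumed joint PDF f(x, y) = x + y on the unit square, whose marginal is f_X(x) = x + 1/2:

```python
import numpy as np

# Hypothetical joint PDF f(x, y) = x + y on the unit square
# (non-negative there, and it integrates to 1).
def f(x, y):
    return x + y

# Marginal of X: integrate out y over [0, 1] by a midpoint sum;
# analytically f_X(x) = x + 1/2.
n = 1000
ys = np.linspace(0, 1, n, endpoint=False) + 1 / (2 * n)  # midpoints

def f_X(x):
    return np.sum(f(x, ys)) / n

print(round(f_X(0.3), 6))  # ≈ 0.8, matching 0.3 + 0.5
```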

Conditional probability density functions

Definition of conditional probability density functions

  • Conditional PDFs describe the probability distribution of one random variable given the value of another
  • For a joint PDF f(x,y), the conditional PDF of Y given X=x is denoted f_{Y|X}(y|x)
  • Conditional PDFs are obtained by dividing the joint PDF by the marginal PDF of the conditioning variable

Relationship to joint probability density functions

  • The conditional PDF of Y given X=x is f_{Y|X}(y|x) = \frac{f(x,y)}{f_X(x)}, provided f_X(x) > 0
  • Similarly, the conditional PDF of X given Y=y is f_{X|Y}(x|y) = \frac{f(x,y)}{f_Y(y)}, provided f_Y(y) > 0
  • The joint PDF can be expressed as the product of a marginal PDF and a conditional PDF: f(x,y) = f_X(x) \cdot f_{Y|X}(y|x) = f_Y(y) \cdot f_{X|Y}(x|y)
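A quick sanity check on the division formula: for each fixed x, the resulting conditional PDF must itself integrate to 1 over y. A sketch using the assumed joint PDF f(x, y) = x + y on the unit square, with marginal f_X(x) = x + 1/2:

```python
import numpy as np

# Hypothetical joint PDF f(x, y) = x + y on the unit square;
# dividing by the marginal f_X(x) = x + 1/2 gives the conditional PDF.
def cond_pdf(y, x):
    return (x + y) / (x + 0.5)

# A valid conditional PDF integrates to 1 over y for each fixed x.
n = 1000
ys = np.linspace(0, 1, n, endpoint=False) + 1 / (2 * n)  # midpoints
total = np.sum(cond_pdf(ys, 0.3)) / n

print(round(total, 6))  # ≈ 1.0
```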

Independence of random variables

Definition of independence

  • Two random variables X and Y are independent if their joint PDF can be factored into the product of their marginal PDFs
    • f(x,y) = f_X(x) \cdot f_Y(y) for all x and y
  • Independence implies that the value of one random variable does not affect the probability distribution of the other

Joint probability density functions for independent variables

  • If XX and YY are independent, their joint PDF is simply the product of their marginal PDFs
  • This property simplifies the calculation of probabilities and moments for independent random variables
  • For example, if X \sim N(\mu_X, \sigma_X^2) and Y \sim N(\mu_Y, \sigma_Y^2) are independent, their joint PDF is the bivariate normal distribution with zero correlation
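The factorization test is easy to apply to a concrete density. A sketch with the assumed joint PDF f(x, y) = 4xy on the unit square, whose marginals are f_X(x) = 2x and f_Y(y) = 2y:

```python
import math
import random

# Hypothetical joint PDF f(x, y) = 4xy on the unit square.
def f(x, y):
    return 4 * x * y

# Its marginals: f_X(x) = 2x and f_Y(y) = 2y.
def f_X(x):
    return 2 * x

def f_Y(y):
    return 2 * y

# The joint PDF equals the product of the marginals everywhere,
# so X and Y are independent.
random.seed(0)
ok = all(
    math.isclose(f(x, y), f_X(x) * f_Y(y))
    for x, y in [(random.random(), random.random()) for _ in range(100)]
)
print(ok)  # True
```

By contrast, a density like f(x, y) = x + y on the same square does not factor, so those variables would be dependent.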

Transformations of random variables

Jacobian matrix

  • The Jacobian matrix is used to transform joint PDFs when changing variables
  • For a transformation from (X,Y) to (U,V) given by U = g(X,Y) and V = h(X,Y), the Jacobian matrix is:
    • J = \begin{bmatrix} \frac{\partial g}{\partial x} & \frac{\partial g}{\partial y} \\ \frac{\partial h}{\partial x} & \frac{\partial h}{\partial y} \end{bmatrix}
  • The absolute value of the determinant of the Jacobian matrix, |J|, is used to adjust the joint PDF under the transformation

Transforming joint probability density functions

  • If (X,Y) has a joint PDF f(x,y) and the transformation (U,V) = (g(X,Y), h(X,Y)) is one-to-one, the joint PDF of (U,V) is given by:
    • f_{U,V}(u,v) = f_{X,Y}(x(u,v), y(u,v)) \cdot |J|
    • where x(u,v) and y(u,v) are the inverse transformations and |J| is the absolute value of the Jacobian determinant of the inverse map \frac{\partial(x,y)}{\partial(u,v)} (equivalently, the reciprocal of the absolute determinant of the forward Jacobian above)
  • Transformations are useful for simplifying calculations or generating new distributions from known ones
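The change-of-variables formula can be sketched with the classic example U = X + Y, V = X − Y for independent standard normals (an assumed example, not from the text). The inverse map is x = (u+v)/2, y = (u−v)/2 with Jacobian determinant −1/2, so |J| = 1/2, and the result should match the known fact that U and V are independent N(0, 2) variables:

```python
import math

# Standard normal density.
def phi(z):
    return math.exp(-z * z / 2) / math.sqrt(2 * math.pi)

# Transformed joint density via the change-of-variables formula:
# f_{U,V}(u,v) = f_X(x(u,v)) * f_Y(y(u,v)) * |J|, with |J| = 1/2.
def f_UV(u, v):
    x, y = (u + v) / 2, (u - v) / 2
    return phi(x) * phi(y) * 0.5

# N(0, 2) density, for comparison.
def n02(z):
    return math.exp(-z * z / 4) / math.sqrt(4 * math.pi)

u, v = 0.7, -1.2
match = math.isclose(f_UV(u, v), n02(u) * n02(v))
print(match)  # True: (U, V) are independent N(0, 2)
```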

Applications of joint probability density functions


Calculating probabilities

  • Joint PDFs are used to calculate the probability of events involving multiple random variables
  • The probability of an event A is given by the double integral of the joint PDF over the region defined by A
    • P(A) = \iint_A f(x,y) \, dy \, dx
  • Examples include calculating the probability that X and Y fall within a specific rectangle or that X+Y exceeds a certain value
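The second kind of event can be sketched numerically. Assuming f(x, y) = e^{-(x+y)} for x, y ≥ 0 (two independent Exp(1) variables), X + Y follows a Gamma(2, 1) distribution, so P(X + Y > 1) = 2e^{-1} exactly:

```python
import numpy as np

# Hypothetical joint PDF f(x, y) = exp(-(x + y)) on the first quadrant.
# Approximate P(X + Y > 1) by summing the density over the region.
n = 800
mids = np.linspace(0, 20, n, endpoint=False) + 20 / (2 * n)
X, Y = np.meshgrid(mids, mids)
cell = (20 / n) ** 2
prob = np.sum(np.where(X + Y > 1, np.exp(-(X + Y)), 0.0)) * cell

exact = 2 * np.exp(-1)  # X + Y ~ Gamma(2, 1): P(S > 1) = (1 + 1) e^{-1}
print(round(prob, 3), round(exact, 3))
```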

Expectation and variance

  • Joint PDFs are used to calculate the expected values and variances of functions of multiple random variables
  • The expected value of a function g(X,Y) is given by:
    • E[g(X,Y)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x,y) \cdot f(x,y) \, dy \, dx
  • The variance of g(X,Y) is given by:
    • Var[g(X,Y)] = E[g(X,Y)^2] - (E[g(X,Y)])^2
  • These calculations are useful for understanding the behavior and dispersion of functions of random variables
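The expectation integral can be evaluated numerically. A sketch with g(x, y) = xy under the assumed joint PDF f(x, y) = 4xy on the unit square; since X and Y are independent there with E[X] = E[Y] = 2/3, the answer should be 4/9:

```python
import numpy as np

# E[g(X, Y)] = double integral of g(x, y) * f(x, y), approximated by a
# midpoint sum over the unit square, with g(x, y) = x*y and f(x, y) = 4xy.
n = 1000
t = np.linspace(0, 1, n, endpoint=False) + 1 / (2 * n)  # midpoints
X, Y = np.meshgrid(t, t)
cell = (1 / n) ** 2
e_xy = np.sum((X * Y) * (4 * X * Y)) * cell

print(round(e_xy, 4))  # ≈ 0.4444, i.e. 4/9 = E[X] * E[Y]
```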

Examples of joint probability density functions

Bivariate normal distribution

  • The bivariate normal distribution is a joint PDF for two normally distributed random variables X and Y with means \mu_X and \mu_Y, variances \sigma_X^2 and \sigma_Y^2, and correlation coefficient \rho
  • Its joint PDF is given by:
    • f(x,y) = \frac{1}{2\pi\sigma_X\sigma_Y\sqrt{1-\rho^2}} \exp\left(-\frac{1}{2(1-\rho^2)}\left[\frac{(x-\mu_X)^2}{\sigma_X^2} - 2\rho\frac{(x-\mu_X)(y-\mu_Y)}{\sigma_X\sigma_Y} + \frac{(y-\mu_Y)^2}{\sigma_Y^2}\right]\right)
  • The bivariate normal distribution is widely used in various fields, such as finance (stock returns) and physics (particle positions)
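The density formula translates directly into code. A sketch that implements it and checks one property stated earlier: with ρ = 0 the joint PDF factors into the two normal marginals:

```python
import math

# Bivariate normal density, implementing the formula above directly.
def bvn_pdf(x, y, mx, my, sx, sy, rho):
    zx, zy = (x - mx) / sx, (y - my) / sy
    q = (zx**2 - 2 * rho * zx * zy + zy**2) / (1 - rho**2)
    return math.exp(-q / 2) / (2 * math.pi * sx * sy * math.sqrt(1 - rho**2))

# Univariate normal density, for comparison.
def norm_pdf(z, m, s):
    return math.exp(-(((z - m) / s) ** 2) / 2) / (s * math.sqrt(2 * math.pi))

# With rho = 0 the joint density must equal the product of the marginals.
val = bvn_pdf(1.0, -0.5, 0, 0, 1, 2, 0.0)
factors = math.isclose(val, norm_pdf(1.0, 0, 1) * norm_pdf(-0.5, 0, 2))
print(factors)  # True
```

The test point and parameter values are arbitrary; any choice with ρ = 0 factors the same way.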

Uniform distribution over a region

  • A joint PDF can be uniform over a specific region R in the xy-plane, meaning that the probability density is constant within R and zero outside
  • The joint PDF for a uniform distribution over a region R with area A is given by:
    • f(x,y) = \begin{cases} \frac{1}{A}, & (x,y) \in R \\ 0, & (x,y) \notin R \end{cases}
  • Examples include the uniform distribution over a rectangle (product of two independent uniform distributions) or a circle (used in dartboard problems)
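A dartboard-style sketch: for the uniform density over the unit disk (area A = π), the probability of any subregion is just its area divided by π, so a dart lands within radius 1/2 with probability 1/4:

```python
import math

# Uniform density over the unit disk R = {x^2 + y^2 <= 1}, area A = pi.
def f(x, y):
    return 1 / math.pi if x * x + y * y <= 1 else 0.0

# P(hit within radius 1/2) = (area of inner disk) * (constant density 1/A).
r = 0.5
prob = (math.pi * r**2) / math.pi

print(prob)  # 0.25
```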

Visualization of joint probability density functions

Contour plots

  • Contour plots are 2D graphs that show the level curves of a joint PDF in the xy-plane
  • Each level curve represents points (x,y) with the same probability density f(x,y)
  • Contour plots provide a clear view of the shape and concentration of the probability distribution
  • They are particularly useful for visualizing bivariate normal distributions and their correlations
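The level-curve idea can be verified without plotting. For a standard bivariate normal with ρ = 0, the density depends only on x² + y², so every circle about the origin is a contour; a sketch checking that two points on the same circle share the same density:

```python
import math

# Standard bivariate normal density with rho = 0: depends only on x^2 + y^2,
# so its contour-plot level curves are circles centered at the origin.
def f(x, y):
    return math.exp(-(x * x + y * y) / 2) / (2 * math.pi)

# Two distinct points on the circle of radius 1.
p1 = (1.0, 0.0)
p2 = (math.cos(1.0), math.sin(1.0))

same = math.isclose(f(*p1), f(*p2))
print(same)  # True: both points lie on the same level curve
```

With ρ ≠ 0 the circles stretch into tilted ellipses, which is why contour plots make correlation easy to read off.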

3D surface plots

  • 3D surface plots display the joint PDF as a surface in three-dimensional space, with the x and y axes representing the random variables and the z-axis representing the probability density f(x,y)
  • Surface plots give a more intuitive understanding of the overall shape and peaks of the joint PDF
  • They can be rotated and viewed from different angles to better appreciate the distribution's features
  • Surface plots are helpful for comparing different joint PDFs and understanding their relative concentrations of probability mass