
🎲Intro to Probabilistic Methods Unit 6 Review


6.3 Transformations of random variables


Written by the Fiveable Content Team • Last updated August 2025

Transformations of random variables are crucial in probability theory. They allow us to manipulate and analyze random variables in different ways, opening up new possibilities for modeling and problem-solving.

Linear transformations scale and shift random variables, affecting their mean and variance. Non-linear transformations can change the shape of probability distributions entirely, requiring special techniques to derive new distributions from existing ones.

Linear Transformations of Random Variables

Scaling and Shifting Random Variables

  • Linear transformations involve scaling and shifting the original random variable
  • If X is a random variable and a and b are constants, the linear transformation is Y = aX + b
  • Scaling a random variable by a constant factor a changes the spread of the distribution
    • If |a| > 1, the distribution is stretched (wider spread)
    • If |a| < 1, the distribution is compressed (narrower spread)
  • Shifting a random variable by a constant value b changes the location of the distribution
    • Positive b shifts the distribution to the right
    • Negative b shifts the distribution to the left
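As a quick illustration (a minimal sketch using Python's standard library; the sample size, seed, and constants a = 2, b = 5 are arbitrary choices), scaling a sample widens its spread while shifting moves its center:

```python
import random
import statistics

random.seed(0)
x = [random.gauss(0, 1) for _ in range(100_000)]  # X ~ N(0, 1)

a, b = 2.0, 5.0
y = [a * xi + b for xi in x]  # Y = aX + b

# Shifting by b moves the center; scaling by a widens the spread.
print(statistics.mean(y))   # near a*0 + b = 5
print(statistics.stdev(y))  # near |a|*1 = 2
```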

Effects on Mean and Variance

  • The mean of the transformed random variable Y is related to the mean of X by E[Y] = aE[X] + b, where E[X] is the mean of X
    • The constant a scales the mean of X
    • The constant b shifts the mean of X
  • The variance of the transformed random variable Y is related to the variance of X by Var(Y) = a^2 Var(X), where Var(X) is the variance of X
    • The constant a scales the variance of X by a factor of a^2
    • The constant b does not affect the variance
  • Linear transformations preserve the shape of the probability distribution
    • The location (mean) and scale (variance) of the distribution may change
    • Example: If X follows a normal distribution, Y = aX + b will also follow a normal distribution with a different mean and variance
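These identities can be checked exactly on a small discrete distribution (the three-point distribution and the constants a = 3, b = -1 below are hypothetical, chosen only for illustration):

```python
# A discrete random variable X given as (value, probability) pairs.
dist = [(1, 0.2), (2, 0.5), (4, 0.3)]

def mean(d):
    return sum(v * p for v, p in d)

def var(d):
    m = mean(d)
    return sum(p * (v - m) ** 2 for v, p in d)

a, b = 3.0, -1.0
# Y = aX + b transforms each outcome but keeps its probability.
dist_y = [(a * v + b, p) for v, p in dist]

assert abs(mean(dist_y) - (a * mean(dist) + b)) < 1e-12  # E[Y] = aE[X] + b
assert abs(var(dist_y) - a**2 * var(dist)) < 1e-12       # Var(Y) = a^2 Var(X)
```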

Non-linear Transformations of Random Variables

Applying Non-linear Functions to Random Variables

  • Non-linear transformations involve applying a non-linear function to the original random variable
  • If X is a random variable and g(x) is a non-linear function, the transformed random variable is Y = g(X)
  • Examples of non-linear transformations:
    • Exponential function: Y = e^X
    • Logarithmic function: Y = \log(X)
    • Power function: Y = X^n, where n is a constant
  • Non-linear transformations can change the shape of the probability distribution
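For example (a sketch with an arbitrary seed and sample size), exponentiating a symmetric normal sample yields a right-skewed sample: the mean and median of X roughly coincide, but after Y = e^X the mean clearly exceeds the median, evidence that the shape has changed:

```python
import math
import random
import statistics

random.seed(1)
x = [random.gauss(0, 1) for _ in range(50_000)]  # symmetric around 0
y = [math.exp(xi) for xi in x]                   # Y = e^X is right-skewed

# For a symmetric distribution, mean ≈ median; after exponentiating,
# the long right tail pulls the mean well above the median.
print(statistics.mean(x) - statistics.median(x))  # near 0
print(statistics.mean(y) - statistics.median(y))  # clearly positive
```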

Deriving the Probability Distribution of Transformed Variables

  • To find the probability distribution of Y, determine the cumulative distribution function (CDF) of Y, denoted F_Y(y), using the CDF of X, denoted F_X(x)
  • The relationship between the CDFs is F_Y(y) = P(Y \leq y) = P(g(X) \leq y)
  • For monotonically increasing functions g(x), the CDF of Y is F_Y(y) = F_X(g^{-1}(y)), where g^{-1}(y) is the inverse function of g(x)
  • For monotonically decreasing functions g(x), the CDF of Y is F_Y(y) = 1 - F_X(g^{-1}(y))
  • The probability density function (PDF) of Y, denoted f_Y(y), is obtained by differentiating the CDF of Y with respect to y
    • f_Y(y) = \frac{d}{dy} F_Y(y)
    • The PDF of Y can also be expressed in terms of the PDF of X, denoted f_X(x), using the change of variables technique: f_Y(y) = f_X(g^{-1}(y)) \cdot \left| \frac{d}{dy} g^{-1}(y) \right|
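As a concrete check (a sketch; the choice X ~ Exp(1) with g(x) = e^x is purely illustrative): g is monotonically increasing with g^{-1}(y) = ln y, so F_Y(y) = F_X(ln y) = 1 - e^{-ln y} = 1 - 1/y for y ≥ 1, which a simulation reproduces:

```python
import math
import random

random.seed(2)
n = 100_000
x = [random.expovariate(1.0) for _ in range(n)]  # X ~ Exp(1), f_X(x) = e^{-x}
y = [math.exp(xi) for xi in x]                   # Y = e^X, g monotone increasing

# Change of variables: g^{-1}(y) = ln y, |d/dy g^{-1}(y)| = 1/y,
# so f_Y(y) = e^{-ln y} * (1/y) = 1/y^2 and F_Y(y) = 1 - 1/y for y >= 1.
for t in (1.5, 2.0, 5.0):
    empirical = sum(yi <= t for yi in y) / n
    theoretical = 1 - 1 / t
    print(t, empirical, theoretical)  # empirical close to theoretical
```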

Jacobian Determinant in Multivariate Transformations

Definition and Role of the Jacobian Determinant

  • The Jacobian matrix is a matrix of partial derivatives used when transforming multivariate random variables; its determinant enters the change of variables formula
  • Given a vector of random variables X = (X_1, X_2, ..., X_n) and a vector of transformed variables Y = (Y_1, Y_2, ..., Y_n), where each Y_i is a function of X_1, X_2, ..., X_n, the Jacobian matrix J is defined by J_{ij} = \frac{\partial Y_i}{\partial X_j}
  • The Jacobian determinant, denoted |J|, is the absolute value of the determinant of the Jacobian matrix J
  • The Jacobian determinant represents the volume change factor when transforming from the X-space to the Y-space
    • If |J| > 1, the volume expands during the transformation
    • If |J| < 1, the volume contracts during the transformation

Calculating the Jacobian Determinant

  • To calculate the Jacobian determinant, first determine the Jacobian matrix JJ by finding the partial derivatives of each transformed variable YiY_i with respect to each original variable XjX_j
  • Arrange the partial derivatives in a square matrix, with each row corresponding to a transformed variable and each column corresponding to an original variable
  • Calculate the determinant of the Jacobian matrix using standard techniques (e.g., cofactor (Laplace) expansion or Gaussian elimination)
  • Take the absolute value of the determinant to obtain the Jacobian determinant J|J|
  • Example: For a transformation from polar coordinates (R, \Theta) to Cartesian coordinates (X, Y), where X = R \cos(\Theta) and Y = R \sin(\Theta), the Jacobian matrix is J = \begin{bmatrix} \cos(\Theta) & -R \sin(\Theta) \\ \sin(\Theta) & R \cos(\Theta) \end{bmatrix}, and the Jacobian determinant is |J| = R
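The polar-coordinate example can be checked numerically (a sketch using central finite differences at an arbitrarily chosen point (r, θ) = (2.5, 0.7)):

```python
import math

def transform(r, theta):
    # Polar -> Cartesian: X = R cos(Theta), Y = R sin(Theta)
    return (r * math.cos(theta), r * math.sin(theta))

def jacobian_det(r, theta, h=1e-6):
    # Approximate each column of the Jacobian by a central difference:
    # d(x, y)/dr and d(x, y)/dtheta.
    col_r = [(a - b) / (2 * h)
             for a, b in zip(transform(r + h, theta), transform(r - h, theta))]
    col_t = [(a - b) / (2 * h)
             for a, b in zip(transform(r, theta + h), transform(r, theta - h))]
    # |det| of the 2x2 matrix [[dx/dr, dx/dtheta], [dy/dr, dy/dtheta]]
    return abs(col_r[0] * col_t[1] - col_t[0] * col_r[1])

r, theta = 2.5, 0.7
print(jacobian_det(r, theta))  # close to r = 2.5
```

Analytically the determinant is r cos^2(θ) + r sin^2(θ) = r, which the finite-difference value matches to several decimal places.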

Joint Distribution of Transformed Variables

Multivariate Change of Variables Technique

  • The multivariate change of variables technique is used to find the joint probability density function (PDF) of transformed random variables
  • Given a vector of random variables X = (X_1, X_2, ..., X_n) with joint PDF f_X(x_1, x_2, ..., x_n) and a vector of transformed variables Y = (Y_1, Y_2, ..., Y_n), the joint PDF of Y, denoted f_Y(y_1, y_2, ..., y_n), is given by f_Y(y_1, y_2, ..., y_n) = f_X(x_1, x_2, ..., x_n) \cdot |J|, where each x_i is expressed in terms of (y_1, y_2, ..., y_n) via the inverse transformation and |J| is the absolute value of the Jacobian determinant of that inverse transformation (entries \frac{\partial x_i}{\partial y_j})
  • The multivariate change of variables technique requires the transformation to be one-to-one and the inverse transformation to be differentiable

Steps to Apply the Multivariate Change of Variables Technique

  1. Express the original variables (X_1, X_2, ..., X_n) in terms of the transformed variables (Y_1, Y_2, ..., Y_n)
  2. Calculate the Jacobian matrix JJ by finding the partial derivatives of each original variable with respect to each transformed variable
  3. Calculate the Jacobian determinant J|J| by taking the absolute value of the determinant of the Jacobian matrix
  4. Substitute the expressions for the original variables and the Jacobian determinant into the joint PDF of XX
  5. Simplify the resulting expression to obtain the joint PDF of YY
  • Example: For a transformation from Cartesian coordinates (X, Y) to polar coordinates (R, \Theta), where R = \sqrt{X^2 + Y^2} and \Theta = \arctan(Y/X), the joint PDF of (R, \Theta) is f_{R,\Theta}(r, \theta) = f_{X,Y}(r \cos(\theta), r \sin(\theta)) \cdot r, where f_{X,Y}(x, y) is the joint PDF of (X, Y)
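To sanity-check the example (a sketch using a standard bivariate normal as the hypothetical input distribution): the formula gives f_{R,Θ}(r, θ) = (1/(2π)) r e^{-r²/2}, so R follows a Rayleigh distribution with E[R] = √(π/2) while Θ is uniform, and a simulation agrees:

```python
import math
import random

random.seed(3)
n = 100_000
r_samples = []
for _ in range(n):
    x, y = random.gauss(0, 1), random.gauss(0, 1)  # standard bivariate normal
    r_samples.append(math.hypot(x, y))             # R = sqrt(X^2 + Y^2)

# f_{R,Theta}(r, theta) = (1/(2*pi)) * r * exp(-r^2/2) factors into a
# Rayleigh density in r times a uniform density in theta; the Rayleigh
# mean is sqrt(pi/2) ≈ 1.253.
print(sum(r_samples) / n)  # near sqrt(pi/2)
```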