Transformations of random variables
Transforming a random variable means applying a function to it, producing a new random variable with a different distribution. This is one of the core skills in stochastic processes: if you know the distribution of X, and you define Y = g(X), you need to figure out the distribution of Y. The main techniques for doing this are the CDF method, the MGF method, and, for sums, convolution.
Functions of Random Variables
Discrete vs. continuous functions
The approach you take depends on whether you're working with discrete or continuous random variables.
- Discrete case: If X takes values in a countable set, then Y = g(X) is also discrete. You find its PMF by collecting all input values x that map to the same output y and summing their probabilities.
- Continuous case: If X has a PDF, then Y = g(X) is typically continuous (though not always). You'll use the CDF technique or the change-of-variables formula to find the PDF of Y.
Probability distribution of functions
For discrete random variables, the PMF of Y = g(X) is:

p_Y(y) = Σ_{x : g(x) = y} p_X(x)

You're grouping together every value x that lands on the same y, then adding up their probabilities.
For continuous random variables, you generally can't just "plug in" to the PDF. Instead, you work through the CDF first (described next) or use the MGF approach.
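As a minimal sketch of the discrete grouping rule, here is a small Python example; the particular distribution (X uniform on {−2, −1, 0, 1, 2}) and the map Y = X² are illustrative assumptions, not from the text:

```python
from collections import defaultdict

# Hypothetical example: X uniform on {-2, -1, 0, 1, 2}, Y = X^2.
pmf_x = {x: 0.2 for x in (-2, -1, 0, 1, 2)}

def pmf_of_g(pmf_x, g):
    """PMF of Y = g(X): group x-values that map to the same y and sum."""
    pmf_y = defaultdict(float)
    for x, p in pmf_x.items():
        pmf_y[g(x)] += p
    return dict(pmf_y)

pmf_y = pmf_of_g(pmf_x, lambda x: x * x)
```

Note how x = 1 and x = −1 collapse into the single output y = 1, so their probabilities add: pmf_y comes out to {4: 0.4, 1: 0.4, 0: 0.2}.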
Cumulative Distribution Function Technique
Deriving CDFs from transformations
The CDF method is the most general approach and works for virtually any transformation. Here's the procedure:
- Start with Y = g(X) and write the CDF definition: F_Y(y) = P(Y ≤ y) = P(g(X) ≤ y).
- Manipulate the inequality to isolate X. For example, if g is strictly increasing, this becomes P(X ≤ g⁻¹(y)).
- Express the result in terms of F_X, the CDF of X: here, F_Y(y) = F_X(g⁻¹(y)).
If g is strictly decreasing, the inequality flips: F_Y(y) = P(X ≥ g⁻¹(y)) = 1 − F_X(g⁻¹(y)) for continuous X.
When g is not monotone (e.g., g(x) = x²), you need to split into cases and account for all regions of x that satisfy g(x) ≤ y.
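To make the non-monotone case concrete, here is a sketch for Y = X² with X standard normal, where the event {X² ≤ y} splits into the region −√y ≤ X ≤ √y; the Monte Carlo cross-check, its seed, and its sample size are arbitrary choices:

```python
import math
import random

def std_normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def cdf_y(y):
    """CDF of Y = X^2 with X ~ N(0, 1): the non-monotone map splits
    {X^2 <= y} into the region -sqrt(y) <= X <= sqrt(y)."""
    if y <= 0.0:
        return 0.0
    r = math.sqrt(y)
    return std_normal_cdf(r) - std_normal_cdf(-r)

# Monte Carlo cross-check (seed chosen arbitrarily for reproducibility)
rng = random.Random(0)
n = 100_000
empirical = sum(rng.gauss(0.0, 1.0) ** 2 <= 1.0 for _ in range(n)) / n
```

At y = 1 this recovers the familiar 68.27% of a standard normal within one unit of zero, and the simulated fraction should agree to within sampling error.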
Inverting CDFs to find distributions
Once you have F_Y(y), differentiate with respect to y to get the PDF:

f_Y(y) = d/dy F_Y(y)

For a monotone, differentiable transformation Y = g(X) with inverse x = g⁻¹(y), this yields the change-of-variables formula:

f_Y(y) = f_X(g⁻¹(y)) · |d g⁻¹(y)/dy|
The absolute value accounts for both increasing and decreasing transformations. This single formula handles most one-variable continuous problems you'll encounter.
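A quick numerical sanity check of the change-of-variables formula, using the assumed example Y = e^X with X standard normal (so g⁻¹(y) = ln y and |d g⁻¹/dy| = 1/y), compared against a finite-difference derivative of the CDF:

```python
import math

def f_x(x):
    """Standard normal PDF."""
    return math.exp(-x * x / 2.0) / math.sqrt(2.0 * math.pi)

def F_x(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def f_y_change_of_vars(y):
    """PDF of Y = exp(X) via f_Y(y) = f_X(g^{-1}(y)) * |d g^{-1}/dy|,
    with g^{-1}(y) = ln(y) and derivative 1/y."""
    return f_x(math.log(y)) / y

def f_y_numeric(y, h=1e-6):
    """Numerical derivative of F_Y(y) = F_X(ln y), as an independent check."""
    return (F_x(math.log(y + h)) - F_x(math.log(y - h))) / (2.0 * h)
```

The two routes agree to numerical precision, which is exactly the statement that differentiating the CDF reproduces the change-of-variables PDF.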
Moment-Generating Function Technique
Uniqueness of moment-generating functions
The MGF of a random variable X is M_X(t) = E[e^{tX}], defined for t in some neighborhood of zero. The key property: if two random variables have the same MGF in a neighborhood of zero, they have the same distribution. This uniqueness theorem is what makes the MGF method work.
Finding distributions using MGFs
The strategy is to compute the MGF of the transformed variable and then recognize it as belonging to a known distribution family.
- Define Y = g(X) and write M_Y(t) = E[e^{tY}] = E[e^{t·g(X)}].
- Evaluate this expectation using the distribution of X.
- If the resulting expression matches the MGF of a known distribution (normal, gamma, Poisson, etc.), you've identified the distribution of Y.
Example: If X ~ N(μ, σ²) and Y = aX + b, then M_Y(t) = e^{bt} M_X(at) = exp((aμ + b)t + a²σ²t²/2). This is the MGF of N(aμ + b, a²σ²), confirming that a linear transformation of a normal is still normal.
The MGF method is especially powerful for sums of independent random variables, since M_{X+Y}(t) = M_X(t) M_Y(t) when X and Y are independent.
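The normal example above can be checked numerically. The specific parameter values below (a = 3, b = 2, X ~ N(1, 4)) are arbitrary illustrations:

```python
import math

def mgf_normal(t, mu, sigma2):
    """MGF of N(mu, sigma2): exp(mu*t + sigma2*t^2 / 2)."""
    return math.exp(mu * t + sigma2 * t * t / 2.0)

def mgf_linear(t, a, b, mu, sigma2):
    """MGF of Y = aX + b for X ~ N(mu, sigma2):
    E[exp(t*(aX + b))] = exp(b*t) * M_X(a*t)."""
    return math.exp(b * t) * mgf_normal(a * t, mu, sigma2)

# Y = 3X + 2 with X ~ N(1, 4) should have the MGF of N(3*1 + 2, 9*4) = N(5, 36)
a, b, mu, s2 = 3.0, 2.0, 1.0, 4.0
match = all(
    abs(mgf_linear(t, a, b, mu, s2) - mgf_normal(t, a * mu + b, a * a * s2))
    < 1e-9 * mgf_normal(t, a * mu + b, a * a * s2)
    for t in (-0.5, 0.1, 0.7)
)
```

Because the two expressions agree at every t in a neighborhood of zero, the uniqueness theorem identifies Y as N(aμ + b, a²σ²).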
Convolutions of Independent Random Variables
Sums of independent random variables
When X and Y are independent and you want the distribution of Z = X + Y, the result is a convolution.
- Continuous case: f_Z(z) = ∫ f_X(x) f_Y(z − x) dx, integrating over all x.
- Discrete case: p_Z(z) = Σ_x p_X(x) p_Y(z − x)
You're summing (or integrating) over all ways the two variables can combine to give the total z. In practice, the MGF method is often faster for sums: compute M_X(t) M_Y(t) and recognize the result.
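The discrete convolution sum can be sketched directly; the two fair dice used here are an assumed example:

```python
from collections import defaultdict

def convolve_pmfs(p, q):
    """PMF of Z = X + Y for independent discrete X, Y:
    p_Z(z) = sum over x of p_X(x) * p_Y(z - x)."""
    r = defaultdict(float)
    for x, px in p.items():
        for y, qy in q.items():
            r[x + y] += px * qy
    return dict(r)

die = {k: 1.0 / 6.0 for k in range(1, 7)}  # fair six-sided die
two_dice = convolve_pmfs(die, die)         # distribution of the sum
```

As expected, two_dice[7] comes out to 6/36 = 1/6, the familiar mode of the two-dice total, and two_dice[2] to 1/36.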
Products of independent random variables
For the product of two independent continuous random variables, the PDF can be derived using a change of variables. One standard approach:
- Define Z = XY and W = X (an auxiliary variable).
- Compute the joint PDF of (Z, W) using the Jacobian.
- Integrate out W to get the marginal PDF of Z.
Note: unlike sums, the MGF of a product is not simply the product of the individual MGFs. The factoring property applies only to sums.
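As an illustration of the product recipe, for X and Y independent Uniform(0, 1) the auxiliary-variable steps give f_Z(z) = −ln z on (0, 1), hence F_Z(z) = z − z ln z. The Monte Carlo cross-check below is a hedged sketch (seed and sample size are arbitrary):

```python
import math
import random

def product_cdf_uniform(z):
    """CDF of Z = X*Y for independent Uniform(0, 1) X, Y. Integrating
    out the auxiliary variable W = X gives f_Z(z) = -ln(z) on (0, 1),
    hence F_Z(z) = z - z*ln(z)."""
    if z <= 0.0:
        return 0.0
    if z >= 1.0:
        return 1.0
    return z - z * math.log(z)

# Monte Carlo cross-check (arbitrary seed and sample size)
rng = random.Random(42)
n = 200_000
empirical = sum(rng.random() * rng.random() <= 0.5 for _ in range(n)) / n
```

The simulated fraction of products below 0.5 should match F_Z(0.5) = 0.5 + 0.5 ln 2 ≈ 0.847 to within sampling error.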

Transformations of Multiple Random Variables
Joint cumulative distribution functions
For a vector of random variables (X_1, …, X_n), the joint CDF is:

F(x_1, …, x_n) = P(X_1 ≤ x_1, …, X_n ≤ x_n)

When you apply a transformation Y_i = g_i(X_1, …, X_n), you can use the multivariate CDF method: express events about the Y_i in terms of the X_i and use the joint distribution of (X_1, …, X_n).
Jacobian matrix for transformations
For an invertible transformation Y = g(X) of a continuous random vector, the multivariate change-of-variables formula is:

f_Y(y) = f_X(x) · |det J|⁻¹

where J is the Jacobian matrix with entries J_ij = ∂y_i/∂x_j, and x is expressed in terms of y via the inverse transformation.
Equivalently, if you write the inverse transformation x = h(y) and define the Jacobian of the inverse, J_h with entries ∂x_i/∂y_j, you get f_Y(y) = f_X(h(y)) · |det J_h| directly. Either way, the determinant corrects for how the transformation stretches or compresses volume in probability space.
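A classic instance of a multivariate transformation is the Box–Muller map, which sends two independent Uniform(0, 1) variables to two independent standard normals; the Jacobian calculation for its inverse yields exactly the product of two N(0, 1) densities. The simulation check of the output moments below uses an arbitrary seed and sample size:

```python
import math
import random

def box_muller(u1, u2):
    """Map (U1, U2) ~ Uniform(0,1)^2 to two independent N(0, 1) values.
    The Jacobian of the inverse transformation works out to the product
    of two standard normal densities, which is why this map works."""
    r = math.sqrt(-2.0 * math.log(u1))
    theta = 2.0 * math.pi * u2
    return r * math.cos(theta), r * math.sin(theta)

# Simulation check of the output moments (arbitrary seed)
rng = random.Random(1)
zs = []
for _ in range(100_000):
    z1, z2 = box_muller(1.0 - rng.random(), rng.random())  # 1 - u keeps u1 > 0
    zs.append(z1)
    zs.append(z2)
mean = sum(zs) / len(zs)
var = sum(z * z for z in zs) / len(zs) - mean * mean
```

The sample mean should be near 0 and the sample variance near 1, consistent with the claimed N(0, 1) output.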
Common Transformations and Distributions
Linear transformations
For Y = aX + b (with a ≠ 0):
- The PDF transforms as: f_Y(y) = (1/|a|) f_X((y − b)/a)
Linear transformations preserve distribution families in many cases. Normals stay normal, and Cauchy random variables stay Cauchy, for instance.
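A direct pointwise check of the linear-transformation PDF formula, using the assumed example Y = −2X + 3 with X standard normal (so Y ~ N(3, 4)):

```python
import math

def normal_pdf(x, mu, sigma):
    """PDF of N(mu, sigma^2)."""
    z = (x - mu) / sigma
    return math.exp(-z * z / 2.0) / (sigma * math.sqrt(2.0 * math.pi))

def linear_transform_pdf(y, a, b, f_x):
    """f_Y(y) = (1/|a|) * f_X((y - b)/a) for Y = aX + b, a != 0."""
    return f_x((y - b) / a) / abs(a)

# Y = -2X + 3 with X ~ N(0, 1) should match N(3, 4) pointwise
lhs = linear_transform_pdf(1.7, -2.0, 3.0, lambda x: normal_pdf(x, 0.0, 1.0))
rhs = normal_pdf(1.7, 3.0, 2.0)
```

Note the absolute value in 1/|a|: it is what makes the formula work for the negative slope a = −2 here.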
Exponential and logarithmic transformations
Exponential: If Y = e^X, apply the CDF method. Since e^x is strictly increasing:

F_Y(y) = P(e^X ≤ y) = P(X ≤ ln y) = F_X(ln y) for y > 0

A classic application: if X ~ N(μ, σ²), then Y = e^X follows a lognormal distribution.
Logarithmic: If Y = ln X for X > 0, then:

F_Y(y) = P(ln X ≤ y) = P(X ≤ e^y) = F_X(e^y)
These transformations are useful for converting multiplicative relationships into additive ones.
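The lognormal CDF above can be coded directly from the CDF method; the parameter values used in the comments are illustrative:

```python
import math

def std_normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def lognormal_cdf(y, mu, sigma):
    """CDF of Y = exp(X) with X ~ N(mu, sigma^2):
    P(exp(X) <= y) = P(X <= ln y) = Phi((ln y - mu) / sigma)."""
    if y <= 0.0:
        return 0.0
    return std_normal_cdf((math.log(y) - mu) / sigma)
```

One immediate consequence: the median of the lognormal is e^μ, since F_Y(e^μ) = Φ(0) = 1/2.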
Normal to standard normal transformation
Any normal random variable X ~ N(μ, σ²) can be standardized:

Z = (X − μ)/σ

This gives Z ~ N(0, 1). The transformation lets you use standard normal tables or software to compute probabilities for any normal distribution. It's a special case of the linear transformation with a = 1/σ and b = −μ/σ.
Chi-square and gamma distributions
If Z_1, …, Z_n are independent N(0, 1) variables, then:

Z_1² + Z_2² + ⋯ + Z_n² ~ χ²(n)

The chi-square distribution with k degrees of freedom is actually a special case of the gamma distribution: χ²(k) = Gamma(k/2, 1/2) (using the rate parameterization).
More generally, the gamma family is closed under summation of independent variables: if X_1, …, X_n are independent with X_i ~ Gamma(α_i, λ) and the same rate λ, then X_1 + ⋯ + X_n ~ Gamma(α_1 + ⋯ + α_n, λ). This is easy to verify using MGFs.
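The MGF verification mentioned above is nearly a one-liner: the gamma MGF (rate parameterization) is M(t) = (λ/(λ − t))^α for t < λ, so the product of two same-rate gamma MGFs is again a gamma MGF with summed shape. The parameter values below are arbitrary:

```python
def gamma_mgf(t, alpha, lam):
    """MGF of Gamma(alpha, rate lam), valid for t < lam:
    M(t) = (lam / (lam - t)) ** alpha."""
    assert t < lam, "MGF only defined for t < rate"
    return (lam / (lam - t)) ** alpha

# Sum of independent Gamma(a1, lam) and Gamma(a2, lam) (same rate lam):
# M_1(t) * M_2(t) is the MGF of Gamma(a1 + a2, lam)
a1, a2, lam, t = 1.5, 2.0, 3.0, 0.8
lhs = gamma_mgf(t, a1, lam) * gamma_mgf(t, a2, lam)
rhs = gamma_mgf(t, a1 + a2, lam)
```

Since (λ/(λ − t))^{α₁} · (λ/(λ − t))^{α₂} = (λ/(λ − t))^{α₁+α₂}, the two sides agree exactly (up to floating-point rounding), and uniqueness of MGFs gives the closure property.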
Applications of Transformations
Signal processing and filtering
In signal processing, random signals pass through systems (filters) that transform their distributions. If the input to a linear time-invariant system is a random process, the output distribution depends on the system's transfer function. Fourier and Laplace transforms are used to move between time and frequency domains, simplifying the analysis of how noise and signals interact.
Reliability analysis and failure rates
Reliability engineering models component lifetimes as random variables. The exponential distribution models constant failure rates (memoryless property), while the Weibull distribution handles increasing or decreasing failure rates. A logarithmic transformation of Weibull data linearizes the survival function, making it easier to estimate parameters from observed failure data.
Stochastic modeling in physics and engineering
Transformations underpin many physical models. Brownian motion (particle diffusion) involves Gaussian random variables whose distributions evolve over time. Birth-death processes use transformations to derive steady-state distributions. In each case, knowing how to transform distributions lets you move from a simple model to the quantities you actually care about, like hitting times, equilibrium concentrations, or system reliability.