Fiveable

🎲Intro to Probabilistic Methods Unit 6 Review

6.2 Moment-generating functions and characteristic functions

Written by the Fiveable Content Team • Last updated August 2025

Moment-generating functions and characteristic functions are powerful tools for analyzing random variables. They help us understand a variable's distribution, calculate moments, and work with sums of independent variables. These functions are especially useful when dealing with complex probability distributions.

These concepts build on earlier topics in the chapter about functions of random variables. They provide alternative ways to describe and analyze random variables, offering insights that may not be immediately apparent from the probability distribution alone.

Moment-generating functions

Definition and computation

  • The moment-generating function (MGF) of a random variable X is defined as $M_X(t) = E[e^{tX}]$, where $E[\cdot]$ denotes the expected value and $t$ is a real number
  • For a discrete random variable X with probability mass function $p(x)$, the MGF is given by $M_X(t) = \sum_x e^{tx} p(x)$, where the sum is taken over all possible values of X (coin flips, die rolls)
  • For a continuous random variable X with probability density function $f(x)$, the MGF is given by $M_X(t) = \int_{-\infty}^{\infty} e^{tx} f(x)\,dx$, where the integral is taken over the entire range of X (heights, weights)
  • The MGF of a random variable uniquely determines its probability distribution, provided that the MGF exists in a neighborhood around $t = 0$
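The definition $M_X(t) = E[e^{tX}]$ can be checked numerically. A minimal sketch, using an Exponential variable with rate 2 (whose MGF is the standard result $\lambda/(\lambda - t)$ for $t < \lambda$) and comparing a Monte Carlo estimate of $E[e^{tX}]$ against that closed form:

```python
import numpy as np

# Monte Carlo check of the MGF of an Exponential(rate=2) variable.
# Standard result: M_X(t) = rate / (rate - t) for t < rate.
rng = np.random.default_rng(0)
rate = 2.0
samples = rng.exponential(scale=1.0 / rate, size=200_000)

for t in (0.0, 0.4, 0.8):
    mc = np.mean(np.exp(t * samples))   # E[e^{tX}] estimated from samples
    exact = rate / (rate - t)           # known MGF of the exponential
    print(f"t={t}: monte carlo={mc:.4f}, exact={exact:.4f}")
```

Note that $t$ is kept well below the rate: for $t \geq \lambda$ the expectation diverges, which is exactly the "MGF may not exist" caveat above.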

Moments and cumulants

  • The n-th moment of a random variable X is defined as $E[X^n]$ and can be obtained by differentiating the MGF n times and evaluating at $t = 0$: $E[X^n] = M_X^{(n)}(0)$, where $M_X^{(n)}(t)$ denotes the n-th derivative of $M_X(t)$ with respect to $t$
    • The first moment, $E[X]$, is the mean of the random variable
    • The second moment, $E[X^2]$, is related to the variance by $\mathrm{Var}(X) = E[X^2] - (E[X])^2$
  • The n-th cumulant of a random variable X, denoted $\kappa_n$, is defined as the n-th derivative of the logarithm of the MGF evaluated at $t = 0$: $\kappa_n = \left.\frac{d^n}{dt^n} \log M_X(t)\right|_{t=0}$
    • Cumulants are related to moments, with the first cumulant being the mean and the second cumulant being the variance
    • Higher-order cumulants provide information about the shape of the distribution, such as skewness ($\kappa_3$) and kurtosis ($\kappa_4$)
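The moment and cumulant recipes above can be carried out symbolically. A short sketch using the MGF of a Normal($\mu$, $\sigma^2$) variable, $\exp(\mu t + \sigma^2 t^2/2)$ (a standard result): differentiating gives the moments, and differentiating the log-MGF gives the cumulants.

```python
import sympy as sp

t = sp.symbols("t")
mu = sp.symbols("mu", real=True)
sigma = sp.symbols("sigma", positive=True)

# MGF of Normal(mu, sigma^2) -- a standard result
M = sp.exp(mu * t + sigma**2 * t**2 / 2)

# n-th moment: E[X^n] = M^(n)(0)
first_moment = sp.diff(M, t, 1).subs(t, 0)    # mean: mu
second_moment = sp.diff(M, t, 2).subs(t, 0)   # mu**2 + sigma**2

# n-th cumulant: d^n/dt^n log M(t) at t = 0
K = sp.log(M)
kappa1 = sp.diff(K, t, 1).subs(t, 0)          # first cumulant = mean
kappa2 = sp.diff(K, t, 2).subs(t, 0)          # second cumulant = variance

print(sp.simplify(first_moment), sp.simplify(second_moment))
print(sp.simplify(kappa1), sp.simplify(kappa2))
```

Subtracting $(E[X])^2$ from the second moment recovers $\sigma^2$, matching the second cumulant, as the bullet points above state.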

Applications of moment-generating functions

Uniqueness and distribution identification

  • The MGF of a random variable uniquely determines its probability distribution, provided that the MGF exists in a neighborhood around $t = 0$
    • If two random variables have the same MGF, they must have the same probability distribution
  • In some cases, the MGF may be easily recognizable as belonging to a known distribution, allowing for immediate identification of the probability distribution without explicit derivation (exponential, normal)

Properties of characteristic functions

Definition and existence

  • The characteristic function (CF) of a random variable X is defined as $\phi_X(t) = E[e^{itX}]$, where $i$ is the imaginary unit and $t$ is a real number
  • The CF always exists for any random variable, unlike the MGF, which may not exist for some distributions (Cauchy distribution)
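The Cauchy case illustrates the contrast: $E[e^{tX}]$ diverges for every $t \neq 0$, but $e^{itX}$ has modulus 1, so its expectation always exists. A minimal sketch, comparing a Monte Carlo estimate of the standard Cauchy CF against the known closed form $e^{-|t|}$:

```python
import numpy as np

# The standard Cauchy distribution has no MGF, but its CF exists and
# equals exp(-|t|) (a standard result). Because |e^{itX}| = 1, the
# Monte Carlo average is well behaved even though X has no mean.
rng = np.random.default_rng(0)
x = rng.standard_cauchy(size=500_000)

for t in (0.5, 1.0, 2.0):
    cf_mc = np.mean(np.exp(1j * t * x))   # estimated E[e^{itX}]
    print(f"t={t}: |estimate|={abs(cf_mc):.3f}, exp(-|t|)={np.exp(-abs(t)):.3f}")
```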

Properties and applications

  • The CF uniquely determines the probability distribution of a random variable, and the probability density function (if it exists) can be obtained by taking the inverse Fourier transform of the CF
  • CFs have several properties:
    • $\phi_X(0) = 1$
    • $|\phi_X(t)| \leq 1$
    • $\phi_X(-t) = \phi_X^*(t)$, where $*$ denotes the complex conjugate
  • CFs are particularly useful for studying the sum of independent random variables, as the CF of the sum is the product of the individual CFs
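The listed properties, and the product rule for sums, can be verified on the empirical CF of simulated data. A sketch using standard normal samples (whose exact CF is the standard result $e^{-t^2/2}$, so the sum of two independent copies has CF $e^{-t^2}$):

```python
import numpy as np

rng = np.random.default_rng(1)

def ecf(samples, t):
    """Empirical characteristic function: the sample mean of e^{itX}."""
    return np.mean(np.exp(1j * t * samples))

x = rng.standard_normal(300_000)
y = rng.standard_normal(300_000)
t = 0.7

# phi(0) = 1 and |phi(t)| <= 1 hold for the empirical CF as well
assert abs(ecf(x, 0.0) - 1.0) < 1e-12
assert abs(ecf(x, t)) <= 1.0

# phi(-t) is the complex conjugate of phi(t)
assert abs(ecf(x, -t) - np.conj(ecf(x, t))) < 1e-12

# For independent X, Y the CF of X + Y is the product of the CFs;
# here exp(-t^2/2) * exp(-t^2/2) = exp(-t^2), the CF of Normal(0, 2)
print(abs(ecf(x + y, t)), np.exp(-t**2))
```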

Distribution derivation using functions

Moment-generating functions

  • To derive the probability distribution from the MGF, one can use techniques such as partial fraction decomposition or power series expansion
    • For discrete random variables, the probability mass function can be obtained by expanding the MGF as a power series and identifying the coefficients of the series as the probabilities (Poisson distribution)
    • For continuous random variables, the probability density function can be obtained by inverting the MGF using techniques like the inverse Laplace transform or the Bromwich integral (gamma distribution)
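The power-series idea can be made concrete for the Poisson case mentioned above. Its MGF is the standard result $\exp(\lambda(e^t - 1))$; substituting $z = e^t$ turns it into the probability generating function $\sum_k p_k z^k$, whose series coefficients are the probabilities $e^{-\lambda}\lambda^k/k!$. A short sketch:

```python
import sympy as sp

lam = sp.symbols("lam", positive=True)
z = sp.symbols("z")

# Poisson MGF with z = e^t substituted: the probability generating function
G = sp.exp(lam * (z - 1))

# Expand as a power series in z; the coefficient of z^k is P(X = k)
series = sp.series(G, z, 0, 5).removeO()
for k in range(5):
    p_k = sp.simplify(series.coeff(z, k))
    print(k, p_k)   # e^{-lam} * lam^k / k!
```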

Characteristic functions

  • To derive the probability distribution from the CF, one can use the inverse Fourier transform
    • For continuous random variables, the probability density function can be obtained by taking the inverse Fourier transform of the CF: $f(x) = \frac{1}{2\pi} \int_{-\infty}^{\infty} e^{-itx} \phi_X(t)\,dt$
    • This technique is particularly useful for distributions that do not have a closed-form expression for their probability density function (stable distributions)
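The inversion integral can be evaluated numerically even when no symbolic inverse is at hand. A minimal sketch, recovering the standard normal pdf from its CF $e^{-t^2/2}$ by discretizing the integral on a wide grid (the grid limits and step are arbitrary choices for this illustration):

```python
import numpy as np

# Inverse Fourier transform of the standard normal CF, exp(-t^2/2),
# via f(x) = (1/2pi) * integral of e^{-itx} * phi(t) dt.
t = np.linspace(-30.0, 30.0, 20001)   # wide grid; phi is negligible beyond |t| ~ 6
dt = t[1] - t[0]
phi = np.exp(-t**2 / 2)

def pdf_from_cf(x):
    integrand = np.exp(-1j * t * x) * phi
    return (integrand.sum() * dt).real / (2 * np.pi)

for x in (0.0, 1.0):
    exact = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)
    print(f"x={x}: inverted={pdf_from_cf(x):.5f}, exact={exact:.5f}")
```

The same routine would work for a stable law such as the Cauchy, whose CF $e^{-|t|}$ is simple even though direct pdf manipulations are not.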