
🎲 Intro to Probability Unit 7 – Expectation and Variance of Random Variables

Expectation and variance are fundamental concepts in probability theory, providing insights into the average behavior and spread of random variables. These tools allow us to analyze and predict outcomes in uncertain situations, from simple coin tosses to complex financial models. Understanding expectation and variance is crucial for making informed decisions in various fields. These concepts form the foundation for more advanced statistical techniques, helping us quantify risk, estimate probabilities, and draw meaningful conclusions from data in real-world applications.

Key Concepts and Definitions

  • A random variable is a function that maps outcomes of a random experiment to real numbers
  • Expectation (mean) of a random variable $X$, denoted $E(X)$, is the average value of the variable over many trials
  • Variance of a random variable $X$, denoted $\mathrm{Var}(X)$ or $\sigma^2$, measures the average squared deviation from the mean
    • Formula for variance: $\mathrm{Var}(X) = E[(X - E(X))^2]$
  • Standard deviation $\sigma$ is the square root of the variance and has the same units as the random variable
  • Moment generating function (MGF) of a random variable $X$ is defined as $M_X(t) = E(e^{tX})$
    • When it exists in an open interval around $t = 0$, the MGF uniquely determines the distribution of a random variable
  • Probability mass function (PMF) for a discrete random variable $X$ gives the probability of each possible value
  • Probability density function (PDF) for a continuous random variable $X$ describes the relative likelihood of the variable taking on a specific value (these definitions are illustrated numerically below)
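
To see these definitions in action, here is a minimal pure-Python sketch for a small, made-up discrete random variable; the particular values and probabilities are illustrative only.

```python
import math

# A hypothetical discrete random variable given by its PMF: value -> probability.
pmf = {0: 0.2, 1: 0.5, 2: 0.3}
assert abs(sum(pmf.values()) - 1.0) < 1e-12   # a PMF must sum to 1

# Expectation: E(X) = sum over x of x * p(x)
mean = sum(x * p for x, p in pmf.items())

# Variance: Var(X) = E[(X - E(X))^2]; standard deviation is its square root
variance = sum((x - mean) ** 2 * p for x, p in pmf.items())
std_dev = math.sqrt(variance)

# MGF evaluated at a single point t: M_X(t) = E(e^{tX})
t = 0.5
mgf_at_t = sum(math.exp(t * x) * p for x, p in pmf.items())

print(mean, variance, std_dev, mgf_at_t)
```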

Understanding Expectation

  • Expectation represents the long-run average value of a random variable over many independent trials
  • For a discrete random variable $X$ with PMF $p(x)$, the expectation is calculated as $E(X) = \sum_{x} x \cdot p(x)$
  • For a continuous random variable $X$ with PDF $f(x)$, the expectation is calculated as $E(X) = \int_{-\infty}^{\infty} x \cdot f(x)\,dx$
  • Linearity of expectation states that for random variables $X$ and $Y$ and constants $a$ and $b$, $E(aX + bY) = aE(X) + bE(Y)$
    • This property holds even if $X$ and $Y$ are dependent
  • Law of the unconscious statistician (LOTUS) allows calculating the expectation of a function $g(X)$ of a random variable $X$ as $E(g(X)) = \sum_{x} g(x) \cdot p(x)$ for discrete $X$ or $E(g(X)) = \int_{-\infty}^{\infty} g(x) \cdot f(x)\,dx$ for continuous $X$ (see the numerical check below)
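
Here is a minimal sketch of the discrete and continuous expectation formulas and LOTUS, assuming NumPy and SciPy are available; the PMF and the standard exponential density are arbitrary illustrative choices.

```python
import numpy as np
from scipy import integrate

# Discrete expectation and LOTUS for a hypothetical PMF.
values = np.array([1, 2, 3])
probs = np.array([0.2, 0.3, 0.5])
E_X = np.sum(values * probs)              # E(X) = sum_x x * p(x)
E_X_squared = np.sum(values**2 * probs)   # LOTUS with g(x) = x^2

# Continuous expectation for the standard exponential density f(x) = e^{-x}, x >= 0.
f = lambda x: np.exp(-x)
E_cont, _ = integrate.quad(lambda x: x * f(x), 0, np.inf)        # equals 1
E_g_cont, _ = integrate.quad(lambda x: x**2 * f(x), 0, np.inf)   # LOTUS: E(X^2) = 2

print(E_X, E_X_squared, E_cont, E_g_cont)
```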

Properties of Expectation

  • Expectation is a linear operator, meaning $E(aX + bY) = aE(X) + bE(Y)$ for constants $a$ and $b$
  • If $X$ is a constant random variable with value $c$, then $E(X) = c$
  • For independent random variables $X$ and $Y$, $E(XY) = E(X)E(Y)$
    • This property does not generally hold for dependent random variables
  • Expectation of a sum of random variables equals the sum of their individual expectations: $E\left(\sum_{i=1}^{n} X_i\right) = \sum_{i=1}^{n} E(X_i)$
  • Monotonicity of expectation states that if $X \leq Y$ for all outcomes, then $E(X) \leq E(Y)$
  • Expectation of a non-negative random variable is always non-negative: if $X \geq 0$, then $E(X) \geq 0$ (these properties are spot-checked by simulation below)
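
The Monte Carlo sketch below spot-checks these properties, assuming NumPy is available; the uniform distributions and the dependent pair $Y = X^2$ are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

X = rng.uniform(0, 1, n)    # X ~ Uniform(0, 1)
Y = X**2                    # Y is a function of X, hence dependent on X
Z = rng.uniform(0, 1, n)    # Z is independent of X
a, b = 3.0, -2.0

# Linearity of expectation holds even though X and Y are dependent.
print(np.mean(a*X + b*Y), a*np.mean(X) + b*np.mean(Y))

# E(XZ) = E(X)E(Z) for the independent pair (X, Z) ...
print(np.mean(X*Z), np.mean(X) * np.mean(Z))

# ... but E(XY) differs from E(X)E(Y) for the dependent pair (X, Y).
print(np.mean(X*Y), np.mean(X) * np.mean(Y))   # about 0.25 vs. about 0.17
```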

Calculating Variance

  • Variance measures the average squared deviation of a random variable from its mean
  • Formula for variance: $\mathrm{Var}(X) = E[(X - E(X))^2]$
    • Expanded form: $\mathrm{Var}(X) = E(X^2) - [E(X)]^2$
  • For a discrete random variable $X$ with PMF $p(x)$, the variance is calculated as $\mathrm{Var}(X) = \sum_{x} (x - E(X))^2 \cdot p(x)$
  • For a continuous random variable $X$ with PDF $f(x)$, the variance is calculated as $\mathrm{Var}(X) = \int_{-\infty}^{\infty} (x - E(X))^2 \cdot f(x)\,dx$
  • Properties of variance:
    • $\mathrm{Var}(aX + b) = a^2\,\mathrm{Var}(X)$ for constants $a$ and $b$
    • For independent random variables $X$ and $Y$, $\mathrm{Var}(X + Y) = \mathrm{Var}(X) + \mathrm{Var}(Y)$
  • Standard deviation $\sigma$ is the square root of the variance and has the same units as the random variable (a short numerical check of these formulas follows)
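
A short NumPy sketch with a made-up PMF; it confirms that the definitional and expanded variance formulas agree and that $\mathrm{Var}(aX + b) = a^2\,\mathrm{Var}(X)$.

```python
import numpy as np

# Hypothetical discrete random variable.
values = np.array([1, 2, 3, 4])
probs = np.array([0.1, 0.2, 0.3, 0.4])

mean = np.sum(values * probs)
var_def = np.sum((values - mean)**2 * probs)       # E[(X - E(X))^2]
var_short = np.sum(values**2 * probs) - mean**2    # E(X^2) - [E(X)]^2
print(var_def, var_short)                          # the two forms agree

# Var(aX + b) = a^2 Var(X): the shift b has no effect on the spread.
a, b = 3, 7
shifted = a * values + b
shifted_mean = np.sum(shifted * probs)
var_shifted = np.sum((shifted - shifted_mean)**2 * probs)
print(var_shifted, a**2 * var_def)
```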

Relationships Between Expectation and Variance

  • Variance can be expressed in terms of expectation: $\mathrm{Var}(X) = E(X^2) - [E(X)]^2$
  • Chebyshev's inequality relates expectation and variance to provide bounds on the probability of a random variable deviating from its mean
    • For any random variable $X$ with finite variance and positive constant $k$, $P(|X - E(X)| \geq k\sigma) \leq \frac{1}{k^2}$
  • Markov's inequality provides an upper bound on the probability of a non-negative random variable exceeding a certain value
    • For a non-negative random variable $X$ and constant $a > 0$, $P(X \geq a) \leq \frac{E(X)}{a}$
  • Jensen's inequality states that for a convex function $g$ and random variable $X$, $E(g(X)) \geq g(E(X))$
    • For a concave function $g$, the inequality is reversed: $E(g(X)) \leq g(E(X))$ (all three inequalities are checked by simulation below)
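
A simulation-based sanity check of the three inequalities, assuming NumPy; the exponential distribution is used simply as a convenient non-negative example.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.exponential(scale=2.0, size=1_000_000)   # non-negative, E(X) = 2, Var(X) = 4
mu, sigma = X.mean(), X.std()

# Markov: P(X >= a) <= E(X)/a for non-negative X.
a = 6.0
print(np.mean(X >= a), mu / a)

# Chebyshev: P(|X - E(X)| >= k*sigma) <= 1/k^2.
k = 2.0
print(np.mean(np.abs(X - mu) >= k * sigma), 1 / k**2)

# Jensen with the convex function g(x) = x^2: E(g(X)) >= g(E(X)).
print(np.mean(X**2), mu**2)
```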

Applications in Probability Problems

  • Expectation and variance are used to characterize the behavior of random variables and their distributions
  • In decision theory, expectation is used to calculate the expected value of different strategies or actions
    • Example: In a game with payoffs, the expected value of each strategy can be computed to determine the optimal choice
  • Variance and standard deviation are used to quantify risk and uncertainty in various fields (finance, insurance)
    • Example: Portfolio theory uses variance to measure the risk of investment portfolios
  • Moment generating functions (MGFs) are used to uniquely determine the distribution of a random variable and calculate its moments
    • The $n$-th moment of a random variable $X$ is defined as $E(X^n)$ and can be obtained by differentiating the MGF $n$ times and evaluating at $t = 0$ (see the symbolic example below)
  • Expectation and variance are central to the study of limit theorems in probability, such as the law of large numbers and the central limit theorem
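
As an illustration of recovering moments from an MGF, the sketch below differentiates the exponential distribution's MGF, $M_X(t) = \frac{\lambda}{\lambda - t}$ for $t < \lambda$, symbolically; it assumes SymPy is available.

```python
import sympy as sp

t, lam = sp.symbols('t lambda', positive=True)

# MGF of an Exponential(lambda) random variable, valid for t < lambda.
M = lam / (lam - t)

# The n-th moment is the n-th derivative of the MGF evaluated at t = 0.
first_moment = sp.diff(M, t, 1).subs(t, 0)    # E(X)   = 1/lambda
second_moment = sp.diff(M, t, 2).subs(t, 0)   # E(X^2) = 2/lambda^2
variance = sp.simplify(second_moment - first_moment**2)

print(first_moment, second_moment, variance)  # 1/lambda, 2/lambda**2, 1/lambda**2
```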

Common Distributions and Their Moments

  • Bernoulli distribution (single trial with binary outcome):
    • PMF: $P(X = 1) = p$, $P(X = 0) = 1 - p$
    • Expectation: $E(X) = p$
    • Variance: $\mathrm{Var}(X) = p(1 - p)$
  • Binomial distribution (number of successes in $n$ independent Bernoulli trials):
    • PMF: $P(X = k) = \binom{n}{k} p^k (1-p)^{n-k}$
    • Expectation: $E(X) = np$
    • Variance: $\mathrm{Var}(X) = np(1-p)$
  • Poisson distribution (number of events in a fixed interval):
    • PMF: $P(X = k) = \frac{e^{-\lambda}\lambda^k}{k!}$
    • Expectation: $E(X) = \lambda$
    • Variance: $\mathrm{Var}(X) = \lambda$
  • Normal (Gaussian) distribution:
    • PDF: $f(x) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}}$
    • Expectation: $E(X) = \mu$
    • Variance: $\mathrm{Var}(X) = \sigma^2$
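
The snippet below cross-checks the tabulated means and variances against SciPy's implementations of the same distributions; the parameter values are arbitrary.

```python
from scipy import stats

p, n, lam, mu, sigma = 0.3, 10, 4.0, 1.5, 2.0   # illustrative parameters

for name, dist, mean, var in [
    ("Bernoulli", stats.bernoulli(p),    p,       p * (1 - p)),
    ("Binomial",  stats.binom(n, p),     n * p,   n * p * (1 - p)),
    ("Poisson",   stats.poisson(lam),    lam,     lam),
    ("Normal",    stats.norm(mu, sigma), mu,      sigma**2),
]:
    m, v = dist.stats(moments="mv")
    print(f"{name:9s} mean {float(m):.3f} (formula {mean:.3f})  "
          f"var {float(v):.3f} (formula {var:.3f})")
```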

Practice Problems and Examples

  1. A fair six-sided die is rolled. Let $X$ be the number shown on the die. Calculate the expectation and variance of $X$.

    • Solution:
      • $E(X) = \sum_{x=1}^{6} x \cdot P(X = x) = \frac{1+2+3+4+5+6}{6} = 3.5$
      • $\mathrm{Var}(X) = E(X^2) - [E(X)]^2 = \frac{1^2+2^2+3^2+4^2+5^2+6^2}{6} - 3.5^2 = \frac{35}{12} \approx 2.92$
  2. The time (in minutes) a customer spends in a store follows an exponential distribution with parameter $\lambda = 0.2$. Find the expected time spent in the store and the variance of the time spent.

    • Solution:
      • For an exponential distribution with parameter $\lambda$, the expectation is $E(X) = \frac{1}{\lambda}$ and the variance is $\mathrm{Var}(X) = \frac{1}{\lambda^2}$.
      • $E(X) = \frac{1}{0.2} = 5$ minutes
      • $\mathrm{Var}(X) = \frac{1}{0.2^2} = 25$ square minutes
  3. Let $X$ be a random variable with $E(X) = 2$ and $\mathrm{Var}(X) = 4$. Find $E(3X - 5)$ and $\mathrm{Var}(3X - 5)$.

    • Solution:
      • Using the linearity of expectation, $E(3X - 5) = 3E(X) - 5 = 3 \cdot 2 - 5 = 1$
      • Using the properties of variance, $\mathrm{Var}(3X - 5) = 3^2\,\mathrm{Var}(X) = 9 \cdot 4 = 36$
  4. The number of customers arriving at a store follows a Poisson distribution with a mean of 10 per hour. Calculate the probability that more than 12 customers arrive in a given hour using Markov's inequality.

    • Solution:
      • Let $X$ be the number of customers arriving in an hour. We want to find $P(X > 12)$.
      • Since $P(X > 12) \leq P(X \geq 12)$, Markov's inequality gives $P(X > 12) \leq \frac{E(X)}{12} = \frac{10}{12} \approx 0.833$
      • This provides only an upper bound on the probability; the actual Poisson tail probability is considerably smaller (a numerical check of all four answers follows below).
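
The sketch below verifies the four answers numerically, using exact computation where possible and simulation otherwise; it assumes NumPy and SciPy, and the Normal(2, 2) choice in problem 3 is arbitrary (any distribution with mean 2 and variance 4 would do).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 1_000_000

# Problem 1: fair die. Exact E(X) = 3.5, Var(X) = 35/12 ≈ 2.92.
faces = np.arange(1, 7)
print(faces.mean(), (faces**2).mean() - faces.mean()**2)

# Problem 2: Exponential with lambda = 0.2 (NumPy parameterises by scale = 1/lambda).
times = rng.exponential(scale=1 / 0.2, size=n)
print(times.mean(), times.var())              # approximately 5 and 25

# Problem 3: any X with E(X) = 2 and Var(X) = 4 works; Normal(2, 2) is used here.
X = rng.normal(loc=2.0, scale=2.0, size=n)
print((3 * X - 5).mean(), (3 * X - 5).var())  # approximately 1 and 36

# Problem 4: Markov bound vs. the exact Poisson tail probability.
markov_bound = 10 / 12
exact_tail = 1 - stats.poisson.cdf(12, 10)    # P(X > 12) = P(X >= 13)
print(markov_bound, exact_tail)               # 0.833 vs. roughly 0.2
```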


© 2024 Fiveable Inc. All rights reserved.
AP® and SAT® are trademarks registered by the College Board, which is not affiliated with, and does not endorse this website.
