Intro to Probability Unit 7 – Expectation and Variance of Random Variables
Expectation and variance are fundamental concepts in probability theory, providing insights into the average behavior and spread of random variables. These tools allow us to analyze and predict outcomes in uncertain situations, from simple coin tosses to complex financial models.
Understanding expectation and variance is crucial for making informed decisions in various fields. These concepts form the foundation for more advanced statistical techniques, helping us quantify risk, estimate probabilities, and draw meaningful conclusions from data in real-world applications.
Random variable is a function that maps outcomes of a random experiment to real numbers
Expectation (mean) of a random variable X, denoted as E(X), is the average value of the variable over many trials
Variance of a random variable X, denoted as Var(X) or σ², measures the average squared deviation from the mean
Formula for variance: Var(X) = E[(X − E(X))²]
Standard deviation σ is the square root of the variance and has the same units as the random variable
Moment generating function (MGF) of a random variable X is defined as M_X(t) = E(e^{tX})
MGF uniquely determines the distribution of a random variable
Probability mass function (PMF) for a discrete random variable X gives the probability of each possible value
Probability density function (PDF) for a continuous random variable X describes the relative likelihood of the variable falling near a given value (for a continuous variable, the probability of any single exact value is zero)
Understanding Expectation
Expectation represents the long-run average value of a random variable over many independent trials
For a discrete random variable X with PMF p(x), the expectation is calculated as E(X) = ∑_x x · p(x)
For a continuous random variable X with PDF f(x), the expectation is calculated as E(X) = ∫_{−∞}^{∞} x · f(x) dx
Linearity of expectation states that for random variables X and Y and constants a and b, E(aX+bY)=aE(X)+bE(Y)
This property holds even if X and Y are dependent
Law of the unconscious statistician (LOTUS) allows calculating the expectation of a function g(X) of a random variable X as E(g(X)) = ∑_x g(x) · p(x) for discrete X or E(g(X)) = ∫_{−∞}^{∞} g(x) · f(x) dx for continuous X
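As a small illustration of these formulas, the discrete expectation and LOTUS sums can be evaluated directly from a PMF. The PMF below (number of heads in two fair coin flips) is an assumed example, not one from the text:

```python
# Expectation and LOTUS for a discrete random variable.
# Assumed example: X = number of heads in two fair coin flips,
# so p(0) = 0.25, p(1) = 0.5, p(2) = 0.25.
pmf = {0: 0.25, 1: 0.5, 2: 0.25}

def expectation(pmf, g=lambda x: x):
    """E[g(X)] = sum over x of g(x) * p(x); g = identity gives E(X) (LOTUS)."""
    return sum(g(x) * p for x, p in pmf.items())

e_x = expectation(pmf)                     # E(X) = 1.0
e_x2 = expectation(pmf, lambda x: x ** 2)  # E(X^2) = 1.5
```

The same helper computes E(g(X)) for any g without first deriving the distribution of g(X), which is exactly the convenience LOTUS provides.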
Properties of Expectation
Expectation is a linear operator, meaning E(aX+bY)=aE(X)+bE(Y) for constants a and b
If X is a constant random variable with value c, then E(X)=c
For independent random variables X and Y, E(XY)=E(X)E(Y)
This property does not generally hold for dependent random variables
Expectation of a sum of random variables equals the sum of their individual expectations: E(∑_{i=1}^{n} X_i) = ∑_{i=1}^{n} E(X_i)
Monotonicity of expectation states that if X ≤ Y for all outcomes, then E(X) ≤ E(Y)
Expectation of a non-negative random variable is always non-negative: if X ≥ 0, then E(X) ≥ 0
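The claim that linearity survives dependence can be checked concretely. The sketch below assumes X is a fair die roll and sets Y = 7 − X, a variable fully determined by X:

```python
# Linearity of expectation with *dependent* variables.
# Assumed example: X is a fair die roll and Y = 7 - X depends entirely on X.
pmf = {x: 1 / 6 for x in range(1, 7)}
e_x = sum(x * p for x, p in pmf.items())              # E(X) = 3.5
e_y = sum((7 - x) * p for x, p in pmf.items())        # E(Y) = 3.5
e_sum = sum((x + 7 - x) * p for x, p in pmf.items())  # E(X + Y) = 7.0
# E(X + Y) = E(X) + E(Y) holds despite the dependence, but the product
# rule fails: E(XY) != E(X)E(Y) because X and Y are not independent.
e_xy = sum(x * (7 - x) * p for x, p in pmf.items())   # 56/6, not 3.5 * 3.5
```

This confirms that E(XY) = E(X)E(Y) genuinely requires independence, while linearity does not.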
Calculating Variance
Variance measures the average squared deviation of a random variable from its mean
Formula for variance: Var(X) = E[(X − E(X))²]
Expanded form: Var(X) = E(X²) − [E(X)]²
For a discrete random variable X with PMF p(x), the variance is calculated as Var(X) = ∑_x (x − E(X))² · p(x)
For a continuous random variable X with PDF f(x), the variance is calculated as Var(X) = ∫_{−∞}^{∞} (x − E(X))² · f(x) dx
Properties of variance:
Var(aX + b) = a²Var(X) for constants a and b
For independent random variables X and Y, Var(X+Y)=Var(X)+Var(Y)
Standard deviation σ is the square root of the variance and has the same units as the random variable
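The definitional and expanded variance formulas above can be cross-checked on a small PMF; the distribution over {0, 1, 2} below is an assumed example:

```python
import math

# Variance two ways: the definition E[(X - E(X))^2] and the
# shortcut E(X^2) - [E(X)]^2, on an assumed PMF over {0, 1, 2}.
pmf = {0: 0.2, 1: 0.5, 2: 0.3}
mean = sum(x * p for x, p in pmf.items())                   # E(X) = 1.1
var_def = sum((x - mean) ** 2 * p for x, p in pmf.items())  # definition
var_short = sum(x * x * p for x, p in pmf.items()) - mean ** 2  # shortcut
sd = math.sqrt(var_def)  # standard deviation, same units as X
```

Both routes give Var(X) = 0.49, so σ = 0.7; the shortcut form is usually less arithmetic by hand.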
Relationships Between Expectation and Variance
Variance can be expressed in terms of expectation: Var(X) = E(X²) − [E(X)]²
Chebyshev's inequality relates expectation and variance to provide bounds on the probability of a random variable deviating from its mean
For any random variable X with finite variance and constant k > 0, P(|X − E(X)| ≥ kσ) ≤ 1/k²
Markov's inequality provides an upper bound on the probability of a non-negative random variable exceeding a certain value
For a non-negative random variable X and constant a > 0, P(X ≥ a) ≤ E(X)/a
Jensen's inequality states that for a convex function g and random variable X, E(g(X)) ≥ g(E(X))
For a concave function g, the inequality is reversed: E(g(X)) ≤ g(E(X))
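Chebyshev's and Markov's bounds can be verified numerically for a concrete distribution; the sketch below assumes X is a fair six-sided die and checks one choice of k and a:

```python
import math

# Checking Chebyshev's and Markov's inequalities on an assumed example:
# X is a fair six-sided die.
pmf = {x: 1 / 6 for x in range(1, 7)}
mu = sum(x * p for x, p in pmf.items())                          # 3.5
sigma = math.sqrt(sum((x - mu) ** 2 * p for x, p in pmf.items()))

# Chebyshev: P(|X - mu| >= k*sigma) <= 1/k^2
k = 1.2
p_dev = sum(p for x, p in pmf.items() if abs(x - mu) >= k * sigma)

# Markov: P(X >= a) <= E(X)/a for non-negative X
a = 5
p_tail = sum(p for x, p in pmf.items() if x >= a)
```

In both cases the exact probability sits below the bound, as the inequalities guarantee; the bounds are valid for any distribution but are rarely tight.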
Applications in Probability Problems
Expectation and variance are used to characterize the behavior of random variables and their distributions
In decision theory, expectation is used to calculate the expected value of different strategies or actions
Example: In a game with payoffs, the expected value of each strategy can be computed to determine the optimal choice
Variance and standard deviation are used to quantify risk and uncertainty in various fields (finance, insurance)
Example: Portfolio theory uses variance to measure the risk of investment portfolios
Moment generating functions (MGFs) are used to uniquely determine the distribution of a random variable and calculate its moments
The n-th moment of a random variable X is defined as E(X^n) and can be obtained by differentiating the MGF n times and evaluating at t = 0
Expectation and variance are central to the study of limit theorems in probability, such as the law of large numbers and the central limit theorem
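The moment-from-MGF procedure can be illustrated numerically: differentiating M_X(t) at t = 0 recovers the moments. The sketch below assumes X ~ Bernoulli(0.3), whose MGF is M(t) = (1 − p) + p·e^t, and approximates the derivatives with central differences:

```python
import math

# Moments from an MGF via numerical differentiation.
# Assumed example: X ~ Bernoulli(p) with p = 0.3.
p = 0.3

def mgf(t):
    return (1 - p) + p * math.exp(t)  # MGF of Bernoulli(p)

h = 1e-4
m1 = (mgf(h) - mgf(-h)) / (2 * h)             # M'(0)  ~= E(X)   = p
m2 = (mgf(h) - 2 * mgf(0) + mgf(-h)) / h ** 2 # M''(0) ~= E(X^2) = p
variance = m2 - m1 ** 2                       # ~= p(1 - p)
```

For a Bernoulli variable every power X^n equals X, so all raw moments equal p; the numerical derivatives recover this, and m2 − m1² reproduces Var(X) = p(1 − p).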
Common Distributions and Their Moments
Bernoulli distribution (single trial with binary outcome):
PMF: P(X = 1) = p, P(X = 0) = 1 − p
Expectation: E(X)=p
Variance: Var(X) = p(1 − p)
Binomial distribution (number of successes in n independent Bernoulli trials):
PMF: P(X = k) = C(n, k) p^k (1 − p)^{n−k}, where C(n, k) is the binomial coefficient
Expectation: E(X)=np
Variance: Var(X) = np(1 − p)
Poisson distribution (number of events in a fixed interval):
PMF: P(X = k) = e^{−λ} λ^k / k!
Expectation: E(X) = λ
Variance: Var(X) = λ
Normal (Gaussian) distribution:
PDF: f(x) = (1 / (σ√(2π))) e^{−(x − μ)² / (2σ²)}
Expectation: E(X) = μ
Variance: Var(X) = σ²
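The tabulated moments can be checked directly against a PMF. The sketch below assumes a Poisson rate λ = 4 and truncates the infinite sum at k = 60, where the remaining tail mass is negligible:

```python
import math

# Verifying E(X) = lambda and Var(X) = lambda for a Poisson PMF.
# Assumed rate for illustration: lam = 4.0; the sum over k is truncated
# at 60, beyond which the Poisson(4) tail is vanishingly small.
lam = 4.0

def pmf(k):
    return math.exp(-lam) * lam ** k / math.factorial(k)

ks = range(60)
mean = sum(k * pmf(k) for k in ks)                    # ~= lambda
var = sum(k * k * pmf(k) for k in ks) - mean ** 2     # ~= lambda
```

The same pattern (sum k·p(k) and k²·p(k) over the support) verifies the Bernoulli and binomial entries above as well.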
Practice Problems and Examples
A fair six-sided die is rolled. Let X be the number shown on the die. Calculate the expectation and variance of X.
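One way to work this problem is to sum directly over the six equally likely outcomes:

```python
# Fair six-sided die: p(x) = 1/6 for x = 1..6.
pmf = {x: 1 / 6 for x in range(1, 7)}
e_x = sum(x * p for x, p in pmf.items())                   # E(X) = 21/6 = 3.5
var_x = sum(x * x * p for x, p in pmf.items()) - e_x ** 2  # E(X^2) - [E(X)]^2
```

This gives E(X) = 3.5 and Var(X) = 91/6 − 3.5² = 35/12 ≈ 2.92.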
The time (in minutes) a customer spends in a store follows an exponential distribution with parameter λ = 0.2. Find the expected time spent in the store and the variance of the time spent.
Solution:
For an exponential distribution with parameter λ, the expectation is E(X) = 1/λ and the variance is Var(X) = 1/λ².
E(X) = 1/0.2 = 5 minutes
Var(X) = 1/0.2² = 25 square minutes
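These closed-form answers can be sanity-checked with a crude Riemann-sum approximation of the defining integrals, truncating at x = 300 where the remaining density is negligible:

```python
import math

# Numeric check of E(X) = 1/lambda and Var(X) = 1/lambda^2 for the
# exponential density f(x) = lambda * e^(-lambda * x), lambda = 0.2.
lam = 0.2
f = lambda x: lam * math.exp(-lam * x)

dx = 0.001
xs = [i * dx for i in range(int(300 / dx))]  # grid on [0, 300)
mean = sum(x * f(x) * dx for x in xs)                  # ~= 5 minutes
var = sum(x * x * f(x) * dx for x in xs) - mean ** 2   # ~= 25 square minutes
```

The discretized integrals land close to the exact values of 5 and 25, up to the truncation and step-size error of the sum.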
Let X be a random variable with E(X) = 2 and Var(X) = 4. Find E(3X − 5) and Var(3X − 5).
Solution:
Using the linearity of expectation, E(3X − 5) = 3E(X) − 5 = 3 · 2 − 5 = 1
Using the properties of variance, Var(3X − 5) = 3²Var(X) = 9 · 4 = 36
The number of customers arriving at a store follows a Poisson distribution with a mean of 10 per hour. Calculate the probability that more than 12 customers arrive in a given hour using Markov's inequality.
Solution:
Let X be the number of customers arriving in an hour. We want to find P(X>12).
By Markov's inequality, P(X > 12) ≤ P(X ≥ 12) ≤ E(X)/12 = 10/12 ≈ 0.833
This provides an upper bound on the probability, but the actual probability will be lower due to the Poisson distribution's properties.
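To quantify how loose the bound is, the exact tail probability can be computed from the Poisson PMF with the stated mean of 10:

```python
import math

# Exact tail P(X > 12) for X ~ Poisson(10), versus the Markov bound 10/12.
lam = 10.0
p_le_12 = sum(math.exp(-lam) * lam ** k / math.factorial(k)
              for k in range(13))  # P(X <= 12)
p_gt_12 = 1 - p_le_12              # exact tail probability
markov_bound = lam / 12            # Markov's upper bound, ~0.833
```

The exact tail comes out well below the 0.833 bound, illustrating that Markov's inequality trades tightness for generality: it needs only the mean, not the full distribution.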