Independent random variables are two or more random variables that have no influence on each other: the outcome of one does not affect the outcome of the other.
Think of independent random variables like two separate dice rolls. The result of rolling one die has no impact on the result of rolling the other die.
Joint Probability Distribution: This term refers to the probability distribution that shows the probabilities for all possible outcomes of multiple random variables.
Covariance: Covariance measures how much two random variables vary together. If they are independent, their covariance is zero (though zero covariance alone does not guarantee independence).
Expected Value: The expected value is a measure of central tendency for a random variable and represents its long-term average. The expected value of a sum of random variables equals the sum of their individual expected values (this holds whether or not the variables are independent).
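A quick simulation of the two-dice example makes these definitions concrete. This is an illustrative sketch (not part of the original guide): with independent dice, the sample covariance should land near zero, and the mean of the sum should match the sum of the means.

```python
import random

random.seed(42)
n = 100_000

# Two independent dice: each roll of one die ignores the other.
x = [random.randint(1, 6) for _ in range(n)]
y = [random.randint(1, 6) for _ in range(n)]

mean_x = sum(x) / n
mean_y = sum(y) / n
mean_sum = sum(a + b for a, b in zip(x, y)) / n

# Sample covariance: should be near 0 for independent variables.
cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y)) / n

print(f"E[X] + E[Y] = {mean_x + mean_y:.3f}")  # near 3.5 + 3.5 = 7
print(f"E[X + Y]    = {mean_sum:.3f}")         # matches the line above
print(f"Cov(X, Y)   = {cov:.4f}")              # near 0
```

Each die averages 3.5 in the long run, so the sum of two dice averages about 7 regardless of how the individual rolls pair up.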
What is the mean of the sum of two independent random variables equal to?
What is the standard deviation of the sum of two independent random variables equal to?
If X and Y are two independent random variables and Z = X + Y, what is the mean of Z?
If X and Y are two independent random variables, and Z is their sum, what is the standard deviation of Z?
If X and Y are two independent random variables, and Z = X - Y, what is the variance of Z?
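The questions above all rest on two rules: means add (or subtract) directly, and variances add even when the variables are subtracted, so the standard deviation of X ± Y is the square root of the summed variances. A small helper function (hypothetical, written for illustration) makes both rules explicit:

```python
import math

def combine(mean_x, sd_x, mean_y, sd_y, op="+"):
    """Mean and standard deviation of Z = X + Y or Z = X - Y
    for independent X and Y.

    Means follow the operation, but variances add in BOTH cases:
    Var(X + Y) = Var(X - Y) = Var(X) + Var(Y),
    so sd(Z) = sqrt(sd_x**2 + sd_y**2) either way.
    """
    mean_z = mean_x + mean_y if op == "+" else mean_x - mean_y
    sd_z = math.sqrt(sd_x**2 + sd_y**2)
    return mean_z, sd_z
```

Note that independence is what justifies adding the variances; for dependent variables a covariance term would also appear.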
A and B are independent random variables. If A has a mean of 30 and a standard deviation of 12, and B has a mean of 18 and a standard deviation of 5, what is the standard deviation of A - B?
X and Y are independent random variables. If X has a mean of 20 and a standard deviation of 4, and Y has a mean of 10 and a standard deviation of 3, what is the standard deviation of X + Y?
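For the two numeric questions, the arithmetic works out as follows. The key point is that variances add even when the variables are subtracted:

```python
import math

# Difference question: sd = 12 and sd = 5.
# Variances add even for a difference: 12**2 + 5**2 = 169.
sd_diff = math.sqrt(12**2 + 5**2)
print(sd_diff)  # 13.0

# Sum question: sd = 4 and sd = 3.
# 4**2 + 3**2 = 25.
sd_sum = math.sqrt(4**2 + 3**2)
print(sd_sum)  # 5.0
```

Note that you can never subtract standard deviations directly; always convert to variances, add, then take the square root.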
© 2024 Fiveable Inc. All rights reserved.
AP® and SAT® are trademarks registered by the College Board, which is not affiliated with, and does not endorse this website.