Kanya Shah
This section is about transforming random variables by adding/subtracting or multiplying/dividing by a constant. Make sure you know how to combine random variables to calculate and interpret the mean and standard deviation.
Y = a + bX
Adding or subtracting a constant affects measures of center and location but does NOT affect variability or the shape of a distribution. Multiplying or dividing by a constant affects measures of center, location, and variability, but won't change the shape of a distribution.
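As a quick check of these rules, here is a minimal Python sketch. The distribution of X and the constants a = 5 and b = 3 are made up for illustration: adding a shifts the mean but not the SD, while multiplying by b scales both.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=10, scale=2, size=100_000)   # X with mean ~10 and SD ~2

a, b = 5, 3                  # hypothetical constants for Y = a + bX
y = a + b * x

# Adding a shifts the mean; multiplying by b scales both the mean and the SD.
print(round(x.mean(), 1), round(x.std(), 1))   # roughly 10.0 and 2.0
print(round(y.mean(), 1), round(y.std(), 1))   # roughly 35.0 (= 5 + 3*10) and 6.0 (= 3*2)
```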
Sum: For any two random variables X and Y, if S = X + Y, the mean of S is μ_S = μ_X + μ_Y. Put simply, the mean of the sum of two random variables is equal to the sum of their means.
Difference: For any two random variables X and Y, if D = X - Y, the mean of D is μ_D = μ_X - μ_Y. The mean of the difference of two random variables is equal to the difference of their means. The order of subtraction is important. A short sketch of both mean rules is below.
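This sketch uses hypothetical distributions for X (mean 50) and Y (mean 30) to show that the means simply add or subtract:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(50, 5, size=100_000)   # hypothetical X with mean 50
y = rng.normal(30, 4, size=100_000)   # hypothetical Y with mean 30

s = x + y   # sum
d = x - y   # difference

print(round(s.mean(), 1))   # roughly 80.0 = 50 + 30
print(round(d.mean(), 1))   # roughly 20.0 = 50 - 30
```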
Independent Random Variables: If knowing the value of X does not help us predict the value of Y, then X and Y are independent random variables.
Sum: For any two independent random variables X and Y, if S = X + Y, the variance of S is SD_S^2 = SD_X^2 + SD_Y^2. To find the standard deviation, take the square root of the variance: SD_S = sqrt(SD_X^2 + SD_Y^2).
Standard deviations do not add; use the formula or your calculator.
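A minimal sketch of the variance formula for a sum, using hypothetical standard deviations of 5 and 4:

```python
import math

sd_x, sd_y = 5.0, 4.0                 # hypothetical SDs of independent X and Y
var_s = sd_x**2 + sd_y**2             # variances add: 25 + 16 = 41
sd_s = math.sqrt(var_s)               # about 6.40
print(sd_s)                           # NOT 5 + 4 = 9; standard deviations do not add
```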
Difference: For any two independent random variables X and Y, if D = X - Y, the variance of D is SD_D^2 = SD_X^2 + SD_Y^2. To find the standard deviation, take the square root of the variance: SD_D = sqrt(SD_X^2 + SD_Y^2). Notice that you are NOT subtracting the variances (or the standard deviations).
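A quick simulation sketch (hypothetical independent normal X and Y) showing that the variances still add for a difference:

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(50, 5, size=500_000)   # independent X with SD 5
y = rng.normal(30, 4, size=500_000)   # independent Y with SD 4

d = x - y
print(round(d.std(), 2))               # close to 6.4
print(round((5**2 + 4**2) ** 0.5, 2))  # 6.4 — variances add even for a difference
```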
🎥Watch: AP Stats - Combining Random Variables