
Sums and Differences of Independent Random Variables

Transforming Random Variables

When you add a constant to or subtract a constant from the outcomes of a random variable, the mean increases or decreases by that amount, and the standard deviation stays the same.

Example:  This distribution is called X.

\mu_X = 3
\sigma_X = 1

If we add 2 to each outcome, the distribution becomes X + 2. The new distribution has a mean of 5, but the standard deviation is still 1:

\mu_{X+2} = 5
\sigma_{X+2} = 1
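To check this numerically, here is a minimal simulation sketch in Python. It assumes X is normally distributed with mean 3 and standard deviation 1 to match the example; NumPy and the variable names are used only for illustration and are not part of the original study aid.

import numpy as np

# Minimal sketch: X is assumed normal with mean 3 and standard
# deviation 1, matching the example above.
rng = np.random.default_rng(seed=0)
x = rng.normal(loc=3, scale=1, size=100_000)   # simulated outcomes of X
x_plus_2 = x + 2                               # add 2 to every outcome

print(x.mean(), x.std())                 # close to 3 and 1
print(x_plus_2.mean(), x_plus_2.std())   # close to 5 and 1: the mean shifts, the spread does not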

When you multiply or divide the outcomes of a random variable by a constant, the mean is multiplied or divided by that constant, and the standard deviation is multiplied or divided by the absolute value of that constant.

Example: This distribution is called Y.
\mu_Y=3 
\sigma_Y=1 

Let’s see what happens if we multiply each outcome by 3 and find \mu_{3Y} and \sigma_{3Y}:

 \mu_{3Y} = 3 \times 3=9 

\sigma_{3Y}=1 \times 3=3 
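Again, a minimal simulation sketch (assuming Y is normal with mean 3 and standard deviation 1, as above) confirms that both the mean and the standard deviation are scaled by 3:

import numpy as np

# Minimal sketch: Y is assumed normal with mean 3 and standard
# deviation 1, matching the example above.
rng = np.random.default_rng(seed=0)
y = rng.normal(loc=3, scale=1, size=100_000)   # simulated outcomes of Y
y_times_3 = 3 * y                              # multiply every outcome by 3

print(y.mean(), y.std())                  # close to 3 and 1
print(y_times_3.mean(), y_times_3.std())  # close to 9 and 3: both mean and spread scale by 3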
