When you **add or subtract** a constant from the outcomes of a random variable, the mean will increase or decrease by that amount, and the standard deviation will stay the same.

Example: This distribution is called X. It has a mean of 3 and a standard deviation of 1.

If we add 2 to the distribution, it becomes X + 2. Now the distribution has a mean of 5, but the standard deviation is still 1:
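We can check this rule numerically. The sketch below uses a small hypothetical set of outcomes for X (the actual distribution appears only as a figure in the lesson), chosen so that the mean is 3 and the standard deviation is 1:

```python
import statistics

# Hypothetical outcomes for X, chosen to have mean 3 and
# (population) standard deviation 1.
x = [2, 2, 4, 4]

# Adding 2 to every outcome gives the distribution X + 2.
shifted = [value + 2 for value in x]

print(statistics.mean(x), statistics.pstdev(x))              # 3 1.0
print(statistics.mean(shifted), statistics.pstdev(shifted))  # 5 1.0
```

Every outcome slides up by 2, so the center (mean) moves from 3 to 5, but the spread around the center is unchanged: the standard deviation stays 1.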

When you **multiply or divide** the outcomes of a random variable by a constant, the mean *and the standard deviation* will both be multiplied or divided by that constant.

Example: This distribution is called Y.

Let’s see what happens if we multiply each outcome by 3, creating the new variable 3Y, and find its mean and standard deviation.
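As with the addition example, we can verify the multiplication rule numerically. The outcomes for Y below are hypothetical (the lesson shows Y only as a figure), chosen to have mean 2 and standard deviation 1:

```python
import statistics

# Hypothetical outcomes for Y, chosen to have mean 2 and
# (population) standard deviation 1.
y = [1, 1, 3, 3]

# Multiplying every outcome by 3 gives the distribution 3Y.
scaled = [3 * value for value in y]

print(statistics.mean(y), statistics.pstdev(y))            # 2 1.0
print(statistics.mean(scaled), statistics.pstdev(scaled))  # 6 3.0
```

This time the spread changes too: multiplying by 3 stretches the distances between outcomes, so the mean goes from 2 to 6 *and* the standard deviation goes from 1 to 3.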