Combining Random Variables

Carson West



Understanding how to combine random variables is crucial for analyzing situations involving multiple sources of variation. This topic builds upon Introduction to Random Variables and Probability Distributions and Mean and Standard Deviation of Random Variables. When we combine random variables, we are often interested in the mean and standard deviation of the resulting new random variable.

Combining Means of Random Variables

The mean (or expected value) of a sum or difference of random variables is straightforward and follows simple arithmetic, regardless of whether the variables are independent or not.

If $ X $ and $ Y $ are random variables, then:

$ E(X+Y) = E(X) + E(Y) $

$ E(X-Y) = E(X) - E(Y) $

These rules hold whether or not $ X $ and $ Y $ are independent.
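As a quick check, here is a short simulation sketch (Python, standard library only) showing that means add even when the variables are strongly dependent. The setup is hypothetical: $ X $ is a fair die roll and $ Y = 7 - X $, so $ Y $ is completely determined by $ X $.

```python
import random
import statistics

random.seed(42)
n = 100_000

# X: a fair six-sided die; Y = 7 - X is perfectly dependent on X
x = [random.randint(1, 6) for _ in range(n)]
y = [7 - xi for xi in x]

sums = [xi + yi for xi, yi in zip(x, y)]

mean_x = statistics.mean(x)
mean_y = statistics.mean(y)
mean_sum = statistics.mean(sums)

# E(X+Y) = E(X) + E(Y) holds even though X and Y are dependent
print(mean_x + mean_y, mean_sum)
```

Both printed values equal 7, since $ X + Y = 7 $ on every roll, while $ E(X) + E(Y) = 3.5 + 3.5 = 7 $ as well.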

Combining Variances of Random Variables

Combining variances is slightly more complex than combining means, as it requires the assumption of independence between the random variables. If $ X $ and $ Y $ are independent, then:

$ \text{Var}(X+Y) = \text{Var}(X) + \text{Var}(Y) $

$ \text{Var}(X-Y) = \text{Var}(X) + \text{Var}(Y) $

Note that variances add even when the random variables are subtracted.

Independent Random Variables

Two random variables are independent if the outcome of one does not affect the outcome of the other. This condition is critical for combining variances.
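To illustrate, a minimal simulation sketch (assuming two hypothetical independent fair dice) showing that the variance of the sum is close to the sum of the variances:

```python
import random
import statistics

random.seed(0)
n = 200_000

# Two independent fair dice
x = [random.randint(1, 6) for _ in range(n)]
y = [random.randint(1, 6) for _ in range(n)]

var_x = statistics.pvariance(x)
var_y = statistics.pvariance(y)
var_sum = statistics.pvariance([a + b for a, b in zip(x, y)])

# Because X and Y are independent, Var(X+Y) ≈ Var(X) + Var(Y)
print(var_x + var_y, var_sum)
```

Each die has theoretical variance $ 35/12 \approx 2.92 $, so both printed values land near $ 5.83 $. If $ Y $ were instead defined from $ X $ (dependent), the two values would disagree.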

Combining Standard Deviations

Since standard deviation is the square root of variance, you cannot simply add or subtract standard deviations. You must convert them to variances, combine the variances, and then take the square root to find the new standard deviation.
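A worked arithmetic sketch with hypothetical values $ \sigma_X = 3 $ and $ \sigma_Y = 4 $ (independent), contrasting the incorrect and correct approaches:

```python
import math

sigma_x = 3.0  # hypothetical standard deviation of X
sigma_y = 4.0  # hypothetical standard deviation of Y

# Wrong: adding standard deviations directly
wrong = sigma_x + sigma_y

# Right: convert to variances, add, then take the square root
right = math.sqrt(sigma_x**2 + sigma_y**2)

print(wrong, right)  # 7.0 vs 5.0
```

Adding standard deviations directly gives 7, but the correct standard deviation of $ X + Y $ is $ \sqrt{9 + 16} = 5 $.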

Summary Table

Let $ X $ and $ Y $ be random variables, and $ a, b $ be constants.

| Operation | Mean $ E(\cdot) $ | Variance $ \text{Var}(\cdot) $ (if $ X, Y $ are independent) | Standard Deviation $ \sigma(\cdot) $ (if $ X, Y $ are independent) |
|---|---|---|---|
| $ X+Y $ | $ E(X) + E(Y) $ | $ \text{Var}(X) + \text{Var}(Y) $ | $ \sqrt{\sigma_X^2 + \sigma_Y^2} $ |
| $ X-Y $ | $ E(X) - E(Y) $ | $ \text{Var}(X) + \text{Var}(Y) $ | $ \sqrt{\sigma_X^2 + \sigma_Y^2} $ |
| $ aX+b $ | $ aE(X) + b $ | $ a^2\text{Var}(X) $ | $ \lvert a \rvert \sigma_X $ |
| $ X_1 + X_2 + \dots + X_n $ (all independent) | $ E(X_1) + \dots + E(X_n) $ | $ \text{Var}(X_1) + \dots + \text{Var}(X_n) $ | $ \sqrt{\text{Var}(X_1) + \dots + \text{Var}(X_n)} $ |
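The linear transformation row can also be checked by simulation. This sketch uses hypothetical constants $ a = 2.5 $, $ b = 10 $ and a Normal $ X $ with mean 5 and standard deviation 2:

```python
import random
import statistics

random.seed(1)
n = 100_000
a, b = 2.5, 10.0  # hypothetical constants

x = [random.gauss(5.0, 2.0) for _ in range(n)]  # X ~ N(5, 2)
t = [a * xi + b for xi in x]                    # T = aX + b

# E(aX+b) = aE(X) + b
print(statistics.mean(t), a * statistics.mean(x) + b)

# sigma(aX+b) = |a| * sigma(X); the shift b does not change spread
print(statistics.pstdev(t), abs(a) * statistics.pstdev(x))
```

Both pairs of printed values agree: the mean lands near $ 2.5(5) + 10 = 22.5 $ and the standard deviation near $ 2.5(2) = 5 $.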

Shape of the Distribution

If the original random variables $ X $ and $ Y $ are Normally distributed (see The Normal Distribution), then any linear combination of $ X $ and $ Y $ (e.g., $ X+Y $, $ X-Y $, $ aX+bY $) will also be Normally distributed. This property is very useful for performing inferential procedures later on. If the variables are not Normally distributed, the shape of their sum or difference may still become approximately Normal when many variables are combined, due to The Central Limit Theorem.
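As a final sketch, combining two hypothetical independent Normal variables, $ X \sim N(10, 3) $ and $ Y \sim N(20, 4) $: the sum should behave like $ N(30, 5) $, since $ \sqrt{3^2 + 4^2} = 5 $.

```python
import random
import statistics

random.seed(7)
n = 200_000

# X ~ N(10, 3) and Y ~ N(20, 4), independent (hypothetical parameters)
x = [random.gauss(10, 3) for _ in range(n)]
y = [random.gauss(20, 4) for _ in range(n)]
s = [a + b for a, b in zip(x, y)]

# The simulated sum should have mean near 30 and standard deviation near 5
print(statistics.mean(s), statistics.pstdev(s))
```

The simulated mean and standard deviation match the values predicted by the rules above, and a histogram of `s` (not shown) would look Normal.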