The formula for the variance of the difference between the two memory-span variables in this example is shown below:

\sigma_{x-y}^2 = \sigma_x^2 + \sigma_y^2

Notice that the expression for the difference is the same as the formula for the sum.
These formulas for the sum and the difference of variables apply only when the variables are independent. In this example, we have thousands of randomly paired scores.
Since the scores are paired randomly, there is no relationship between the memory span of one member of a pair and the memory span of the other. To calculate the variance of the sum z = x + y, we use the deviations of z, which are

z - \mu_z = (x - \mu_x) + (y - \mu_y)

and which give us the formula for the variance of z, exactly as defined previously, using the sum of squared deviations of z over all nm equally likely pairings of the n values of x with the m values of y:

\sigma_z^2 = \frac{1}{nm} \sum_i \sum_j \left[ (x_i - \mu_x) + (y_j - \mu_y) \right]^2

We can expand this formula:

\sigma_z^2 = \frac{1}{nm} \sum_i \sum_j \left[ (x_i - \mu_x)^2 + 2(x_i - \mu_x)(y_j - \mu_y) + (y_j - \mu_y)^2 \right]

When we carry out the summation over j, the middle term contains the factor \sum_j (y_j - \mu_y), the sum of the deviations of y. Since the sum of deviations about the mean is always zero, we can remove that middle term completely:

\sigma_z^2 = \frac{1}{nm} \sum_i \sum_j (x_i - \mu_x)^2 + \frac{1}{nm} \sum_i \sum_j (y_j - \mu_y)^2

We can simplify this, because each squared deviation of x is counted m times and each squared deviation of y is counted n times:

\sigma_z^2 = \frac{1}{n} \sum_i (x_i - \mu_x)^2 + \frac{1}{m} \sum_j (y_j - \mu_y)^2

which is equivalent to this:

\sigma_z^2 = \sigma_x^2 + \sigma_y^2

QED.
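As a quick numerical check of this derivation, here is a minimal Python sketch (the score distributions are invented for illustration): it draws two independent sets of scores, pairs them randomly, and confirms that the variance of both the sum and the difference is close to the sum of the two variances.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two independent sets of scores (values invented for illustration).
x = rng.normal(loc=7.0, scale=2.0, size=100_000)
y = rng.normal(loc=5.0, scale=1.5, size=100_000)

# Pair the scores randomly, as in the example above.
y_paired = rng.permutation(y)

print(np.var(x) + np.var(y))   # sigma_x^2 + sigma_y^2, ~ 2.0^2 + 1.5^2 = 6.25
print(np.var(x + y_paired))    # variance of the sum        -> also ~ 6.25
print(np.var(x - y_paired))    # variance of the difference -> also ~ 6.25
```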
Adding Dependent Variables

If two variables are related, then the rule that you simply add the variances does not apply. In this section, we develop a new rule. First, recall the rules for means (expected values):

- Adding a constant value, c, to each term increases the mean, or expected value, by that constant: E(X + c) = E(X) + c.
- Multiplying a random variable by a constant value, c, multiplies the expected value, or mean, by that constant: E(cX) = cE(X).
- The expected value, or mean, of the sum of two random variables is the sum of the means: E(X + Y) = E(X) + E(Y). This is also known as the additive law of expectation.
The corresponding rules for variances:

- The variance of a constant is zero: Var(c) = 0.
- Adding a constant value, c, to a random variable does not change the variance, because the mean increases by the same amount and the deviations from the mean are unchanged: Var(X + c) = Var(X).
- Multiplying a random variable by a constant multiplies the variance by the square of the constant: Var(cX) = c^2 Var(X).
- The variance of the sum of two or more random variables is equal to the sum of their variances only when the random variables are independent.
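A minimal Python sketch of these rules (the distribution and the constant are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(loc=10.0, scale=3.0, size=100_000)
c = 4.0

print(np.mean(x + c), np.mean(x) + c)   # adding c shifts the mean by c
print(np.mean(c * x), c * np.mean(x))   # scaling by c scales the mean by c
print(np.var(x + c), np.var(x))         # adding c leaves the variance unchanged
print(np.var(c * x), c**2 * np.var(x))  # scaling by c multiplies the variance by c^2
```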
And the rules for covariances:

- The covariance of two constants, c and k, is zero: Cov(c, k) = 0.
- The covariance of two independent random variables is zero: Cov(X, Y) = 0.
- The covariance is commutative, as is obvious from the definition: Cov(X, Y) = Cov(Y, X).

Putting these together gives the new rule for dependent variables: Var(X + Y) = Var(X) + Var(Y) + 2Cov(X, Y). Finally, remember that the standard deviation is the square root of the variance; the other way around, the variance is the square of the SD.
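Here is a short sketch of this rule for dependent variables, using NumPy (the particular dependence between x and y below is invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(size=100_000)
y = 0.6 * x + rng.normal(size=100_000)   # y depends on x, so Cov(x, y) != 0

cov_xy = np.cov(x, y, bias=True)[0, 1]   # population (ddof=0) covariance
lhs = np.var(x + y)
rhs = np.var(x) + np.var(y) + 2 * cov_xy
print(lhs, rhs)                          # identical up to floating-point rounding
```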
So:

- You square the individual SDs to get the variances.
- Then you add these together to get the total variance.
- Then you take the square root to get the total SD.

This works for any number of independent variables (note the emphasis on independent!).

Key Questions

If you add two independent random variables, what is the standard deviation of the combined distribution, if the standard deviations of the two original distributions were, for example, 7 and 5?

What is the procedure for calculating the new standard deviation for two combined random variables, if the random variables X and Y are not independent?
Unless you know the "rules" of their dependency, you can't.

If you multiplied each entry of a set by 3 and added 1, how would the mean and standard deviation change? (Both of the remaining questions are worked through in the sketch below.)
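A short worked sketch in Python for these two questions, following the square-add-square-root procedure above (the mean and SD in the second part are example values, invented for illustration):

```python
import math

# SDs of 7 and 5, variables independent:
# square, add, take the square root.
print(math.sqrt(7**2 + 5**2))   # sqrt(74) ~= 8.602

# Multiply each entry by 3 and add 1:
# the mean becomes 3*mean + 1, while the SD becomes 3*SD
# (adding 1 shifts every value equally, so the spread is unchanged).
mean, sd = 10.0, 2.0            # example values, invented for illustration
print(3 * mean + 1, 3 * sd)     # -> 31.0, 6.0
```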
If I know the mean, standard deviation, and size of sample A and sample B, how do I compute the standard deviation of the union of samples A and B?
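One way to do it, sketched below under the assumption that the given SDs are population SDs (ddof = 0; with sample SDs, replace each n * sd**2 term with (n - 1) * sd**2). It uses the identity that a group's sum of squared deviations about the combined mean m equals n * (Var + (group mean - m)^2); the function name and example numbers are hypothetical.

```python
import math

def pooled_mean_sd(mean_a, sd_a, n_a, mean_b, sd_b, n_b):
    """Mean and SD of the union of two samples, given each sample's mean,
    population SD, and size, via E[(X - m)^2] = Var(X) + (mean_X - m)^2."""
    n = n_a + n_b
    mean = (n_a * mean_a + n_b * mean_b) / n
    var = (n_a * (sd_a**2 + (mean_a - mean)**2)
           + n_b * (sd_b**2 + (mean_b - mean)**2)) / n
    return mean, math.sqrt(var)

# Example with invented numbers: samples of sizes 30 and 50.
print(pooled_mean_sd(10.0, 2.0, 30, 14.0, 3.0, 50))
```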