Rules for Variance

  1. The variance of a constant is zero.
    $ Var(c) = 0 $
  2. Adding a constant, c, to a random variable does not change its variance (the expectation shifts by the same amount, so the deviations are unchanged).
    $ \sigma^{2}_{X+c} = Var(X+c) = E[((X + c)-E(X + c))^{2}] = E[(X-E(X))^{2}] = Var(X) $
  3. Multiplying a random variable by a constant, c, multiplies the variance by the square of the constant.
    $ \sigma^{2}_{cX} = Var(cX) = c^{2}Var(X)$
  4. In general, the variance of the sum of two random variables includes a covariance term:
    $ Var(X+Y) = Var(X) + 2 Cov(X,Y) + Var(Y)$
    Only when the random variables are independent does $ Cov(X,Y) = 0 $, so the variance of the sum reduces to the sum of the variances.
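The four rules above can be checked numerically. The sketch below uses a small made-up sample and ad-hoc helper functions (`mean`, `var`, `cov` — population versions, not a library API); the data and names are illustrative, not from the original text.

```python
# Numerical check of the variance rules on a hypothetical sample.

def mean(xs):
    return sum(xs) / len(xs)

def var(xs):
    # population variance: average squared deviation from the mean
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def cov(xs, ys):
    # population covariance of paired observations
    mx, my = mean(xs), mean(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)

X = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
Y = [1.0, 3.0, 2.0, 5.0, 4.0, 6.0, 5.0, 8.0]
c = 3.0

# Rule 1: a constant has zero variance
assert var([c] * len(X)) == 0.0
# Rule 2: adding a constant leaves the variance unchanged
assert abs(var([x + c for x in X]) - var(X)) < 1e-9
# Rule 3: multiplying by c scales the variance by c^2
assert abs(var([c * x for x in X]) - c ** 2 * var(X)) < 1e-9
# Rule 4: Var(X+Y) = Var(X) + 2 Cov(X,Y) + Var(Y)
lhs = var([x + y for x, y in zip(X, Y)])
rhs = var(X) + 2 * cov(X, Y) + var(Y)
assert abs(lhs - rhs) < 1e-9
```

Note that rule 4 holds exactly for the sample quantities, not just in expectation, because the sample covariance is defined from the same deviations.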

Rules for the Covariance

  1. The covariance of two constants, c and k, is zero.
    $Cov(c,k) = E[(c-E(c))(k-E(k))] = E[(0)(0)] = 0$
  2. The covariance of two independent random variables is zero.
    $Cov(X, Y) = 0$ When X and Y are independent.
  3. The covariance is commutative, as is obvious from the definition.
    $Cov(X, Y) = Cov(Y, X)$
  4. The covariance of a random variable with a constant is zero.
    $Cov(X, c) = 0 $
  5. Adding a constant to either or both random variables does not change their covariance.
    $Cov(X+c, Y+k) = Cov(X, Y)$
  6. Multiplying a random variable by a constant multiplies the covariance by that constant.
    $Cov(cX, kY) = ck \, Cov(X, Y)$
  7. The additive law of covariance holds that the covariance of a random variable with a sum of random variables is just the sum of the covariances with each of the random variables.
    $Cov(X+Y, Z) = Cov(X, Z) + Cov(Y, Z)$
  8. The covariance of a variable with itself is the variance of the random variable.
    $Cov(X, X) = Var(X) $
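The covariance rules can be verified the same way. The sketch below again uses hypothetical sample data and the same ad-hoc population-form helpers (`mean`, `var`, `cov`); only rules with random variables are checked numerically here.

```python
# Numerical check of the covariance rules on hypothetical samples.

def mean(xs):
    return sum(xs) / len(xs)

def var(xs):
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def cov(xs, ys):
    mx, my = mean(xs), mean(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)

X = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
Y = [1.0, 3.0, 2.0, 5.0, 4.0, 6.0, 5.0, 8.0]
Z = [0.5, 1.5, 1.0, 2.5, 2.0, 3.0, 2.5, 4.0]
c, k = 3.0, -2.0

# Rule 3: symmetry, Cov(X, Y) = Cov(Y, X)
assert abs(cov(X, Y) - cov(Y, X)) < 1e-12
# Rule 4: covariance with a constant is zero
assert abs(cov(X, [c] * len(X))) < 1e-12
# Rule 5: adding constants changes nothing
shifted = cov([x + c for x in X], [y + k for y in Y])
assert abs(shifted - cov(X, Y)) < 1e-9
# Rule 6: constants factor out, Cov(cX, kY) = ck Cov(X, Y)
scaled = cov([c * x for x in X], [k * y for y in Y])
assert abs(scaled - c * k * cov(X, Y)) < 1e-9
# Rule 7: additivity, Cov(X+Y, Z) = Cov(X, Z) + Cov(Y, Z)
summed = cov([x + y for x, y in zip(X, Y)], Z)
assert abs(summed - (cov(X, Z) + cov(Y, Z))) < 1e-9
# Rule 8: Cov(X, X) = Var(X)
assert abs(cov(X, X) - var(X)) < 1e-12
```

Rule 2 (independence implies zero covariance) cannot be shown this way: it is a statement about the joint distribution, and finite samples of independent variables generally have small but nonzero sample covariance.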
statistical_review.txt · Last modified: 2017/12/11 09:16 by hkimscil