While expectation is a useful summary of a r.v., it doesn't tell us how "spread out" the distribution of $X$ is. If we want a measure of how much $X$ tends to *deviate* from its expectation, we can use the variance.

**Variance** of a r.v. $X$ is given by

$Var(X)=E\left((X-E(X))^{2}\right)$

Another useful, equivalent representation is

$Var(X)=E(X^{2})-(E(X))^{2}$

which is used more often in practice, because we just need to calculate the expectation and then use something like LOTUS to get $E(X^{2})$.
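As a quick sanity check, here is a sketch (using a fair six-sided die as a made-up example) that computes the variance both from the definition $E\left((X-E(X))^{2}\right)$ and from $E(X^{2})-(E(X))^{2}$:

```python
from fractions import Fraction

# PMF of a fair six-sided die: P(X = k) = 1/6 for k = 1..6
support = range(1, 7)
p = Fraction(1, 6)

e_x = sum(k * p for k in support)                   # E(X) = 7/2
e_x2 = sum(k**2 * p for k in support)               # E(X^2) via LOTUS = 91/6
var_def = sum((k - e_x)**2 * p for k in support)    # E[(X - E(X))^2]
var_alt = e_x2 - e_x**2                             # E(X^2) - (E(X))^2

print(var_def, var_alt)  # both are 35/12
```

Using `Fraction` keeps the arithmetic exact, so the two formulas agree to the last digit rather than up to floating-point error.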

## Properties of variance

$Var(X)≥0$ and in fact, $Var(X)>0$ unless $X$ is a constant (its support size is 1). This is a good sanity check: if you ever compute a negative variance, something went wrong.

$Var(X+c)=Var(X)$ for any constant $c$. This makes sense: we are shifting every value in the support by the same amount. This shifts $E(X)$ by $c$, but the spread around the mean, and hence the variance, remains the same.

$Var(cX)=c^{2}⋅Var(X)$ for any constant $c$. For example, if we want to convert $X$ from hours to minutes, we have to multiply the variance by $3600$, but we only multiply the expectation and standard deviation by $60$.
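The shift and scale properties are easy to check empirically. A minimal sketch, using arbitrary made-up samples (the properties hold for any distribution):

```python
import random
import statistics

random.seed(0)
# Hypothetical data: 10,000 draws from a standard normal
x = [random.gauss(0, 1) for _ in range(10_000)]

v = statistics.pvariance(x)
v_shift = statistics.pvariance([xi + 5 for xi in x])   # Var(X + c) = Var(X)
v_scale = statistics.pvariance([60 * xi for xi in x])  # Var(60X) = 3600 * Var(X)

print(abs(v_shift - v) < 1e-9)         # True: shifting leaves variance unchanged
print(abs(v_scale - 3600 * v) < 1e-3)  # True: scaling by 60 multiplies variance by 3600
```

Both checks pass exactly up to floating-point rounding, because they are algebraic identities on the sample, not approximations that improve with sample size.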

### Variance of the sum of two variables

The general formula is always true assuming that the expectation is well defined:

$Var(X+Y)=Var(X)+Var(Y)+2Cov(X,Y)$

#### Corollaries

Two more results follow immediately from the general formula:

- $Var(X−Y)=Var(X)+Var(Y)−2Cov(X,Y)$
- If $X$ and $Y$ are independent, then $Var(X+Y)=Var(X)+Var(Y)$. However, this equation holding does not imply independence; it only implies $Cov(X,Y)=0$.
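The general formula can be verified on a sample, since it is an algebraic identity for the empirical distribution. A sketch with a hypothetical correlated pair ($Y = X + \text{noise}$, so $Cov(X,Y)>0$):

```python
import random
import statistics

random.seed(1)
n = 10_000
# Hypothetical correlated pair: Y = X + independent noise
x = [random.gauss(0, 1) for _ in range(n)]
y = [xi + random.gauss(0, 0.5) for xi in x]

def cov(a, b):
    # Population covariance: average product of deviations from the means
    ma, mb = statistics.fmean(a), statistics.fmean(b)
    return sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b)) / len(a)

lhs = statistics.pvariance([xi + yi for xi, yi in zip(x, y)])
rhs = statistics.pvariance(x) + statistics.pvariance(y) + 2 * cov(x, y)
print(abs(lhs - rhs) < 1e-6)  # True: the identity holds up to rounding on any sample
```

Note that the population (divide-by-$n$) versions of variance and covariance are used throughout; mixing in the sample (divide-by-$n-1$) versions would break the exact identity.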

## Standard deviation

For $X$ this is given by

$SD(X)=\sqrt{Var(X)}$

In most applications, the standard deviation is easier to interpret than the variance, because it is in the same units as $X$ itself.
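A tiny sketch of the units point, using hypothetical task durations measured in hours:

```python
import math
import statistics

# Hypothetical data: task durations in hours
hours = [1.0, 1.5, 2.0, 2.5, 3.0]

var_h = statistics.pvariance(hours)  # 0.5, in squared hours (hard to interpret)
sd_h = math.sqrt(var_h)              # ~0.707 hours, same units as the data

print(math.isclose(sd_h, statistics.pstdev(hours)))  # True: pstdev is sqrt(pvariance)
```

A spread of "about 0.7 hours" is directly meaningful, while "0.5 squared hours" is not, which is why standard deviation is usually the quantity reported.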