Given a sample mean $X_n$, how close is $X_n$ to the true mean $μ$?

It turns out that as $n$ increases, $X_n$ gets closer and closer to $μ$, and the distribution of $X_n$ approaches a Normal distribution.

## Weak law of large numbers (WLLN)

Given a sample mean $X_n$ with finite expectation $μ$, for any $ϵ>0$, we have

$$\lim_{n\to\infty} P(|X_n - μ| < ϵ) = 1$$

The probability that $X_n$ gets close to $μ$ goes to 100% as $n$ approaches infinity, no matter what value of $ϵ$ we choose, whether 0.00001 or 1000.

- We say that $X_n$ *converges in probability* to $μ$.
- WLLN holds for both finite and infinite variance (though it is much harder to prove in the infinite-variance case).
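The statement above can be checked numerically. The sketch below (an illustration, not part of the original notes) uses Uniform(0, 1) draws as an assumed example distribution with $μ = 0.5$, and estimates $P(|X_n − μ| < ϵ)$ by repeating the experiment many times; the function names `sample_mean` and `prob_within_eps` are made up for this demo.

```python
import random

random.seed(0)

def sample_mean(n):
    # Mean of n draws from Uniform(0, 1); the true mean mu is 0.5.
    return sum(random.random() for _ in range(n)) / n

def prob_within_eps(n, eps, trials=2000):
    # Empirical estimate of P(|X_n - mu| < eps): repeat the whole
    # n-sample experiment `trials` times and count how often the
    # sample mean lands within eps of mu.
    mu = 0.5
    hits = sum(abs(sample_mean(n) - mu) < eps for _ in range(trials))
    return hits / trials

# The probability climbs toward 1 as n grows, as WLLN predicts.
for n in [10, 100, 1000]:
    print(n, prob_within_eps(n, eps=0.05))
```

For a fixed $ϵ = 0.05$, the estimated probability rises with $n$ and is essentially 1 by $n = 1000$.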

### Proof of correctness

From Finding the variance, we know that $Var(X_n) = \frac{σ^2}{n}$, where $σ^2 = Var(X_i)$ for any $i ∈ [1..n]$. So $SD(X_n) = \sqrt{Var(X_n)} = \frac{σ}{\sqrt{n}}$.
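The $Var(X_n) = σ^2/n$ identity is easy to sanity-check by simulation. The sketch below (illustrative only; `var_of_sample_mean` is a made-up helper) again assumes Uniform(0, 1) draws, for which a single draw has $σ^2 = 1/12$.

```python
import random
import statistics

random.seed(1)

def var_of_sample_mean(n, trials=5000):
    # Empirical variance of the sample mean of n Uniform(0, 1) draws,
    # estimated from `trials` independent repetitions.
    means = [sum(random.random() for _ in range(n)) / n for _ in range(trials)]
    return statistics.pvariance(means)

sigma2 = 1 / 12  # variance of a single Uniform(0, 1) draw
n = 50
# The two numbers should be close: empirical Var(X_n) vs sigma^2 / n.
print(var_of_sample_mean(n), sigma2 / n)
```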

Given the derivations above and a constant $ϵ>0$, Chebyshev’s inequality tells us

$$0 ≤ P(|X_n − μ| ≥ ϵ) ≤ \frac{σ^2/n}{ϵ^2} = \frac{σ^2}{nϵ^2}$$

The $0 ≤$ is not part of the original inequality, but it must hold because probabilities are never negative.

We can now analyze this:

- As $n→∞$, the denominator goes to infinity, so the upper bound goes to zero. Squeezed between that bound and 0, $P(|X_n − μ| ≥ ϵ)$ must also go to zero.
- This means that the complementary probability $P(|X_n − μ| < ϵ)$ approaches 100%.
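The Chebyshev bound from the proof can also be compared against simulation. The sketch below (illustrative; `tail_prob` is a made-up helper, Uniform(0, 1) is an assumed example with $μ = 0.5$, $σ^2 = 1/12$) shows the empirical tail probability sitting well below $\frac{σ^2}{nϵ^2}$, which is expected since Chebyshev's inequality is usually quite loose.

```python
import random

random.seed(2)

def tail_prob(n, eps, trials=3000):
    # Empirical P(|X_n - mu| >= eps) for Uniform(0, 1) draws (mu = 0.5).
    hits = sum(
        abs(sum(random.random() for _ in range(n)) / n - 0.5) >= eps
        for _ in range(trials)
    )
    return hits / trials

sigma2 = 1 / 12
eps = 0.05
for n in [20, 200]:
    bound = sigma2 / (n * eps * eps)
    # The bound can exceed 1 for small n, in which case it says nothing.
    print(n, tail_prob(n, eps), min(bound, 1.0))
```

As $n$ grows the bound $\frac{σ^2}{nϵ^2}$ shrinks toward 0, dragging the true tail probability down with it.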

## Strong law of large numbers (SLLN)

An even stronger claim than the WLLN can be made. In particular, the following is true:

$$P\left(\lim_{n\to\infty} X_n = μ\right) = 1$$

This completely eliminates the need for an $ϵ$ and directly states that, with probability 1, the sample mean converges to the population mean as the sample size approaches infinity.

- We say that $X_n$ *converges almost surely* (with probability 1) to $μ$.
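The almost-sure flavor of convergence is about a *single* realization of the random sequence: one fixed run of draws, whose running mean settles down to $μ$. The sketch below (illustrative only, again assuming Uniform(0, 1) draws with $μ = 0.5$) tracks the running mean along one sample path.

```python
import random

random.seed(3)

# One realization of the sequence, truncated at 100,000 draws.
draws = [random.random() for _ in range(100_000)]

# Running mean X_n along this single path.
running_total = 0.0
running_means = []
for i, x in enumerate(draws, start=1):
    running_total += x
    running_means.append(running_total / i)

# Along this one path, X_n wanders early on but homes in on mu = 0.5.
for n in [100, 10_000, 100_000]:
    print(n, running_means[n - 1])
```

Contrast this with the WLLN simulation earlier, which re-ran the whole experiment many times and looked at the probability across runs; here a single run suffices to illustrate the almost-sure statement.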