The **moment generating function** (MGF) of a r.v. $X$ is a function of $t$. It’s defined as $M_{X}:R→R$ with $M_{X}(t)=E(e^{tX})$.

Note that some choices of $t$ might cause the expectation to blow up (approach $∞$). This is fine, as long as $E(e^{tX})$ is finite for all $t$ in *some open interval* $(−a,a)$ around zero. Otherwise, the MGF of $X$ does not exist.

Here, $e^{tX}$ is a random variable that depends on $t$ and the original random variable $X$.

If the MGF exists, then at $t=0$ we have $M_{X}(0)=E(e^{0⋅X})=E(1)=1$. There’s no intuitive explanation for what $t$ really is. LOL.

## Usefulness

There are three theorems involving MGFs that we’ll use.

- If $M_{X}$ exists, then $E(X^{n})$ equals the $n$th derivative of the MGF, evaluated at zero. That is, we have $E(X^{n})=M_{X}^{(n)}(0)$ for $n=1,2,3,…$

For a not very rigorous proof, we can take the Taylor series of $e^{x}$, replace the $x$ with $tX$, and take expectations term by term; every coefficient in the resulting series corresponds to a moment of $X$.
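Sketching that argument in a few lines (assuming we may swap the expectation and the infinite sum):

$M_{X}(t)=E(e^{tX})=E\left(\sum_{n=0}^{∞}\frac{(tX)^{n}}{n!}\right)=\sum_{n=0}^{∞}E(X^{n})\frac{t^{n}}{n!}$

Differentiating $n$ times and plugging in $t=0$ kills every term except the one containing $E(X^{n})$.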

Example

Suppose $X∼Expo(1)$. We can then derive $M_{X}$, which turns out to be $M_{X}(t)=\frac{1}{1-t}=(1−t)^{−1}$ for $t<1$.

Then, to find the variance of $X$, we simply have

$Var(X)=E(X^{2})−(E(X))^{2}=M_{X}''(0)−(M_{X}'(0))^{2}=2−1=1$

Finding moments by differentiating the MGF is much easier than using LOTUS.
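As a quick numerical sanity check (my own sketch, not from the notes), we can approximate the first two derivatives of $M_{X}(t)=(1−t)^{−1}$ at zero with finite differences and recover $E(X)=1$, $E(X^{2})=2$, and $Var(X)=1$:

```python
# Finite-difference check of the moments of X ~ Expo(1),
# whose MGF is M_X(t) = 1 / (1 - t) for t < 1.
def mgf_expo1(t: float) -> float:
    assert t < 1, "MGF of Expo(1) only exists for t < 1"
    return 1.0 / (1.0 - t)

h = 1e-4
# Central differences for the first and second derivatives at t = 0.
m1 = (mgf_expo1(h) - mgf_expo1(-h)) / (2 * h)                   # ~ E(X)
m2 = (mgf_expo1(h) - 2 * mgf_expo1(0) + mgf_expo1(-h)) / h**2   # ~ E(X^2)
var = m2 - m1**2                                                # ~ Var(X)
print(m1, m2, var)
```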

- If $X$ and $Y$ have the same MGF, then they must also have the same distribution.
- If $X$ and $Y$ are independent and both of their MGFs exist, then the MGF of $X+Y$ is the *product* of their MGFs. We have:

$M_{X+Y}(t)=E(e^{t(X+Y)})=E(e^{tX}e^{tY})=E(e^{tX})E(e^{tY})=M_{X}(t)M_{Y}(t)$

where independence is what lets the expectation of the product factor into a product of expectations.

## Location-scale transformation

If $X$ has a MGF of $M_{X}(t)$, then the MGF of $a+bX$ is given by

$M_{a+bX}(t)=E(e^{t(a+bX)})=e^{at}⋅E(e^{btX})=e^{at}⋅M_{X}(bt)$

Since $b$ is a constant, we can basically pretend that we’re transforming $t$. If we let $u=bt$, then $E(e^{btX})=E(e^{uX})=M_{X}(u)$, which is just the MGF evaluated at $u$.
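We can spot-check this identity numerically (a sketch of mine, with arbitrary example values $a=2$, $b=0.5$, $t=0.3$): integrate $E(e^{t(a+bX)})$ directly for $X∼Expo(1)$ and compare against $e^{at}⋅M_{X}(bt)$ with $M_{X}(s)=(1−s)^{−1}$.

```python
import math

# For X ~ Expo(1) with PDF e^{-x} on x >= 0, compute E(e^{t(a+bX)})
# by a trapezoid-rule integral and compare to e^{at} * M_X(bt).
def mgf_shifted_scaled(a: float, b: float, t: float,
                       upper: float = 50.0, n: int = 200_000) -> float:
    h = upper / n
    total = 0.0
    for i in range(n + 1):
        x = i * h
        w = 0.5 if i in (0, n) else 1.0  # trapezoid endpoint weights
        total += w * math.exp(t * (a + b * x)) * math.exp(-x)
    return total * h

a, b, t = 2.0, 0.5, 0.3                  # need b*t < 1 for the MGF to exist
lhs = mgf_shifted_scaled(a, b, t)
rhs = math.exp(a * t) / (1.0 - b * t)    # e^{at} * M_X(bt)
print(abs(lhs - rhs))  # should be tiny
```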

This can be useful for finding the MGF of a general continuous distribution after we derive the MGF of the standard version of the distribution (e.g. uniform, exponential, Normal).
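For example, taking as given the standard Normal MGF $M_{Z}(t)=e^{t^{2}/2}$ (derived separately), the MGF of $X=μ+σZ∼N(μ,σ^{2})$ follows immediately from the location-scale rule:

$M_{X}(t)=e^{μt}⋅M_{Z}(σt)=e^{μt+σ^{2}t^{2}/2}$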

## Solving the MGF for a distribution

When we finish deriving an MGF for a distribution, we can do case analysis on what values $t$ can take on, and see where to define our neighborhood $(−a,a)$.

- We can pick the cutoff $a$ so that $M_{X}$ is well defined.
- Remember, as long as such an interval exists, the MGF of $X$ exists.
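Continuing the $Expo(1)$ example: $E(e^{tX})$ is finite exactly when $t<1$ (the defining integral diverges otherwise), so any cutoff $a$ with $0<a≤1$ gives a valid neighborhood $(−a,a)$, and the MGF exists.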

### Discrete variable

To solve for the MGF, begin with the definition of expectation. We can use LOTUS because $e^{tX}$ is really a transformation of $X$, with $g(X)=e^{tX}$.

$M_{X}(t)=E(e^{tX})=\sum_{x∈Support(X)}e^{tx}P(X=x)$

where $Support(X)$ denotes the set of all possible values of the random variable $X$, which we iterate over. We substitute $P(X=x)$ with whatever the relevant PMF is.
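As a small sketch of this sum (mine, not from the notes), take $X∼Bern(p)$, whose support is $\{0,1\}$; summing $e^{tx}P(X=x)$ over the support should match the closed form $M_{X}(t)=1−p+pe^{t}$:

```python
import math

# MGF of a discrete r.v. as a sum over its support (LOTUS),
# represented here as a dict mapping value -> probability.
def mgf_discrete(pmf: dict, t: float) -> float:
    return sum(math.exp(t * x) * p for x, p in pmf.items())

p = 0.3
bernoulli = {0: 1 - p, 1: p}   # P(X=0) = 1-p, P(X=1) = p
t = 0.7
lhs = mgf_discrete(bernoulli, t)
rhs = (1 - p) + p * math.exp(t)  # closed form for Bern(p)
print(abs(lhs - rhs))
```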

### Continuous variable

Instead of a summation, we use an integral over $(−∞,∞)$. Remember that the PDF is often zero outside of a specific range, so the integral often simplifies. We’re using LOTUS here as well.

$M_{X}(t)=E(e^{tX})=\int_{−∞}^{∞}e^{tx}f(x)\,dx$
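As a worked instance (the $Expo(1)$ example from earlier), the PDF is $f(x)=e^{−x}$ for $x≥0$ and zero elsewhere, so the integral collapses to $[0,∞)$:

$M_{X}(t)=\int_{0}^{∞}e^{tx}e^{−x}\,dx=\int_{0}^{∞}e^{(t−1)x}\,dx=\frac{1}{1−t}$

which converges exactly when $t<1$, recovering $M_{X}(t)=(1−t)^{−1}$.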