## Given an event

Let $A$ be an event with $P(A)>0$. Given a discrete random variable $Y$, the conditional expectation of $Y$ given $A$ is expressed as

$E(Y∣A)=∑_{y∈Support(Y)} y⋅P(Y=y∣A)$

If $Y$ is continuous, the sum becomes an integral instead.

$E(Y∣A)=∫_{−∞}^{∞} y⋅f_{Y}(y∣A)\,dy$

Here, $f_{Y}(y∣A)$ is called the *conditional PDF* and is defined as the derivative of the conditional CDF $F_{Y}(y∣A)=P(Y≤y∣A)$.
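The discrete formula can be sketched on a toy example (the die and the event here are invented for illustration): let $Y$ be a fair six-sided die and $A$ the event "$Y$ is even". Then $P(Y=y∣A)=P(Y=y)/P(A)$ for $y∈A$ and $0$ otherwise.

```python
from fractions import Fraction

# Hypothetical example: Y is a fair six-sided die, A = "Y is even".
support = range(1, 7)
p = {y: Fraction(1, 6) for y in support}      # P(Y = y)
A = {y for y in support if y % 2 == 0}        # the conditioning event

p_A = sum(p[y] for y in A)                    # P(A) = 1/2
# P(Y = y | A) = P(Y = y) / P(A) for y in A, and 0 outside A,
# so the sum over Support(Y) collapses to a sum over A.
cond_exp = sum(y * p[y] / p_A for y in A)     # E(Y | A)
print(cond_exp)                               # 4
```

Conditioning on $A$ redistributes the mass of $Y$ over the even outcomes, so the expectation shifts from $E(Y)=3.5$ to $E(Y∣A)=4$.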

## Given a random variable

Let $X$ and $Y$ be random variables. Let $g$ be the function that takes any real number $x∈Support(X)$ and produces $g(x)=E(Y∣X=x)$. Then the conditional expectation of $Y$ given $X$, written $E(Y∣X)$, is the *random variable* $g(X)$.

The distinction between what is a *constant* and what is a *random variable* is important.

- $E(Y)$ is a constant.
- $E(Y∣X=x)$ is also a constant, since $x$ is a fixed value.
- $E(Y∣X)$ is a *random variable*. It is defined as $g(X)$, a transformation of the random variable $X$ (here we do not know which value $x∈Support(X)$ the variable $X$ takes).
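The constant-vs-random-variable distinction above can be sketched concretely (the distributions here are invented for illustration): suppose $X$ is uniform on $\{1,2,3\}$ and, given $X=x$, $Y$ is uniform on $\{1,…,x\}$, so $g(x)=E(Y∣X=x)=(x+1)/2$.

```python
from fractions import Fraction

# Hypothetical example: X uniform on {1, 2, 3}; given X = x,
# Y is uniform on {1, ..., x}, so g(x) = E(Y | X = x) = (x + 1) / 2.
def g(x):
    return Fraction(x + 1, 2)

# For a fixed x, E(Y | X = x) is just a number...
e_given_2 = g(2)                                   # 3/2, a constant

# ...but E(Y | X) = g(X) is a random variable: it inherits a
# distribution from X (each value below has probability 1/3).
dist_of_gX = {g(x): Fraction(1, 3) for x in (1, 2, 3)}
```

Evaluating $g$ at a point gives a constant; composing $g$ with $X$ gives a new random variable whose distribution is pushed forward from that of $X$.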

### Properties

- If $X$ and $Y$ are independent, then $E(Y∣X)=E(Y)$.
- $E(h(X)⋅Y∣X)=h(X)⋅E(Y∣X)$ for any function $h$. Intuitively, we are given $X$, so we can treat $h(X)$ as a constant and pull it outside the expectation, just as with ordinary expectation properties. In other words, with $c=h(X)$ treated as known, $E(cY∣X)=c⋅E(Y∣X)$.
- $E(Y_{1}+Y_{2}∣X)=E(Y_{1}∣X)+E(Y_{2}∣X)$.
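The last two properties can be checked numerically on a small joint distribution (the pmf and the function $h$ below are invented for illustration), by computing $E(\,\cdot\,∣X=x)$ pointwise for each $x$:

```python
from fractions import Fraction

# Hypothetical joint pmf of (X, Y) on {0, 1} x {0, 1, 2}.
pmf = {(0, 0): Fraction(1, 4), (0, 1): Fraction(1, 4),
       (1, 1): Fraction(1, 4), (1, 2): Fraction(1, 4)}

def cond_exp(x, weight=lambda a, b: b):
    """E(weight(X, Y) | X = x) under pmf; default weight gives E(Y | X = x)."""
    p_x = sum(p for (a, _), p in pmf.items() if a == x)
    return sum(weight(a, b) * p for (a, b), p in pmf.items() if a == x) / p_x

h = lambda x: 3 * x + 1   # an arbitrary function of X

for x in (0, 1):
    # "Taking out what is known": E(h(X)·Y | X = x) = h(x)·E(Y | X = x)
    assert cond_exp(x, lambda a, b: h(a) * b) == h(x) * cond_exp(x)
    # Linearity: E(Y + Y² | X = x) = E(Y | X = x) + E(Y² | X = x)
    assert cond_exp(x, lambda a, b: b + b**2) == \
        cond_exp(x) + cond_exp(x, lambda a, b: b**2)
```

Since $E(Y∣X)$ is the random variable $g(X)$, verifying an identity at every $x∈Support(X)$ verifies it for the random variable as a whole.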