When does the MGF of A-C exist?

On Math Stack Exchange, purpleostrich asked, “Consider random variables A, B, and C. We know that A = B + C. We also know that A and C have an MGF. Is it the case that B must have an MGF?”

Here is my answer:

You Can’t Compute the MGF

In general, you can’t compute the MGF of $B$ if you only know the MGFs of $A$ and $C$. For example, consider two possible joint distributions of $A$ and $C$:

Case 1: $P(A=0 \text{ and } C=0) = 1/2$ and $P(A=1 \text{ and } C=1) = 1/2$. In this case the MGFs of $A$ and $C$ are both $(1+\exp(t))/2$, and since $B = A - C = 0$ with probability 1, the MGF of $B$ is $1$.

Case 2: $P(A=0 \text{ and } C=1) = 1/2$ and $P(A=1 \text{ and } C=0) = 1/2$. In this case the MGFs of $A$ and $C$ are again $(1+\exp(t))/2$, but $B = A - C$ takes the values $-1$ and $1$ with probability $1/2$ each, so the MGF of $B$ is $\frac{\exp(-t)+\exp(t)}{2}$.

Notice that in both cases the MGFs of $A$ and $C$ are $(1+\exp(t))/2$, yet the MGF of $B$ differs between Case 1 and Case 2.
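As a quick sanity check, here is a small Python sketch (my own illustration, not part of the original answer; the choice of $t$ values is arbitrary) that evaluates the three MGFs exactly for both joint distributions. It should print identical values for $M_A(t)$ and $M_C(t)$ in the two cases, but different values for $M_B(t)$.

```python
import numpy as np

def mgf(values, probs, t):
    """Exact MGF E[exp(t*X)] of a discrete random variable with the given atoms."""
    return float(np.sum(probs * np.exp(t * values)))

probs = np.array([0.5, 0.5])
# Joint atoms (a, c), each with probability 1/2; B = A - C in both cases.
cases = {
    "Case 1": np.array([[0, 0], [1, 1]]),
    "Case 2": np.array([[0, 1], [1, 0]]),
}

for t in (-1.0, 0.5, 2.0):
    for name, ac in cases.items():
        a, c = ac[:, 0], ac[:, 1]
        b = a - c
        print(f"{name}, t={t}: M_A={mgf(a, probs, t):.4f}, "
              f"M_C={mgf(c, probs, t):.4f}, M_B={mgf(b, probs, t):.4f}")
```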


You Can Prove the MGF Exists

Although you can’t compute the MGF of $B$, you can prove that $M_B(t)$ exists for $t\in D=\frac12 \left(Dom(M_A)\cap (-Dom(M_C))\right)$. Suppose $t\in D$. Then $2t\in Dom(M_A)$ and $-2t\in Dom(M_C)$, so $||\exp(2ta)||_1<\infty$ and $||\exp(-2tc)||_1<\infty$, where $||g||_p=\left(\int\int |g(a,c)|^p\; f(a,c)\; da\; dc\right)^{1/p}$ is the $L_p$-norm of $g$ over the joint probability space and $f(a,c)$ is the joint pdf of $A$ and $C$. That implies $||\exp(ta)||_2 < \infty$ and $||\exp(-tc)||_2 < \infty$. By Hölder’s inequality or, more specifically, the Schwarz inequality,
$$||\exp(ta)\exp(-tc)||_1 \le ||\exp(ta)||_2\,||\exp(-tc)||_2 < \infty.$$
But
$$||\exp(ta)\exp(-tc)||_1 = ||\exp(t(a-c))||_1 = E[\exp(tB)] = M_B(t).$$
This proves that $M_B(t)$ exists for $t\in D$.
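The Schwarz step can also be checked numerically. Below is a minimal Monte Carlo sketch (the particular dependent pair, sample size, and $t$ values are assumptions of mine for illustration) that compares an estimate of $M_B(t)$ with the bound $\sqrt{M_A(2t)\,M_C(-2t)}$ for a deliberately non-independent $A$ and $C$; the estimate should always sit below the bound.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# A non-independent example: C standard normal, B correlated with C, A = B + C.
c = rng.standard_normal(n)
b = 0.5 * c + rng.standard_normal(n)   # B deliberately depends on C
a = b + c

for t in (0.25, 0.5, 1.0):
    m_b = np.mean(np.exp(t * b))                                   # estimate of M_B(t)
    bound = np.sqrt(np.mean(np.exp(2 * t * a)) * np.mean(np.exp(-2 * t * c)))
    print(f"t={t}: M_B(t) ~ {m_b:.4f} <= sqrt(M_A(2t)*M_C(-2t)) ~ {bound:.4f}")
```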


If A and C Are Independent

If $A$ and $C$ are independent and $B = A-C$, then it must be the case that
$$
M_B(t) = M_A(t)\cdot M_C(-t)
$$
whenever $t\in Dom(M_A)\cap(-Dom(M_C))$ (see, e.g., Wikipedia). Here is a rough proof.

If $t\in Dom(M_A)\cap(-Dom(M_C))$, then
$$M_A(t)\cdot M_C(-t) = \int_{a=-\infty}^\infty \exp(t a) dF_A(a) \cdot \int_{c=-\infty}^\infty \exp(-t c) dF_C(c)$$
$$
= \int_{a=-\infty}^\infty \int_{c=-\infty}^\infty \exp(t (a-c)) dF_A(a) dF_C(c)
$$
$$
= \int_{b=-\infty}^\infty \exp(t b) dF_B(b) = M_B(t)
$$
where $F_A, F_B$, and $F_C$ are the cumulative distribution functions of $A, B$, and $C$ respectively. The second equality combines the two integrals using Tonelli’s theorem (the integrand is nonnegative), and the third uses independence: $dF_A(a)\,dF_C(c)$ is the joint distribution of $(A, C)$, so the double integral is exactly $E[\exp(t(A-C))] = E[\exp(tB)]$.
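As a concrete illustration of the identity (my own choice of distributions, not part of the original question), here is a short Monte Carlo sketch with independent $A\sim N(0,1)$ and $C\sim \text{Exponential}(1)$, for which $M_A(t)\,M_C(-t)=e^{t^2/2}/(1+t)$ on $t>-1$. The empirical estimate of $M_B(t)$ should be close to that closed form.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Independent example: A ~ Normal(0, 1), C ~ Exponential(1), B = A - C.
a = rng.standard_normal(n)
c = rng.exponential(1.0, n)
b = a - c

for t in (0.25, 0.5, 0.9):                 # need t > -1 so that M_C(-t) = 1/(1 + t) exists
    mc_estimate = np.mean(np.exp(t * b))   # Monte Carlo estimate of M_B(t)
    closed_form = np.exp(t**2 / 2) / (1 + t)   # M_A(t) * M_C(-t)
    print(f"t={t}: Monte Carlo {mc_estimate:.4f} vs. M_A(t)*M_C(-t) {closed_form:.4f}")
```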