# Examples of mixtures

The previous post introduced the notion of mixtures. A random variable $X$ is a mixture if its distribution function is a weighted average of a family of distribution functions. In this post, we present two more examples, one discrete and one continuous.

Comment
In the definition of mixtures, the distribution function is a weighted sum (or integral) of the conditional distribution functions. It is easy to verify that for a random variable that is a mixture, its probability function (the discrete case) or the probability density function (the continuous case) is also the weighted sum (or integral) of the conditional probability functions or conditional probability density functions. In the following examples, we derive the density functions rather than the distribution functions.
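In symbols, if the mixing parameter $Y$ is discrete with probability function $P_Y$, or continuous with density function $f_Y$, the unconditional density of $X$ is:

$\displaystyle f_X(x)=\sum \limits_{y} f_{X \lvert Y}(x \lvert y) \thinspace P_Y(y) \ \ \ \ \ \ \text{or} \ \ \ \ \ \ f_X(x)=\int f_{X \lvert Y}(x \lvert y) \thinspace f_Y(y) \thinspace dy$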

Example 1
Consider the following family of gamma density functions, where the shape parameter is $\alpha+1$ with $\alpha$ taking on nonnegative integer values, and the rate parameter $\lambda$ is known:

$\displaystyle f_{X \lvert Y}(x \lvert \alpha)=\frac{\lambda^{\alpha+1}}{\alpha!} x^{\alpha} e^{-\lambda x}$

The parameter $Y=\alpha$ follows a geometric distribution with the following probability function:

$P_Y(\alpha)=p \thinspace (1-p)^\alpha$ where $\alpha=0,1,2,3,...$ and $0<p<1$.

Then $X$ has an exponential distribution with parameter $\lambda p$. That is, the unconditional density function of $X$ is $\displaystyle f_X(x)=\lambda p \thinspace e^{-\lambda p x}$.

Example 2
In a particular block of insurance policies of an auto insurer, the claim frequency for a policyholder during a given year is modeled using a binomial distribution with $n=2$ and $p=\lambda$. There is uncertainty in the true value of the risk parameter $p=\lambda$. The insurer uses a beta distribution to model the parameter $p=\lambda$ (i.e. $\lambda$ ranges from 0 to 1 according to a beta distribution). A new customer has just purchased a policy. Find the probability mass function of the number of claims in the next year.

Discussion of Example 1
Since the mixing weights come from a discrete distribution, we have:

$\displaystyle f_X(x)=\sum \limits_{\alpha=0}^{\infty} f_{X \lvert Y}(x \lvert \alpha) \thinspace P_Y(\alpha)$

After plugging in the appropriate components, we have the following:

$\displaystyle f_X(x)=\sum \limits_{\alpha=0}^{\infty} \frac{\lambda^{\alpha+1}}{\alpha!} \thinspace x^{\alpha} \thinspace e^{-\lambda x} \thinspace p \thinspace (1-p)^\alpha$

Factoring out the terms that do not depend on $\alpha$ and recognizing the series for the exponential function, the above sum simplifies as follows:

$\displaystyle f_X(x)=\lambda p \thinspace e^{-\lambda x} \sum \limits_{\alpha=0}^{\infty} \frac{[\lambda (1-p) x]^{\alpha}}{\alpha!}=\lambda p \thinspace e^{-\lambda x} \thinspace e^{\lambda (1-p) x}=\lambda p \thinspace e^{-\lambda p \thinspace x}$.
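As a quick sanity check, the mixture sum can be evaluated numerically and compared against the exponential density. The following is a minimal sketch; the values $\lambda=2$ and $p=0.3$ are arbitrary choices for illustration, and logarithms are used so that large factorials do not overflow.

```python
import math

lam, p = 2.0, 0.3  # arbitrary illustrative values of lambda and p

def mixture_density(x, terms=200):
    # Direct evaluation of the mixture sum:
    # sum over alpha of [lam^(alpha+1)/alpha! * x^alpha * e^(-lam x)] * p (1-p)^alpha
    total = 0.0
    for a in range(terms):
        log_term = ((a + 1) * math.log(lam) - math.lgamma(a + 1)
                    + a * math.log(x) - lam * x
                    + math.log(p) + a * math.log(1 - p))
        total += math.exp(log_term)
    return total

def exponential_density(x):
    # Claimed unconditional density: exponential with rate lam * p
    return lam * p * math.exp(-lam * p * x)

for x in (0.1, 1.0, 5.0):
    assert abs(mixture_density(x) - exponential_density(x)) < 1e-9
```

The agreement at several values of $x$ supports the closed-form simplification above; the choice of 200 terms is more than enough for the geometric weights to become negligible.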

Discussion of Example 2
The following is the probability function of the conditional claim frequency:

$\displaystyle P_{N \lvert \Lambda}(n \lvert \lambda)=\binom{2}{n} \lambda^n (1-\lambda)^{2-n}$ where $n=0,1,2$.

On the other hand, the parameter $\Lambda$ has the beta distribution with parameters $\alpha$ and $\beta$. The density function is as follows:

$\displaystyle f_{\Lambda}(\lambda)=\frac{\Gamma(\alpha+\beta)}{\Gamma(\alpha) \Gamma(\beta)} \thinspace \lambda^{\alpha-1} \thinspace (1-\lambda)^{\beta-1}$ where $\Gamma(\cdot)$ is the gamma function.

The following is the unconditional probability function of $N$:

$\displaystyle P_N(n)=\int_0^1 P_{N \lvert \Lambda}(n \lvert \lambda) \thinspace f_{\Lambda}(\lambda) \thinspace d \lambda$ where $n=0,1,2$.

The following sets up $P_N(n)$ for each $n=0,1,2$.

$\displaystyle P_N(0) = \int_0^1 (1-\lambda)^2 \frac{\Gamma(\alpha+\beta)}{\Gamma(\alpha) \Gamma(\beta)} \thinspace \lambda^{\alpha-1} \thinspace (1-\lambda)^{\beta-1} \thinspace d \lambda$

$\displaystyle P_N(1)=\int_0^1 2 \lambda (1-\lambda) \frac{\Gamma(\alpha+\beta)}{\Gamma(\alpha) \Gamma(\beta)} \thinspace \lambda^{\alpha-1} \thinspace (1-\lambda)^{\beta-1} \thinspace d \lambda$

$\displaystyle P_N(2) = \int_0^1 \lambda^2 \frac{\Gamma(\alpha+\beta)}{\Gamma(\alpha) \Gamma(\beta)} \thinspace \lambda^{\alpha-1} \thinspace (1-\lambda)^{\beta-1} \thinspace d \lambda$

Each of the above integrals is a beta integral. Using the identity $\displaystyle \int_0^1 \lambda^{a-1} (1-\lambda)^{b-1} \thinspace d \lambda=\frac{\Gamma(a) \Gamma(b)}{\Gamma(a+b)}$ together with $\Gamma(t+1)=t \thinspace \Gamma(t)$, the calculation produces the following results. The resulting unconditional distribution of $N$ is known as the beta-binomial distribution.

$\displaystyle P_N(0)=\frac{\beta (\beta+1)}{(\alpha+\beta) (\alpha+\beta+1)}$

$\displaystyle P_N(1)=\frac{2 \alpha \beta}{(\alpha+\beta) (\alpha+\beta+1)}$

$\displaystyle P_N(2)=\frac{\alpha (\alpha+1)}{(\alpha+\beta) (\alpha+\beta+1)}$
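The three closed-form probabilities can be verified numerically by approximating the mixing integrals directly. The following is a minimal sketch; the parameter values $\alpha=2$ and $\beta=3$ are arbitrary choices for illustration.

```python
import math

alpha, beta = 2.0, 3.0  # arbitrary illustrative beta parameters

def beta_density(lam):
    # Beta(alpha, beta) density of the risk parameter lambda
    return (math.gamma(alpha + beta) / (math.gamma(alpha) * math.gamma(beta))
            * lam ** (alpha - 1) * (1 - lam) ** (beta - 1))

def binomial_pmf(n, lam):
    # Conditional claim-count pmf: binomial with 2 trials, success prob lambda
    return math.comb(2, n) * lam ** n * (1 - lam) ** (2 - n)

def p_n_numeric(n, steps=100_000):
    # Midpoint-rule approximation of the mixing integral over (0, 1)
    h = 1.0 / steps
    return sum(binomial_pmf(n, (i + 0.5) * h) * beta_density((i + 0.5) * h) * h
               for i in range(steps))

def p_n_closed(n):
    # Closed-form answers derived above
    s = (alpha + beta) * (alpha + beta + 1)
    return {0: beta * (beta + 1) / s,
            1: 2 * alpha * beta / s,
            2: alpha * (alpha + 1) / s}[n]

for n in range(3):
    assert abs(p_n_numeric(n) - p_n_closed(n)) < 1e-6

# The three probabilities form a valid probability mass function
assert abs(sum(p_n_closed(n) for n in range(3)) - 1.0) < 1e-12
```

The final assertion also confirms that the three probabilities sum to 1, as any probability mass function must.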