In this post, we discuss some basic properties of mixtures (see these two previous posts for examples of mixtures – The law of total probability and Examples of mixtures). We also present the example of the negative binomial distribution. We show that the negative binomial distribution is a mixture of Poisson distributions with gamma mixing weights.

A random variable $X$ is a mixture if its distribution function is a weighted sum (or integral) of a family of distribution functions $F_{X \mid \Theta}(x \mid \theta)$, where $\Theta$ is the mixing random variable. More specifically, $X$ is a mixture if its distribution function $F_X$ is of one of the following two forms:

$$F_X(x)=\sum_{\theta} F_{X \mid \Theta}(x \mid \theta) \ P(\Theta=\theta)$$

$$F_X(x)=\int_{-\infty}^{\infty} F_{X \mid \Theta}(x \mid \theta) \ g(\theta) \ d\theta$$

where $g$ is the density function of the mixing variable $\Theta$.

In the first case, $X$ is a discrete mixture (i.e. the discrete random variable $\Theta$ provides the weights). In the second case, $X$ is a continuous mixture (i.e. the continuous random variable $\Theta$ provides the weights). In either case, the random variable $\Theta$ is said to be the mixing variable (and its distribution is the mixing distribution). In some probability and statistics texts, the notion of mixtures is called compounding.
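As a concrete illustration (a hypothetical two-class example, not taken from the post), the following Python sketch computes the distribution function of a discrete mixture of two Poisson distributions as the weighted sum of the conditional distribution functions:

```python
import math

def poisson_cdf(x, lam):
    """P(N <= x) for a Poisson random variable with mean lam."""
    return sum(math.exp(-lam) * lam**k / math.factorial(k)
               for k in range(int(x) + 1))

# Hypothetical mixing distribution: the conditional Poisson mean is 1 with
# probability 0.7 and 4 with probability 0.3
weights = {1.0: 0.7, 4.0: 0.3}

def mixture_cdf(x):
    """F_X(x) = sum over theta of F_{X|Theta}(x | theta) * P(Theta = theta)."""
    return sum(w * poisson_cdf(x, lam) for lam, w in weights.items())
```

The mixture distribution function inherits the properties of a distribution function automatically, since it is a convex combination of distribution functions.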

Mixtures arise in many settings. The notion of mixtures is important in insurance applications (e.g. when the risk class of a policyholder is uncertain). The distribution for modeling the random loss for an insured risk (or a group of insured risks) is often a mixture distribution. A discrete mixture arises when the risk classification is discrete. A continuous mixture is important for situations where the random loss distribution has an uncertain risk parameter and the risk parameter follows a continuous distribution. See these two previous posts for examples of mixtures – The law of total probability and Examples of mixtures.

**Unconditional Expectation**

Let $X$ be a mixture and $\Theta$ be the mixing variable. Let $h$ be a continuous function. We show the following fact about unconditional expectation. This formula is used below for establishing basic facts of mixtures.

$$E[h(X)]=E\left[E[h(X) \mid \Theta]\right] \ \ \ \ \ (*)$$

The following is the derivation (shown for a continuous mixing variable $\Theta$ with density $g$; the discrete case is analogous with the integral replaced by a sum):

$$\begin{aligned}E[h(X)]&=\int_{-\infty}^{\infty} h(x) \ f_X(x) \ dx\\&=\int_{-\infty}^{\infty} h(x) \int_{-\infty}^{\infty} f_{X \mid \Theta}(x \mid \theta) \ g(\theta) \ d\theta \ dx\\&=\int_{-\infty}^{\infty} \left[\int_{-\infty}^{\infty} h(x) \ f_{X \mid \Theta}(x \mid \theta) \ dx\right] g(\theta) \ d\theta\\&=\int_{-\infty}^{\infty} E[h(X) \mid \Theta=\theta] \ g(\theta) \ d\theta\\&=E\left[E[h(X) \mid \Theta]\right] \end{aligned}$$
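The unconditional expectation formula (*) can also be checked numerically. A minimal sketch, using a hypothetical discrete mixture of two Poisson distributions and $h(x)=x^2$ (for a Poisson distribution with mean $\lambda$, $E[X^2 \mid \lambda]=\lambda+\lambda^2$):

```python
import math

def pois_pmf(k, lam):
    """Poisson probability P(X = k) for mean lam."""
    return math.exp(-lam) * lam**k / math.factorial(k)

# Hypothetical mixing distribution over two Poisson means
weights = {1.0: 0.6, 3.0: 0.4}

def h(x):
    return x**2

# Left side of (*): E[h(X)] computed from the unconditional (mixture) pmf
lhs = sum(h(k) * sum(w * pois_pmf(k, lam) for lam, w in weights.items())
          for k in range(150))

# Right side of (*): E[ E[h(X) | Theta] ]; for a Poisson mean lam,
# E[X^2 | lam] = lam + lam^2
rhs = sum(w * (lam + lam**2) for lam, w in weights.items())
```

Truncating the left-hand sum at $k=150$ is harmless here, since the Poisson tail beyond that point is negligible for these means.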

**Basic Properties of a Mixture**

Let $X$ be a mixture with mixing variable $\Theta$. The unconditional mean, variance and moment generating function of $X$ are:

$$E[X]=E\left[E[X \mid \Theta]\right] \ \ \ \ \ \ \ \ \ \ (1)$$

$$Var[X]=E\left[Var[X \mid \Theta]\right]+Var\left[E[X \mid \Theta]\right] \ \ \ \ \ \ \ \ \ \ (2)$$

$$M_X(t)=E\left[M_{X \mid \Theta}(t)\right] \ \ \ \ \ \ \ \ \ \ (3)$$

**Discussion of (1)**

The statement (1) is called the total expectation and follows from the unconditional expectation formula (*) above with $h(x)=x$. Suppose $X$ is the random loss amount for an insured whose risk class is uncertain. The formula (1) states that the average loss is the average of averages. The idea is to find the average loss for each risk class and then take the weighted average of these averages according to the distribution of the mixing variable $\Theta$ (e.g. the distribution of the policyholders by risk class).
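The "average of averages" idea can be sketched with made-up numbers (the class probabilities and class means below are hypothetical, chosen only for illustration):

```python
# Hypothetical block of policies in three risk classes.  Formula (1) says the
# unconditional mean loss is the weighted average of the class means.
class_probs = {"low": 0.5, "medium": 0.3, "high": 0.2}        # P(Theta = class)
class_means = {"low": 100.0, "medium": 250.0, "high": 700.0}  # E[X | Theta = class]

overall_mean = sum(class_probs[c] * class_means[c] for c in class_probs)
```

With these numbers, the overall mean is $0.5 \cdot 100 + 0.3 \cdot 250 + 0.2 \cdot 700 = 265$.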

**Discussion of (2)**

The statement (2) is called the total variance. Sticking with the example of insured risks in a block of insurance policies, the total variance of the random loss for an insured comes from two sources – the average of the variation within each risk class plus the variation of the average loss across risk classes. As we will see in the example below and in subsequent posts, the uncertainty in a risk parameter in the distribution of $X$ (through the conditioning on $\Theta$) has the effect of increasing the variance of the unconditional random loss. The following derivations establish the formula for total variance. Applying (*) with $h(x)=x^2$,

$$\begin{aligned}E[X^2]&=E\left[E[X^2 \mid \Theta]\right]\\&=E\left[Var[X \mid \Theta]+E[X \mid \Theta]^2\right]\\&=E\left[Var[X \mid \Theta]\right]+E\left[E[X \mid \Theta]^2\right] \end{aligned}$$

On the other hand,

$$E[X]^2=E\left[E[X \mid \Theta]\right]^2$$

Subtracting,

$$\begin{aligned}Var[X]&=E[X^2]-E[X]^2\\&=E\left[Var[X \mid \Theta]\right]+E\left[E[X \mid \Theta]^2\right]-E\left[E[X \mid \Theta]\right]^2\\&=E\left[Var[X \mid \Theta]\right]+Var\left[E[X \mid \Theta]\right] \end{aligned}$$
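The total variance formula (2) can be verified numerically for a discrete mixture. A minimal sketch, using a hypothetical equal-weight mixture of two Poisson distributions (for a Poisson distribution, the conditional mean and variance both equal $\lambda$):

```python
import math

def pois_pmf(k, lam):
    """Poisson probability P(X = k) for mean lam."""
    return math.exp(-lam) * lam**k / math.factorial(k)

# Hypothetical mixing distribution: Theta takes the value 2 or 5, each with prob 0.5
weights = {2.0: 0.5, 5.0: 0.5}

# Unconditional mean and second moment computed directly from the mixture pmf
mean = sum(k * sum(w * pois_pmf(k, lam) for lam, w in weights.items())
           for k in range(150))
second = sum(k**2 * sum(w * pois_pmf(k, lam) for lam, w in weights.items())
             for k in range(150))
direct_var = second - mean**2

# Total variance formula (2): E[Var(X | Theta)] + Var(E[X | Theta])
# For a Poisson distribution, Var(X | lam) = E(X | lam) = lam
e_var = sum(w * lam for lam, w in weights.items())    # E[Var(X | Theta)]
e_mean = e_var                                        # E[E(X | Theta)] (same sum here)
var_mean = sum(w * lam**2 for lam, w in weights.items()) - e_mean**2
total_var = e_var + var_mean
```

Note that `total_var` exceeds `e_var`: the variation of the class means contributes a strictly positive second term, which is exactly the variance-increasing effect of the uncertain risk parameter discussed above.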

**Discussion of (3)**

This follows from the unconditional expectation formula (*) above with $h(x)=e^{tx}$. Note that

$$M_X(t)=E[e^{tX}]=E\left[E[e^{tX} \mid \Theta]\right]=E\left[M_{X \mid \Theta}(t)\right]$$

**Example**

We show that the negative binomial distribution is a mixture of a family of Poisson distributions with gamma mixing weights. We then derive the mean, variance and moment generating function of the negative binomial distribution.

The following is the conditional Poisson probability function:

$$P(X=k \mid \Lambda=\lambda)=\frac{e^{-\lambda} \ \lambda^k}{k!}$$

where $k=0,1,2,\dots$

The Poisson parameter $\Lambda$ is uncertain and it follows a gamma distribution with parameters $\alpha$ and $\beta$ and with the following density function:

$$g(\lambda)=\frac{\beta^\alpha}{\Gamma(\alpha)} \ \lambda^{\alpha-1} \ e^{-\beta \lambda}, \ \ \ \ \lambda>0$$

The unconditional probability function of $X$ is:

$$\begin{aligned}P(X=k)&=\int_0^{\infty} \frac{e^{-\lambda} \ \lambda^k}{k!} \ \frac{\beta^\alpha}{\Gamma(\alpha)} \ \lambda^{\alpha-1} \ e^{-\beta \lambda} \ d\lambda\\&=\frac{\beta^\alpha}{k! \ \Gamma(\alpha)} \int_0^{\infty} \lambda^{\alpha+k-1} \ e^{-(\beta+1) \lambda} \ d\lambda\\&=\frac{\beta^\alpha}{k! \ \Gamma(\alpha)} \ \frac{\Gamma(\alpha+k)}{(\beta+1)^{\alpha+k}}\\&=\frac{\Gamma(\alpha+k)}{\Gamma(\alpha) \ k!} \ \left(\frac{\beta}{\beta+1}\right)^\alpha \left(\frac{1}{\beta+1}\right)^k \end{aligned}$$

Note that the above unconditional probability function is that of a negative binomial distribution with parameters $r=\alpha$ and $p=\frac{\beta}{\beta+1}$. Note that $E[X \mid \Lambda]=\Lambda$ and $Var[X \mid \Lambda]=\Lambda$. We now compute the unconditional mean $E[X]$, the total variance $Var[X]$ and the moment generating function $M_X(t)$. Recall that the mgf of the gamma mixing distribution is $M_\Lambda(t)=\left(\frac{\beta}{\beta-t}\right)^\alpha$ for $t<\beta$.

$$E[X]=E\left[E[X \mid \Lambda]\right]=E[\Lambda]=\frac{\alpha}{\beta}$$

$$Var[X]=E\left[Var[X \mid \Lambda]\right]+Var\left[E[X \mid \Lambda]\right]=E[\Lambda]+Var[\Lambda]=\frac{\alpha}{\beta}+\frac{\alpha}{\beta^2}=\frac{\alpha}{\beta} \ \frac{\beta+1}{\beta}$$
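The identity between the negative binomial pmf and the Poisson-gamma mixture can be checked numerically. A sketch with hypothetical parameter values $\alpha=2.5$ and $\beta=1.5$, comparing the closed form against a direct numerical integration of the Poisson pmf weighted by the gamma density:

```python
import math

alpha, beta = 2.5, 1.5   # hypothetical gamma parameters

def nb_pmf(k):
    """Closed-form negative binomial pmf with r = alpha, p = beta/(beta+1)."""
    return (math.gamma(alpha + k) / (math.gamma(alpha) * math.factorial(k))
            * (beta / (beta + 1))**alpha * (1.0 / (beta + 1))**k)

def mixture_pmf(k, n=100000, upper=50.0):
    """Integrate the Poisson pmf against the gamma density (midpoint rule)."""
    h = upper / n
    total = 0.0
    for i in range(n):
        lam = (i + 0.5) * h
        poisson = math.exp(-lam) * lam**k / math.factorial(k)
        gamma_density = (beta**alpha / math.gamma(alpha)
                         * lam**(alpha - 1) * math.exp(-beta * lam))
        total += poisson * gamma_density * h
    return total
```

Truncating the integral at $\lambda = 50$ is safe here because the gamma density decays like $e^{-1.5\lambda}$.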

To derive the moment generating function, note that the conditional Poisson mgf is $M_{X \mid \Lambda}(t)=e^{\Lambda (e^t-1)}$. By (3),

$$M_X(t)=E\left[e^{\Lambda (e^t-1)}\right]=M_\Lambda(e^t-1)=\left(\frac{\beta}{\beta-(e^t-1)}\right)^\alpha, \ \ \ \ e^t-1<\beta$$
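As a sanity check (again with the hypothetical values $\alpha=2.5$, $\beta=1.5$), the closed-form mgf can be compared with the truncated series $\sum_k e^{tk} P(X=k)$, building the negative binomial pmf recursively to avoid large factorials:

```python
import math

alpha, beta = 2.5, 1.5          # hypothetical gamma parameters
t = 0.3                         # must satisfy e^t - 1 < beta

# Closed form derived above: M_X(t) = (beta / (beta - (e^t - 1)))^alpha
mgf_closed = (beta / (beta - (math.exp(t) - 1)))**alpha

# Direct series: M_X(t) = sum over k of e^{t k} P(X = k), with the negative
# binomial pmf built recursively: P(X=k+1) = P(X=k) * (alpha + k) / ((k+1)(beta+1))
mgf_series = 0.0
pmf = (beta / (beta + 1))**alpha        # P(X = 0)
for k in range(400):
    mgf_series += math.exp(t * k) * pmf
    pmf *= (alpha + k) / ((k + 1) * (beta + 1))
```

The series terms shrink geometrically (ratio roughly $e^t/(\beta+1) \approx 0.54$ here), so truncating at 400 terms is more than enough.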

**Comment**

In the above example, note that $Var[X]=\frac{\alpha}{\beta} \ \frac{\beta+1}{\beta}>\frac{\alpha}{\beta}=E[X]$. This stands in contrast with the Poisson distribution (where $Var[X]=E[X]$) and with the binomial distribution (where $Var[X]<E[X]$). In the above example, there is uncertainty in the risk parameter $\Lambda$ in the conditional Poisson distribution. The additional uncertainty causes the unconditional variance to increase.