# Compound mixed Poisson distribution

Let the random sum $Y=X_1+X_2+ \cdots +X_N$ be the aggregate claims generated in a fixed period by an independent group of insureds. When the number of claims $N$ follows a Poisson distribution, the sum $Y$ is said to have a compound Poisson distribution. When the number of claims $N$ has a mixed Poisson distribution, the sum $Y$ is said to have a compound mixed Poisson distribution. A mixed Poisson distribution is a Poisson distribution whose parameter $\Lambda$ is uncertain: $N$ is a mixture of a family of Poisson distributions $N(\Lambda)$ and the random variable $\Lambda$ specifies the mixing weights. In this post, we present several basic properties of compound mixed Poisson distributions. In a previous post (Compound negative binomial distribution), we showed that the compound negative binomial distribution is an example of a compound mixed Poisson distribution (with gamma mixing weights).

In terms of notation, we have:

• $Y=X_1+X_2+ \cdots +X_N$,
• $N \sim$ Poisson$(\Lambda)$,
• $\Lambda \sim$ some unspecified distribution.

The following presents basic properties of the compound mixed Poisson $Y$ in terms of the mixing weights $\Lambda$ and the claim amount random variable $X$.

Mean and Variance

$\displaystyle E[Y]=E[\Lambda] E[X]$

$\displaystyle Var[Y]=E[\Lambda] E[X^2]+Var[\Lambda] E[X]^2$
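These two formulas are consistent with the general compound-sum identities $E[Y]=E[N] E[X]$ and $Var[Y]=E[N] Var[X]+Var[N] E[X]^2$, since for a mixed Poisson $E[N]=E[\Lambda]$ and $Var[N]=E[\Lambda]+Var[\Lambda]$. A minimal numeric sketch, with illustrative moment values (my own choices, not from the post):

```python
# Consistency check: mixed Poisson formulas vs. the general compound-sum
# formulas, using E[N] = E[Lambda] and Var[N] = E[Lambda] + Var[Lambda].
# Illustrative numbers (assumptions): Lambda moments and claim moments.
E_L, Var_L = 1.5, 0.75          # E[Lambda], Var[Lambda]
E_X, E_X2 = 1.4, 2.2            # E[X], E[X^2], e.g. X = 1 or 2 w.p. 0.6 / 0.4

# Mixed Poisson formulas from this post
mean_mixed = E_L * E_X
var_mixed = E_L * E_X2 + Var_L * E_X ** 2

# General compound-sum formulas
E_N, Var_N = E_L, E_L + Var_L
Var_X = E_X2 - E_X ** 2
mean_general = E_N * E_X
var_general = E_N * Var_X + Var_N * E_X ** 2

assert abs(mean_mixed - mean_general) < 1e-12
assert abs(var_mixed - var_general) < 1e-12
```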

Moment Generating Function

$\displaystyle M_Y(t)=M_{\Lambda}[M_X(t)-1]$

Cumulant Generating Function

$\displaystyle \Psi_Y(t)=\ln M_{\Lambda}[M_X(t)-1]=\Psi_{\Lambda}[M_X(t)-1]$
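Assuming gamma mixing weights (the case treated in the negative binomial post; $\Lambda$ is otherwise unspecified here), the MGF identity can be checked numerically: if $\Lambda \sim$ Gamma$(\alpha,\beta)$ with rate $\beta$, then $M_{\Lambda}(s)=(\beta/(\beta-s))^{\alpha}$, and $M_Y(t)=M_{\Lambda}[M_X(t)-1]$ reproduces the compound negative binomial MGF. A sketch with an assumed two-point claim distribution:

```python
import math

# Illustrative parameters (assumptions): gamma mixing weights and a
# two-point claim distribution X = 1 or 2 w.p. 0.6 / 0.4.
alpha, beta = 3.0, 2.0

def M_X(t):                  # claim MGF
    return 0.6 * math.exp(t) + 0.4 * math.exp(2 * t)

def M_Lambda(s):             # gamma MGF, rate parameterization (valid s < beta)
    return (beta / (beta - s)) ** alpha

def M_Y(t):                  # compound mixed Poisson MGF: M_Lambda(M_X(t) - 1)
    return M_Lambda(M_X(t) - 1)

def M_Y_negbin(t):           # compound negative binomial MGF, written directly
    return (beta / (beta + 1 - M_X(t))) ** alpha

for t in (-0.5, -0.1, 0.0, 0.1, 0.2):
    assert abs(M_Y(t) - M_Y_negbin(t)) < 1e-12
```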

Measure of Skewness
$\displaystyle E[(Y-\mu_Y)^3]=\Psi_Y^{(3)}(0)$

$\displaystyle =\Psi_{\Lambda}^{(3)}(0) E[X]^3 + 3 \Psi_{\Lambda}^{(2)}(0) E[X] E[X^2]+\Psi_{\Lambda}^{(1)}(0) E[X^3]$

$\displaystyle =\gamma_{\Lambda} Var[\Lambda]^{\frac{3}{2}} E[X]^3 + 3 Var[\Lambda] E[X] E[X^2]+E[\Lambda] E[X^3]$

Measure of skewness: $\displaystyle \gamma_Y=\frac{E[(Y-\mu_Y)^3]}{(Var[Y])^{\frac{3}{2}}}$


# Compound negative binomial distribution

In this post, we discuss the compound negative binomial distribution and its relationship with the compound Poisson distribution.

A compound distribution is a model for a random sum $Y=X_1+X_2+ \cdots +X_N$ where the number of terms $N$ is uncertain. To make the compound distribution more tractable, we assume that the variables $X_i$ are independent and identically distributed and that each $X_i$ is independent of $N$. The random sum $Y$ can be interpreted as the sum of all the measurements associated with certain events that occur during a fixed period of time. For example, we may be interested in the total amount of rainfall in a 24-hour period, during which a number of events are observed and each event provides a measurement of an amount of rainfall.

Another interpretation of a compound distribution is the random variable of the aggregate claims generated by an insurance policy or a group of insurance policies during a fixed policy period. In this setting, $N$ is the number of claims generated by the portfolio of insurance policies, $X_1$ is the amount of the first claim, $X_2$ is the amount of the second claim, and so on. When $N$ follows the Poisson distribution, the random sum $Y$ is said to have a compound Poisson distribution. Even though the compound Poisson distribution has many attractive properties, it is not a good model when the variance of the number of claims is greater than the mean of the number of claims. In such situations, the compound negative binomial distribution may be a better fit. See this post (Compound Poisson distribution) for a basic discussion. See the links at the end of this post for more articles on compound distributions that I posted on this blog.

Compound Negative Binomial Distribution
The random variable $N$ is said to have a negative binomial distribution if its probability function is given by the following:

$\displaystyle P[N=n]=\binom{\alpha + n-1}{\alpha-1} \thinspace \biggl(\frac{\beta}{\beta+1}\biggr)^{\alpha}\biggl(\frac{1}{\beta+1}\biggr)^{n} \ \ \ \ \ \ \ \ \ \ \ \ (1)$

where $n=0,1,2,3, \cdots$, $\beta >0$ and $\alpha$ is a positive integer.
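A quick numerical sanity check of probability function (1), with illustrative values of $\alpha$ and $\beta$ (assumptions, not from the post): the probabilities should sum to 1 and the mean should be $\alpha/\beta$.

```python
import math

# Illustrative parameters (assumptions)
alpha, beta = 3, 2.0
p = beta / (beta + 1)

def pmf(n):
    # probability function (1): C(alpha+n-1, alpha-1) p^alpha (1-p)^n
    return math.comb(alpha + n - 1, alpha - 1) * p ** alpha * (1 - p) ** n

# Truncate the infinite sum; the geometric tail is negligible by n = 200.
total = sum(pmf(n) for n in range(200))
mean = sum(n * pmf(n) for n in range(200))

assert abs(total - 1.0) < 1e-9
assert abs(mean - alpha / beta) < 1e-9
```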

Our formulation of the negative binomial distribution counts the number of failures that occur before the $\alpha^{th}$ success in a sequence of independent Bernoulli trials. But this interpretation is not important to our task at hand. Let $Y=X_1+X_2+ \cdots +X_N$ be the random sum as described in the above introductory paragraph such that $N$ follows a negative binomial distribution. We present the basic properties discussed in the post An introduction to compound distributions by plugging the negative binomial distribution in for $N$.

Distribution Function
$\displaystyle F_Y(y)=\sum \limits_{n=0}^{\infty} F^{*n}(y) \thinspace P[N=n]$

where $F$ is the common distribution function for $X_i$ and $F^{*n}$ is the $n^{th}$ convolution of $F$. Of course, $P[N=n]$ is the negative binomial probability function indicated above.

Mean and Variance
$\displaystyle E[Y]=E[N] \thinspace E[X]=\frac{\alpha}{\beta} E[X]$

$\displaystyle Var[Y]=E[N] \thinspace Var[X]+Var[N] \thinspace E[X]^2$

$\displaystyle =\frac{\alpha}{\beta} Var[X]+\frac{\alpha (\beta+1)}{\beta^2} E[X]^2$

Moment Generating Function
$\displaystyle M_Y(t)=M_N[\ln M_X(t)]=\biggl(\frac{p}{1-(1-p) M_X(t)}\biggr)^{\alpha}$

$\displaystyle M_Y(t)=\biggl(\frac{\beta}{\beta+1- M_X(t)}\biggr)^{\alpha}$

where $\displaystyle p=\frac{\beta}{\beta+1}$, $\displaystyle M_N(t)=\biggl(\frac{p}{1-(1-p) e^{t}}\biggr)^{\alpha}$

Cumulant Generating Function
$\displaystyle \Psi_Y(t)=\alpha \thinspace \ln \biggl(\frac{\beta}{\beta+1- M_X(t)}\biggr)$

Skewness
$\displaystyle E[(Y-\mu_Y)^3]=\Psi_Y^{(3)}(0)$

$\displaystyle =\frac{2}{\alpha^2} E[N]^3 E[X]^3 +\frac{3}{\alpha} E[N]^2 E[X] E[X^2]+E[N] E[X^3]$

Measure of skewness: $\displaystyle \gamma_Y=\frac{E[(Y-\mu_Y)^3]}{(Var[Y])^{\frac{3}{2}}}$
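The third-central-moment formula can be checked against a finite-difference third derivative of $\Psi_Y(t)$ at $0$. A sketch assuming the two-point claim distribution $X=1$ or $2$ with probabilities $0.6$ and $0.4$, and illustrative $\alpha$ and $\beta$ (all assumptions, not from the post):

```python
import math

# Illustrative parameters (assumptions)
alpha, beta = 2.0, 1.0
E_N = alpha / beta
E_X, E_X2, E_X3 = 1.4, 2.2, 3.8    # moments of X = 1 or 2 w.p. 0.6 / 0.4

# Third central moment via the formula in the post
formula = (2 / alpha ** 2) * E_N ** 3 * E_X ** 3 \
    + (3 / alpha) * E_N ** 2 * E_X * E_X2 + E_N * E_X3

def psi(t):
    # cumulant generating function Psi_Y(t) = alpha ln(beta / (beta+1-M_X(t)))
    M_X = 0.6 * math.exp(t) + 0.4 * math.exp(2 * t)
    return alpha * math.log(beta / (beta + 1 - M_X))

# Central-difference approximation of the third derivative at 0
h = 1e-3
num = (psi(2 * h) - 2 * psi(h) + 2 * psi(-h) - psi(-2 * h)) / (2 * h ** 3)

assert abs(num - formula) < 1e-2
```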

Compound Mixed Poisson Distribution
In a previous post (Basic properties of mixtures), we showed that the negative binomial distribution is a mixture of a family of Poisson distributions with gamma mixing weights. Specifically, if $N \sim \text{Poisson}(\Lambda)$ and $\Lambda \sim \text{Gamma}(\alpha,\beta)$, then the unconditional distribution of $N$ is a negative binomial distribution and the probability function is of the form (1) given above.

Thus the negative binomial distribution is a special example of a compound mixed Poisson distribution. When an aggregate claims variable $Y=X_1+X_2+ \cdots +X_N$ has a compound mixed Poisson distribution, the number of claims $N$ follows a Poisson distribution, but the Poisson parameter $\Lambda$ is uncertain. The uncertainty could be due to heterogeneity of risks across the insureds in the insurance portfolio (or across various rating classes). If the uncertainty in the risk parameter $\Lambda$ can be captured by a gamma distribution, then the unconditional number of claims in a given fixed period has a negative binomial distribution.

Previous Posts on Compound Distributions
An introduction to compound distributions
Some examples of compound distributions
Compound Poisson distribution
Compound Poisson distribution-discrete example

# Compound Poisson distribution-discrete example

We present a discrete example of a compound Poisson distribution. A random variable $Y$ has a compound distribution if $Y=X_1+ \cdots +X_N$ where the number of terms $N$ is a discrete random variable whose support is the set of all nonnegative integers (or some appropriate subset) and the random variables $X_i$ are identically distributed (let $X$ be the common distribution). We further assume that the random variables $X_i$ are independent and each $X_i$ is independent of $N$. When $N$ follows the Poisson distribution, $Y$ is said to have a compound Poisson distribution. When the common distribution for the $X_i$ is continuous, $Y$ is a mixed distribution if $P[N=0]$ is nonzero. When the common distribution for the $X_i$ is discrete, $Y$ is a discrete distribution. In this post we present an example of a compound Poisson distribution where the common distribution $X$ is discrete. The compound distribution has a natural insurance interpretation (see the following links).

General Discussion
In general, the distribution function of a compound Poisson random variable $Y$ is the weighted average of all the $n^{th}$ convolutions of the common distribution function of the individual claim amount $X$. The following shows the form of such a distribution function:

$\displaystyle F_Y(y)=\sum \limits_{n=0}^{\infty} F^{*n}(y) P[N=n]$

where $\displaystyle F$ is the common distribution of the $X_n$ and $F^{*n}$ is the $n^{th}$ convolution of $F$.

If the distribution of the individual claim $X$ is discrete, we can obtain the probability mass function of $Y$ by convolutions as follows:

$\displaystyle f_Y(y)=P[Y=y]=\sum \limits_{n=0}^{\infty} p^{*n}(y) P[N=n]$

where $\displaystyle p^{*1}(y)=P[X=y]$
and $\displaystyle p^{*n}(y)=P[X_1+X_2+ \cdots +X_n=y]$ is the $n^{th}$ convolution of $p^{*1}$
and $\displaystyle p^{*0}(y)=\left\{\begin{matrix}0&\thinspace y \ne 0\\{1}&\thinspace y=0\end{matrix}\right.$
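The convolution formula above can be computed directly. A minimal sketch (the function name and the truncation level `n_max` are my own choices):

```python
import math

def compound_poisson_pmf(claim_pmf, lam, y_max, n_max=50):
    """P[Y = y] for y = 0..y_max, truncating the sum over n at n_max."""
    conv = {0: 1.0}                      # p^{*0}: point mass at 0
    result = [0.0] * (y_max + 1)
    for n in range(n_max + 1):
        weight = math.exp(-lam) * lam ** n / math.factorial(n)
        for y in range(y_max + 1):
            result[y] += conv.get(y, 0.0) * weight
        # build the next convolution: p^{*(n+1)} = p^{*n} * claim pmf
        nxt = {}
        for y, py in conv.items():
            for x, px in claim_pmf.items():
                nxt[y + x] = nxt.get(y + x, 0.0) + py * px
        conv = nxt
    return result

# e.g. claims of 1 or 2 with probabilities 0.6 and 0.4, lambda = 1
probs = compound_poisson_pmf({1: 0.6, 2: 0.4}, 1.0, 4)
```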

Example
Suppose the number of claims generated by a portfolio of insurance policies over a fixed time period has a Poisson distribution with parameter $\lambda$. Individual claim amounts will be 1 or 2 with probabilities 0.6 and 0.4, respectively. For the compound Poisson aggregate claims $Y=X_1+ \cdots +X_N$, find $P[Y=k]$ for $k=0,1,2,3,4$.

The probability mass function of $N$ is: $\displaystyle f_N(n)=\frac{\lambda^n e^{-\lambda}}{n!}$ where $n=0,1,2, \cdots$. The individual claim amount $X$ has a Bernoulli-type distribution since it is a two-valued discrete random variable. For convenience, we let $p=0.4$ (i.e. we consider $X=2$ a success). Then the number of successes among $X_1, \cdots, X_n$ has a Binomial$(n,p)$ distribution, and the sum $X_1+ \cdots + X_n$ equals $n$ plus the number of successes. Consequently, the $n^{th}$ convolution $p^{*n}$ is a shifted binomial probability function: $\displaystyle p^{*n}(y)=\binom{n}{y-n} (0.4)^{y-n} (0.6)^{2n-y}$ for $y=n, n+1, \cdots, 2n$. The following shows $p^{*n}$ for $n=1,2,3,4$.

$\displaystyle p^{*1}(1)=0.6, \thinspace p^{*1}(2)=0.4$

$\displaystyle p^{*2}(2)=\binom{2}{0} (0.4)^0 (0.6)^2=0.36$
$\displaystyle p^{*2}(3)=\binom{2}{1} (0.4)^1 (0.6)^1=0.48$
$\displaystyle p^{*2}(4)=\binom{2}{2} (0.4)^2 (0.6)^0=0.16$

$\displaystyle p^{*3}(3)=\binom{3}{0} (0.4)^0 (0.6)^3=0.216$
$\displaystyle p^{*3}(4)=\binom{3}{1} (0.4)^1 (0.6)^2=0.432$
$\displaystyle p^{*3}(5)=\binom{3}{2} (0.4)^2 (0.6)^1=0.288$
$\displaystyle p^{*3}(6)=\binom{3}{3} (0.4)^3 (0.6)^0=0.064$

$\displaystyle p^{*4}(4)=\binom{4}{0} (0.4)^0 (0.6)^4=0.1296$
$\displaystyle p^{*4}(5)=\binom{4}{1} (0.4)^1 (0.6)^3=0.3456$
$\displaystyle p^{*4}(6)=\binom{4}{2} (0.4)^2 (0.6)^2=0.3456$
$\displaystyle p^{*4}(7)=\binom{4}{3} (0.4)^3 (0.6)^1=0.1536$
$\displaystyle p^{*4}(8)=\binom{4}{4} (0.4)^4 (0.6)^0=0.0256$

Since we are interested in finding $P[Y=y]$ for $y=0,1,2,3,4$, we only need to consider $N=0,1,2,3,4$. The following matrix shows the relevant values of $p^{*n}$. The rows are for $y=0,1,2,3,4$. The columns are $p^{*0}$, $p^{*1}$, $p^{*2}$, $p^{*3}$, $p^{*4}$.

$\displaystyle \begin{pmatrix} 1&0&0&0&0 \\ 0&0.6&0&0&0 \\ 0&0.4&0.36&0&0 \\ 0&0&0.48&0.216&0 \\ 0&0&0.16&0.432&0.1296 \end{pmatrix}$

To obtain the probability mass function of $Y$, we multiply the entries in the column for $p^{*n}$ by $P[N=n]$, $n=0,1,2,3,4$, and sum across each row.

$\displaystyle P[Y=0]=e^{-\lambda}$
$\displaystyle P[Y=1]=0.6 \lambda e^{-\lambda}$
$\displaystyle P[Y=2]=0.4 \lambda e^{-\lambda}+0.36 \frac{\lambda^2 e^{-\lambda}}{2}$
$\displaystyle P[Y=3]=0.48 \frac{\lambda^2 e^{-\lambda}}{2}+0.216 \frac{\lambda^3 e^{-\lambda}}{6}$
$\displaystyle P[Y=4]=0.16 \frac{\lambda^2 e^{-\lambda}}{2}+0.432 \frac{\lambda^3 e^{-\lambda}}{6}+0.1296 \frac{\lambda^4 e^{-\lambda}}{24}$
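These closed-form answers can be verified at a concrete $\lambda$ (an illustrative value; the post leaves $\lambda$ symbolic) against the direct double sum, using the shifted-binomial form of $p^{*n}$:

```python
import math

lam = 2.0                     # illustrative lambda (assumption)
e = math.exp(-lam)

# The five closed-form probabilities derived above
closed = [
    e,
    0.6 * lam * e,
    0.4 * lam * e + 0.36 * lam ** 2 * e / 2,
    0.48 * lam ** 2 * e / 2 + 0.216 * lam ** 3 * e / 6,
    0.16 * lam ** 2 * e / 2 + 0.432 * lam ** 3 * e / 6
        + 0.1296 * lam ** 4 * e / 24,
]

def p_conv(n, y):
    # p^{*n}(y) = C(n, y-n) 0.4^{y-n} 0.6^{2n-y} for n <= y <= 2n
    if n == 0:
        return 1.0 if y == 0 else 0.0
    if not n <= y <= 2 * n:
        return 0.0
    k = y - n                 # number of size-2 claims
    return math.comb(n, k) * 0.4 ** k * 0.6 ** (n - k)

for y in range(5):
    direct = sum(p_conv(n, y) * e * lam ** n / math.factorial(n)
                 for n in range(y + 1))
    assert abs(direct - closed[y]) < 1e-12
```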