Examples of Bayesian prediction in insurance, continued

This post is a continuation of the previous post Examples of Bayesian prediction in insurance. We present another example to illustrate the methodology of Bayesian estimation. The example in this post, along with the example in the previous post, serves to motivate the concepts of Bayesian credibility and Buhlmann credibility theory. These two posts are thus part of an introduction to credibility theory.

Suppose X_1, \cdots, X_n,X_{n+1} are independent and identically distributed conditional on \Theta=\theta. We denote the density function of the common conditional distribution of X_j by f_{X \lvert \Theta}(x \lvert \theta), and we denote the prior distribution of the risk parameter \Theta by \pi_{\Theta}(\theta). The following shows the steps for obtaining the Bayesian estimate of the next observation X_{n+1} given X_1, \cdots, X_n.

The Marginal Distribution
\displaystyle f_{X_1, \cdots, X_n}(x_1, \cdots, x_n)=\int \limits_{\theta} \biggl[\prod \limits_{i=1}^{n} f_{X \lvert \Theta}(x_i \lvert \theta)\biggr] \pi_{\Theta}(\theta) d \theta

The Posterior Distribution
\displaystyle \pi_{\Theta \lvert X_1, \cdots, X_n}(\theta \lvert x_1, \cdots, x_n)

\displaystyle =\ \ \ \ \ \ \ \ \ \ \frac{1}{f_{X_1, \cdots, X_n}(x_1, \cdots, x_n)} \biggl[\prod \limits_{i=1}^{n} f_{X \lvert \Theta}(x_i \lvert \theta)\biggr] \pi_{\Theta}(\theta)

The Predictive Distribution
\displaystyle f_{X_{n+1} \lvert X_1, \cdots, X_n}(x \vert x_1, \cdots, x_n)

\displaystyle =\ \ \ \ \ \ \ \ \ \ \int \limits_{\theta} \biggl(f_{X \lvert \Theta}(x \lvert \theta)\biggr) \thinspace \pi_{\Theta \lvert X_1, \cdots, X_n}(\theta \lvert x_1, \cdots, x_n) d \theta

The Bayesian Predictive Mean of the Next Period
\displaystyle E[X_{n+1} \lvert X_1=x_1, \cdots, X_n=x_n]

\displaystyle =\ \ \ \ \ \ \ \ \ \ \int \limits_{x} x \thinspace f_{X_{n+1} \lvert X_1, \cdots, X_n}(x \vert x_1, \cdots, x_n) dx

\displaystyle E[X_{n+1} \lvert X_1=x_1, \cdots, X_n=x_n]

\displaystyle =\ \ \ \ \ \ \ \ \ \ \int \limits_{\theta} E[X \lvert \theta] \thinspace \pi_{\Theta \lvert X_1, \cdots, X_n}(\theta \lvert x_1, \cdots, x_n) d \theta
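As an added illustration (not part of the original derivation), the following is a minimal numerical sketch of these steps, carried out by numerical integration. It assumes a Poisson likelihood and a gamma prior, matching Example 2 below; the values alpha = 2, beta = 1 and the data values 0 and 3 are arbitrary choices for illustration.

```python
# A numerical sketch of the Bayesian prediction steps above.
# Assumptions: Poisson likelihood f(x | theta) and gamma prior on Theta
# (as in Example 2 below); alpha = 2, beta = 1 and the data are arbitrary.
import numpy as np
from scipy import integrate, stats

alpha, beta = 2.0, 1.0           # prior: Theta ~ Gamma(alpha, rate = beta)
data = [0, 3]                    # observed claim counts x_1, ..., x_n

def likelihood(theta):
    """Product of f(x_i | theta) over the observed data."""
    return np.prod([stats.poisson.pmf(x, theta) for x in data], axis=0)

def prior(theta):
    return stats.gamma.pdf(theta, a=alpha, scale=1.0 / beta)

# Marginal (unconditional) density of the observed data
marginal, _ = integrate.quad(lambda t: likelihood(t) * prior(t), 0, np.inf)

# Posterior density of Theta given the data
def posterior(theta):
    return likelihood(theta) * prior(theta) / marginal

# Bayesian predictive mean: integral of E[X | theta] times the posterior.
# For the Poisson likelihood, E[X | theta] = theta.
predictive_mean, _ = integrate.quad(lambda t: t * posterior(t), 0, np.inf)

print(marginal)          # marginal probability of the observed data
print(predictive_mean)   # equals (alpha + 0 + 3) / (beta + 2) = 5/3 here
```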

Example 2
The number of claims X generated by an insured in a portfolio of independent insurance policies has a Poisson distribution with parameter \Theta. In the portfolio of policies, the parameter \Theta varies according to a gamma distribution with parameters \alpha and \beta. We have the following conditional distribution of X and prior distribution of \Theta.

\displaystyle f_{X \lvert \Theta}(x \lvert \theta)=\frac{\theta^x e^{-\theta}}{x!} where x=0,1,2, \cdots

\displaystyle \pi_{\Theta}(\theta)=\frac{\beta^{\alpha}}{\Gamma(\alpha)} \theta^{\alpha-1} e^{-\beta \theta} where \Gamma(\cdot) is the gamma function.

Suppose that a particular insured in this portfolio has generated 0 and 3 claims in the first 2 policy periods. What is the Bayesian estimate of the number of claims for this insured in period 3?

Note that the conditional mean E[X \lvert \Theta]=\Theta. Thus the unconditional mean E[X]=E[\Theta]=\frac{\alpha}{\beta}.
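As a quick side check on the unconditional mean, here is a small Monte Carlo sketch; the values alpha = 2 and beta = 1 are arbitrary assumptions, not part of the example.

```python
# Monte Carlo check that E[X] = E[Theta] = alpha / beta.
# alpha = 2, beta = 1 are arbitrary assumed values.
import numpy as np

rng = np.random.default_rng(0)
alpha, beta = 2.0, 1.0

theta = rng.gamma(shape=alpha, scale=1.0 / beta, size=1_000_000)  # Theta ~ Gamma(alpha, beta)
x = rng.poisson(theta)                                            # X | Theta ~ Poisson(Theta)

print(x.mean())       # simulated unconditional mean of X
print(alpha / beta)   # theoretical E[X] = alpha / beta = 2.0
```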

Comment
Note that the unconditional distribution of X is a negative binomial distribution. In a previous post (Compound negative binomial distribution), it was shown that if N \sim Poisson(\Lambda) and \Lambda \sim Gamma(\alpha,\beta), then the unconditional distribution of N has the following probability function. We make use of this result in the Bayesian estimation problem in this post.

\displaystyle P[N=n]=\frac{\Gamma(\alpha+n)}{\Gamma(\alpha) \thinspace n!} \biggl[\frac{\beta}{\beta+1}\biggr]^{\alpha} \biggl[\frac{1}{\beta+1}\biggr]^n where n=0,1,2, \cdots
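To illustrate this result numerically, the following sketch compares the gamma mixture of Poisson probabilities with the negative binomial probability function above, using scipy's nbinom with n = alpha and p = beta/(beta+1); the values alpha = 2, beta = 1 are arbitrary assumptions.

```python
# Numerical check: a Poisson(Theta) count with Theta ~ Gamma(alpha, beta)
# is negative binomial with parameters n = alpha, p = beta / (beta + 1).
# alpha = 2, beta = 1 are arbitrary values chosen for illustration.
import numpy as np
from scipy import integrate, stats

alpha, beta = 2.0, 1.0

def mixture_pmf(n):
    """P[N = n] computed directly as the gamma mixture of Poisson probabilities."""
    integrand = lambda t: stats.poisson.pmf(n, t) * stats.gamma.pdf(t, a=alpha, scale=1.0 / beta)
    value, _ = integrate.quad(integrand, 0, np.inf)
    return value

for n in range(5):
    direct = mixture_pmf(n)
    closed_form = stats.nbinom.pmf(n, alpha, beta / (beta + 1.0))
    print(n, direct, closed_form)   # the two columns agree
```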

The Marginal Distribution
\displaystyle f_{X_1,X_2}(0,3)=\int_{0}^{\infty} e^{-\theta} \frac{\theta^3 e^{-\theta}}{3!} \frac{\beta^{\alpha}}{\Gamma(\alpha)} \theta^{\alpha-1} e^{-\beta \theta} d \theta

\displaystyle =\int_{0}^{\infty} \frac{\beta^{\alpha}}{3! \Gamma(\alpha)} \theta^{\alpha+3-1} e^{-(\beta+2) \theta} d \theta=\frac{\Gamma(\alpha+3)}{6 \Gamma(\alpha)} \frac{\beta^{\alpha}}{(\beta+2)^{\alpha+3}}
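The closed-form answer can be checked by direct numerical integration; again, alpha = 2 and beta = 1 are arbitrary assumed values.

```python
# Check the closed form for f_{X1,X2}(0,3) against numerical integration.
# alpha = 2, beta = 1 are arbitrary assumed values.
import math
import numpy as np
from scipy import integrate, stats

alpha, beta = 2.0, 1.0

integrand = lambda t: (stats.poisson.pmf(0, t) * stats.poisson.pmf(3, t)
                       * stats.gamma.pdf(t, a=alpha, scale=1.0 / beta))
numerical, _ = integrate.quad(integrand, 0, np.inf)

closed_form = (math.gamma(alpha + 3) / (6 * math.gamma(alpha))
               * beta ** alpha / (beta + 2) ** (alpha + 3))

print(numerical, closed_form)   # the two values agree
```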

The Posterior Distribution
\displaystyle \pi_{\Theta \lvert X_1,X_2}(\theta \lvert 0,3)=\frac{1}{f_{X_1,X_2}(0,3)} e^{-\theta} \frac{\theta^3 e^{-\theta}}{3!} \frac{\beta^{\alpha}}{\Gamma(\alpha)} \theta^{\alpha-1} e^{-\beta \theta}

\displaystyle =K \thinspace \theta^{\alpha+3-1} e^{-(\beta+2) \theta}

In the above expression, K is the constant that makes \pi_{\Theta \lvert X_1,X_2}(\theta \lvert 0,3) a density function. Note that the expression has the form of a gamma density. Thus the posterior distribution must be:

\displaystyle \pi_{\Theta \lvert X_1,X_2}(\theta \lvert 0,3)=\frac{(\beta+2)^{\alpha+3}}{\Gamma(\alpha+3)} \thinspace \theta^{\alpha+3-1} e^{-(\beta+2) \theta}

Thus the posterior distribution of \Theta is a gamma distribution with parameters \alpha+3 and \beta+2.
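A quick numerical check that the normalized posterior agrees with the Gamma(\alpha+3, \beta+2) density (alpha = 2 and beta = 1 are arbitrary assumptions):

```python
# Check that the posterior density given the observations 0 and 3
# is the Gamma(alpha + 3, beta + 2) density. alpha = 2, beta = 1 are assumptions.
import numpy as np
from scipy import integrate, stats

alpha, beta = 2.0, 1.0

def unnormalized_posterior(t):
    return (stats.poisson.pmf(0, t) * stats.poisson.pmf(3, t)
            * stats.gamma.pdf(t, a=alpha, scale=1.0 / beta))

marginal, _ = integrate.quad(unnormalized_posterior, 0, np.inf)

for t in [0.5, 1.0, 2.0, 3.0]:
    bayes = unnormalized_posterior(t) / marginal
    gamma_pdf = stats.gamma.pdf(t, a=alpha + 3, scale=1.0 / (beta + 2))
    print(t, bayes, gamma_pdf)   # the two columns agree
```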

The Predictive Distribution
Note that the predictive distribution is simply the mixture of Poisson(\Theta) distributions with the posterior Gamma(\alpha+3,\beta+2) as the mixing distribution. By the comment above, the predictive distribution is a negative binomial distribution with the following probability function:

\displaystyle f_{X_3 \lvert X_1,X_2}(x \lvert 0,3)=\frac{\Gamma(\alpha+3+x)}{\Gamma(\alpha+3) \thinspace x!} \biggl[\frac{\beta+2}{\beta+3}\biggr]^{\alpha+3} \biggl[\frac{1}{\beta+3}\biggr]^{x} where x=0,1,2, \cdots
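The predictive probability function can likewise be checked by integrating the Poisson probabilities against the posterior; alpha = 2 and beta = 1 are arbitrary assumed values.

```python
# Check that the predictive distribution of X_3 given (0, 3) is negative binomial
# with parameters alpha + 3 and beta + 2. alpha = 2, beta = 1 are assumptions.
import numpy as np
from scipy import integrate, stats

alpha, beta = 2.0, 1.0
a_post, b_post = alpha + 3, beta + 2      # posterior gamma parameters

def predictive_pmf(x):
    """f(x | 0, 3) computed as a mixture of Poisson(theta) over the posterior."""
    integrand = lambda t: stats.poisson.pmf(x, t) * stats.gamma.pdf(t, a=a_post, scale=1.0 / b_post)
    value, _ = integrate.quad(integrand, 0, np.inf)
    return value

for x in range(5):
    direct = predictive_pmf(x)
    closed_form = stats.nbinom.pmf(x, a_post, b_post / (b_post + 1.0))
    print(x, direct, closed_form)   # the two columns agree
```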

The Bayesian Predictive Mean
\displaystyle E[X_3 \lvert 0,3]=\frac{\alpha+3}{\beta+2}=\frac{2}{\beta+2} \biggl(\frac{3}{2}\biggr)+\frac{\beta}{\beta+2} \biggl(\frac{\alpha}{\beta}\biggr) \ \ \ \ \ \ \ \ \ \ (1)

Note that E[X \lvert \Theta]=\Theta. Thus the Bayesian predictive mean in this example is simply the mean of the posterior distribution of \Theta, which is E[\Theta \vert 0,3]=\frac{\alpha+3}{\beta+2}.
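The weighted-average identity in (1) is easy to verify numerically; alpha = 2 and beta = 1 are arbitrary assumed values.

```python
# Check the weighted-average form (1) of the Bayesian predictive mean.
# alpha = 2, beta = 1 are arbitrary assumed values.
alpha, beta = 2.0, 1.0

posterior_mean = (alpha + 3) / (beta + 2)

Z = 2 / (beta + 2)                                  # weight on the observed data
weighted = Z * (3 / 2) + (1 - Z) * (alpha / beta)   # data mean 3/2, prior mean alpha/beta

print(posterior_mean, weighted)   # the two values agree
```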

Comment
Generalizing the example, suppose that in the first n periods, the claim counts for the insured are X_1=x_1, \cdots, X_n=x_n. Then the posterior distribution of the parameter \Theta is a gamma distribution.

\biggl[\Theta \lvert X_1=x_1, \cdots, X_n=x_n\biggr] \sim Gamma(\alpha+\sum_{i=1}^{n} x_i,\beta+n)

Then the predictive distribution of X_{n+1} given the observations has a negative binomial distribution. More importantly, the Bayesian predictive mean is:

\displaystyle E[X_{n+1} \lvert X_1=x_1, \cdots, X_n=x_n]=\frac{\alpha+\sum_{i=1}^{n} x_i}{\beta+n}

\displaystyle =\frac{n}{\beta+n} \biggl(\frac{\sum \limits_{i=1}^{n} x_i}{n}\biggr)+\frac{\beta}{\beta+n} \biggl(\frac{\alpha}{\beta}\biggr)\ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ (2)

It is interesting that the Bayesian predictive mean of the (n+1)^{th} period is a weighted average of the mean of the observed data (\overline{X}) and the unconditional mean E[X]=\frac{\alpha}{\beta}. Consequently, the above Bayesian estimate is a credibility estimate. The weight Z=\frac{n}{\beta+n} given to the observed data is called the credibility factor. The estimate and the factor are called the Bayesian credibility estimate and the Bayesian credibility factor, respectively.
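The generalization in (2) can be packaged as a small function that computes the Bayesian predictive mean both directly and as a credibility-weighted average. This is a sketch only; the data and parameter values below are arbitrary assumptions.

```python
# Bayesian predictive mean for the Poisson-gamma model, computed two ways:
# directly as (alpha + sum(x)) / (beta + n) and as the credibility-weighted
# average in (2). The data and parameters below are arbitrary assumptions.

def bayesian_predictive_mean(data, alpha, beta):
    n = len(data)
    return (alpha + sum(data)) / (beta + n)

def credibility_estimate(data, alpha, beta):
    n = len(data)
    Z = n / (beta + n)                     # credibility factor
    x_bar = sum(data) / n                  # mean of the observed data
    mu_0 = alpha / beta                    # unconditional (prior) mean
    return Z * x_bar + (1 - Z) * mu_0

data = [0, 3, 1, 2]
print(bayesian_predictive_mean(data, alpha=2.0, beta=1.0))
print(credibility_estimate(data, alpha=2.0, beta=1.0))   # identical value
```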

In general, the credibility estimate is an estimator of the following form:

\displaystyle E=Z \thinspace \overline{X}+ (1-Z) \thinspace \mu_0

where \overline{X} is the mean of the observed data and \mu_0 is the mean based on other information. In our example here, \mu_0 is the unconditional mean. In practice, \mu_0 could be the mean based on the entire book of business, or a mean based on a different block of similar insurance policies. Another interpretation is that \overline{X} is the mean of the recent experience data and \mu_0 is the mean of prior periods.

One more comment about the credibility factor Z=\frac{n}{\beta+n} derived in this example. As n \rightarrow \infty, Z \rightarrow 1. This makes intuitive sense, since more and more weight is given to \overline{X} as data are accumulated.
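A quick illustration of this limiting behavior; beta = 1 is an arbitrary assumed value.

```python
# The credibility factor Z = n / (beta + n) approaches 1 as n grows.
# beta = 1 is an arbitrary assumed value.
beta = 1.0
for n in [1, 10, 100, 1000, 10000]:
    print(n, n / (beta + n))   # tends to 1 as n increases
```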
