
14.7: Compound Poisson Processes


    \(\newcommand{\P}{\mathbb{P}}\) \(\newcommand{\E}{\mathbb{E}}\) \(\newcommand{\R}{\mathbb{R}}\) \(\newcommand{\N}{\mathbb{N}}\) \(\newcommand{\bs}{\boldsymbol}\) \(\newcommand{\var}{\text{var}}\) \(\newcommand{\sd}{\text{sd}}\) \(\newcommand{\skw}{\text{skew}}\) \(\newcommand{\kur}{\text{kurt}}\)

    In a compound Poisson process, each arrival in an ordinary Poisson process comes with an associated real-valued random variable that represents the value of the arrival in a sense. These variables are independent and identically distributed, and are independent of the underlying Poisson process. Our interest centers on the sum of the random variables for all the arrivals up to a fixed time \( t \), which is thus a random sum of random variables with a Poisson-distributed number of terms. Distributions of this type are said to be compound Poisson distributions, and are important in their own right, particularly since some surprising parametric distributions turn out to be compound Poisson.

    Basic Theory

    Definition

    Suppose we have a Poisson process with rate \( r \in (0, \infty) \). As usual, we will denote the sequence of inter-arrival times by \( \bs{X} = (X_1, X_2, \ldots) \), the sequence of arrival times by \( \bs{T} = (T_0, T_1, T_2, \ldots) \), and the counting process by \( \bs{N} = \{N_t: t \in [0, \infty)\} \). To review some of the most important facts briefly, recall that \( \bs{X} \) is a sequence of independent random variables, each having the exponential distribution on \( [0, \infty) \) with rate \( r \). The sequence \( \bs{T} \) is the partial sum sequence associated with \( \bs{X} \), and has stationary, independent increments. For \( n \in \N_+ \), the \( n \)th arrival time \( T_n \) has the gamma distribution with parameters \( n \) and \( r \). The process \( \bs{N} \) is the inverse of \( \bs{T} \), in a certain sense, and also has stationary, independent increments. For \( t \in (0, \infty) \), the number of arrivals \( N_t \) in \( (0, t] \) has the Poisson distribution with parameter \( r t \).

    Suppose now that each arrival has an associated real-valued random variable that represents the value of the arrival in a certain sense. Here are some typical examples:

    • The arrivals are customers at a store. Each customer spends a random amount of money.
    • The arrivals are visits to a website. Each visitor spends a random amount of time at the site.
    • The arrivals are failure times of a complex system. Each failure requires a random repair time.
    • The arrivals are earthquakes at a particular location. Each earthquake has a random severity, a measure of the energy released.

    For \( n \in \N_+ \), let \( U_n \) denote the value of the \( n \)th arrival. We assume that \( \bs{U} = (U_1, U_2, \ldots) \) is a sequence of independent, identically distributed, real-valued random variables, and that \( \bs{U} \) is independent of the underlying Poisson process. The common distribution may be discrete or continuous, but in either case, we let \( f \) denote the common probability density function. We will let \( \mu = \E(U_n) \) denote the common mean, \( \sigma^2 = \var(U_n) \) the common variance, and \( G \) the common moment generating function, so that \( G(s) = \E\left[\exp(s U_n)\right] \) for \( s \) in some interval \( I \) about 0. Here is our main definition:

    The compound Poisson process associated with the given Poisson process \(\bs{N}\) and the sequence \( \bs{U} \) is the stochastic process \( \bs{V} = \{V_t: t \in [0, \infty)\} \) where \[ V_t = \sum_{n=1}^{N_t} U_n\]

    Thus, \( V_t \) is the total value for all of the arrivals in \( (0, t] \). For the examples above:

    • \( V_t \) is the total income to the store up to time \( t \).
    • \( V_t \) is the total time spent at the site by the visitors who arrived up to time \( t \).
    • \( V_t \) is the total repair time for the failures up to time \( t \).
    • \( V_t \) is the total energy released up to time \( t \).

    Recall that a sum over an empty index set is 0, so \( V_0 = 0 \).
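
    Below is a minimal simulation sketch of this definition (not from the text). It uses the fact that, given \( N_t = n \), the arrival times are distributed as \( n \) independent uniform points in \( (0, t] \), so a path of \( \bs{V} \) can be sampled directly. The rate, time horizon, exponential value distribution, and function name are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def compound_poisson_path(r, t, value_sampler):
    """Arrival times in (0, t] and the running totals of V at those times."""
    n = rng.poisson(r * t)                   # N_t ~ Poisson(r t)
    times = np.sort(rng.uniform(0.0, t, n))  # given N_t = n, arrivals are uniform
    values = value_sampler(n)                # U_1, ..., U_n, iid, independent of N
    return times, np.cumsum(values)

# Example: customers arrive at rate r = 2 per hour over t = 10 hours,
# each spending an exponential amount with mean 25 (an assumption)
times, totals = compound_poisson_path(2.0, 10.0, lambda n: rng.exponential(25.0, n))
total = totals[-1] if len(totals) else 0.0
print(f"{len(times)} arrivals, V_t = {total:.2f}")
```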

    Properties

    Note that for fixed \( t \), \( V_t \) is a random sum of independent, identically distributed random variables, a topic that we have studied before. In this sense, we have a special case, since the number of terms \( N_t \) has the Poisson distribution with parameter \( r t\). But we also have a new wrinkle, since the process is indexed by the continuous time parameter \( t \), and so we can study its properties as a stochastic process. Our first result is a pair of properties shared with the underlying Poisson process.

    \( \bs{V} \) has stationary, independent increments:

    1. If \( s, \, t \in [0, \infty) \) with \( s \lt t \), then \( V_t - V_s \) has the same distribution as \( V_{t - s} \).
    2. If \( (t_1, t_2, \ldots, t_n)\) is a sequence of points in \( [0, \infty) \) with \(t_1 \lt t_2 \lt \cdots \lt t_n \) then \(\left(V_{t_1}, V_{t_2} - V_{t_1}, \ldots, V_{t_n} - V_{t_{n-1}}\right)\) is a sequence of independent variables.
    Proof
    1. For \( 0 \le s \lt t \), \[ V_t - V_s = \sum_{i = 1}^{N_t} U_i - \sum_{i = 1}^{N_s} U_i = \sum_{i = N_s + 1}^{N_t} U_i \] The number of terms in the last sum is \( N_t - N_s \), which has the same distribution as \( N_{t - s} \). Since the variables in the sequence \( \bs{U} \) are independent and identically distributed, and independent of \( \bs{N} \), it follows that \( V_t - V_s \) has the same distribution as \( V_{t - s} \).
    2. Suppose that \( 0 \le t_1 \lt t_2 \lt \cdots \lt t_n \) and let \( t_0 = 0 \). Then for \( i \in \{1, 2, \ldots, n\} \), as in (a), \[ V_{t_i} - V_{t_{i-1}} = \sum_{j = N_{t_{i-1}} + 1}^{N_{t_i}} U_j \] The number of terms in this sum is \( N_{t_i} - N_{t_{i-1}} \). Since \( \bs{N} \) has independent increments, the variables in \( \bs{U} \) are independent, and the index ranges from \( N_{t_{i-1}} + 1 \) to \( N_{t_i} \) are disjoint over \( i \in \{1, 2, \ldots, n\} \), it follows that the random variables \( V_{t_i} - V_{t_{i-1}} \) are independent over \( i \in \{1, 2, \ldots, n\} \).
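
    A quick Monte Carlo check of part (a), assuming Exponential(1) values for the \( U_i \): the increment \( V_t - V_s \) computed from a common path should match an independent copy of \( V_{t-s} \) in distribution; here we compare means and variances.

```python
import numpy as np

rng = np.random.default_rng(2)
r, s, t, m = 3.0, 1.0, 2.5, 50_000

def increment_sample():
    n = rng.poisson(r * t)              # N_t
    arrivals = rng.uniform(0.0, t, n)   # arrival times given N_t
    u = rng.exponential(1.0, n)         # values U_1, ..., U_n
    return u[arrivals > s].sum()        # V_t - V_s: values of arrivals in (s, t]

def direct_sample():
    n = rng.poisson(r * (t - s))        # N_{t-s}
    return rng.exponential(1.0, n).sum()

inc = np.array([increment_sample() for _ in range(m)])
fresh = np.array([direct_sample() for _ in range(m)])
print(f"mean: {inc.mean():.3f} vs {fresh.mean():.3f}")
print(f"var:  {inc.var():.3f} vs {fresh.var():.3f}")
```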

    Next we consider various moments of the compound process.

    For \( t \in [0, \infty) \), the mean and variance of \( V_t \) are

    1. \( \E(V_t) = \mu r t \)
    2. \( \var(V_t) = (\mu^2 + \sigma^2) r t \)
    Proof

    Again, these are special cases of general results for random sums of IID variables, but we give separate proofs for completeness. The basic tool is conditional expected value and conditional variance. Recall also that \( \E(N_t) = \var(N_t) = r t \).

    1. Since \( \bs{U} \) is independent of \( N_t \), we have \( \E(V_t \mid N_t) = \mu N_t \). Hence \( \E(V_t) = \E\left[\E(V_t \mid N_t)\right] = \E(\mu N_t) = \mu r t \).
    2. Similarly, note that \( \var(V_t \mid N_t) = \sigma^2 N_t \) and hence \( \var(V_t) = \E\left[\var(V_t \mid N_t)\right] + \var\left[\E(V_t \mid N_t)\right] = \E(\sigma^2 N_t) + \var(\mu N_t) = \sigma^2 r t + \mu^2 r t \).
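
    A small numerical check of these formulas, assuming Uniform(0, 1) values so that \( \mu = 1/2 \) and \( \sigma^2 = 1/12 \):

```python
import numpy as np

rng = np.random.default_rng(3)
r, t, m = 2.0, 4.0, 200_000
mu, sigma2 = 0.5, 1.0 / 12.0           # mean and variance of Uniform(0, 1)

n = rng.poisson(r * t, m)              # one copy of N_t per replication
v = np.array([rng.uniform(0.0, 1.0, k).sum() for k in n])

print(f"E(V_t):   simulated {v.mean():.3f}, formula mu r t = {mu * r * t:.3f}")
print(f"var(V_t): simulated {v.var():.3f}, "
      f"formula (mu^2 + sigma^2) r t = {(mu**2 + sigma2) * r * t:.3f}")
```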

    For \( t \in [0, \infty) \), the moment generating function of \( V_t \) is given by \[ \E\left[\exp(s V_t)\right] = \exp\left(r t \left[G(s) - 1\right]\right), \quad s \in I \]

    Proof

    Again, this is a special case of the more general result for random sums of IID variables, but we give another proof for completeness. As with the last theorem, the key is to condition on \( N_t \) and recall that the MGF of a sum of independent variables is the product of the MGFs, so that \( \E\left[\exp(s V_t) \mid N_t = n\right] = [G(s)]^n \). Thus \[ \E\left[\exp(s V_t)\right] = \E\left(\E\left[\exp(s V_t) \mid N_t\right]\right) = \E\left([G(s)]^{N_t}\right) = P_t\left[G(s)\right] \] where \( P_t \) is the probability generating function of \( N_t \). But we know from our study of the Poisson distribution that \( P_t(x) = \exp\left[r t (x - 1)\right] \) for \( x \in \R \).

    By exactly the same argument, the same relationship holds for characteristic functions and, in the case that the variables in \( \bs{U} \) take values in \( \N \), for probability generating functions. That is, if the variables in \( \bs{U} \) have generating function \( G \), then the generating function \( H \) of \( V_t \) is given by \[ H(s) = \exp(r t [G(s) - 1]) \] for \( s \) in the domain of \( G \), where the generating function can be any of the three types we have discussed: probability, moment, or characteristic.
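
    A sketch verifying the MGF identity numerically, assuming standard normal values \( U_i \), whose MGF is \( G(s) = e^{s^2/2} \); the parameter values are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(4)
r, t, s, m = 1.0, 2.0, 0.5, 200_000

n = rng.poisson(r * t, m)              # N_t per replication
v = np.array([rng.standard_normal(k).sum() for k in n])

G = np.exp(s**2 / 2)                   # MGF of U_i ~ N(0, 1) at s
print(f"simulated E[exp(s V_t)]:   {np.exp(s * v).mean():.4f}")
print(f"formula exp(rt[G(s) - 1]): {np.exp(r * t * (G - 1)):.4f}")
```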

    Examples and Special Cases

    The Discrete Case

    First we note that thinning a Poisson process can be thought of as a special case of a compound Poisson process. Thus, suppose that \( \bs{U} = (U_1, U_2, \ldots) \) is a Bernoulli trials sequence with success parameter \( p \in (0, 1) \), and as above, that \( \bs{U} \) is independent of the Poisson process \( \bs{N} \). In the usual language of thinning, the arrivals are of two types (1 and 0), and \( U_i \) is the type of the \( i \)th arrival. Thus the compound process \( \bs{V} \) constructed above is the thinned process, so that \( V_t \) is the number of type 1 points up to time \( t \). We know that \( \bs{V} \) is also a Poisson process, with rate \( r p \).
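
    A minimal check of this claim, with illustrative parameter values: with Bernoulli(\( p \)) values, \( V_t \) given \( N_t \) is binomial, and its unconditional distribution should match the Poisson distribution with parameter \( r p t \).

```python
import numpy as np
from math import exp, factorial

rng = np.random.default_rng(5)
r, p, t, m = 4.0, 0.3, 5.0, 200_000

n = rng.poisson(r * t, m)        # N_t
v = rng.binomial(n, p)           # V_t: each arrival is type 1 with probability p

lam = r * p * t                  # claimed Poisson parameter of V_t
for k in range(4, 9):            # compare empirical pmf to Poisson(lam)
    print(k, round(float((v == k).mean()), 4),
          round(exp(-lam) * lam**k / factorial(k), 4))
```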

    The results above for thinning generalize to the case where the values of the arrivals have a discrete distribution. Thus, suppose \( U_i \) takes values in a countable set \( S \subseteq \R \), and as before, let \( f \) denote the common probability density function so that \( f(u) = \P(U_i = u) \) for \( u \in S \) and \( i \in \N_+ \). For \( u \in S \), let \( N^u_t \) denote the number of arrivals up to time \( t \) that have the value \( u \), and let \( \bs{N}^u = \left\{N^u_t: t \in [0, \infty)\right\} \) denote the corresponding stochastic process. Armed with this setup, here is the result:

    The compound Poisson process \( \bs{V} \) associated with \( \bs{N} \) and \( \bs{U} \) can be written in the form \[ V_t = \sum_{u \in S} u N^u_t, \quad t \in [0, \infty) \] The processes \( \{\bs{N}^u: u \in S\} \) are independent Poisson processes, and \( \bs{N}^u \) has rate \( r f(u) \) for \( u \in S \).

    Proof

    Note that \( U_i = \sum_{u \in S} u \bs{1}(U_i = u) \) and hence \[ V_t = \sum_{i = 1}^{N_t} U_i = \sum_{i = 1}^{N_t} \sum_{u \in S} u \bs{1}(U_i = u) = \sum_{u \in S} u \sum_{i = 1}^{N_t} \bs{1}(U_i = u) = \sum_{u \in S} u N^u_t \] The fact that \( \{\bs{N}^u: u \in S\} \) are independent Poisson processes, and that \( \bs{N}^u \) has rate \( r f(u) \) for \( u \in S \) follows from our result on thinning.
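
    A sketch of this decomposition, assuming values in \( S = \{1, 2, 3\} \) with \( f = (0.5, 0.3, 0.2) \): given \( N_t \), the vector of counts \( (N^1_t, N^2_t, N^3_t) \) is multinomial, each \( N^u_t \) should have mean \( r f(u) t \), and distinct counts should be uncorrelated (consistent with, though weaker than, independence).

```python
import numpy as np

rng = np.random.default_rng(6)
r, t, m = 2.0, 3.0, 100_000
S, f = np.array([1, 2, 3]), np.array([0.5, 0.3, 0.2])

n = rng.poisson(r * t, m)                              # N_t per replication
counts = np.array([rng.multinomial(k, f) for k in n])  # (N^1_t, N^2_t, N^3_t)
v = counts @ S                                         # V_t = sum_u u N^u_t

for j, u in enumerate(S):
    print(f"mean of N^{u}_t: simulated {counts[:, j].mean():.3f}, "
          f"formula r f({u}) t = {r * f[j] * t:.3f}")
print(f"corr(N^1_t, N^2_t): {np.corrcoef(counts[:, 0], counts[:, 1])[0, 1]:.4f}")
```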

    Compound Poisson Distributions

    A compound Poisson random variable can be defined outside of the context of a Poisson process. Here is the formal definition:

    Suppose that \( \bs{U} = (U_1, U_2, \ldots) \) is a sequence of independent, identically distributed random variables, and that \( N \) is independent of \( \bs{U} \) and has the Poisson distribution with parameter \( \lambda \in (0, \infty) \). Then \( V = \sum_{i=1}^N U_i \) has a compound Poisson distribution.

    But in fact, compound Poisson variables usually do arise in the context of an underlying Poisson process. In any event, the results above on the mean, variance, and generating function hold with \( r t \) replaced by \( \lambda \). Compound Poisson distributions are infinitely divisible. A famous theorem of William Feller gives a partial converse: an infinitely divisible distribution on \( \N \) must be compound Poisson.

    The negative binomial distribution on \( \N \) is infinitely divisible, and hence must be compound Poisson. Here is the construction:

    Let \( p, \, k \in (0, \infty) \). Suppose that \( \bs{U} = (U_1, U_2, \ldots) \) is a sequence of independent variables, each having the logarithmic series distribution with shape parameter \( 1 - p \). Suppose also that \( N \) is independent of \( \bs{U} \) and has the Poisson distribution with parameter \( - k \ln(p) \). Then \( V = \sum_{i=1}^N U_i \) has the negative binomial distribution on \( \N \) with parameters \( k \) and \( p \).

    Proof

    As noted above, the probability generating function of \( V \) is \( P(t) = \exp\left( \lambda [Q(t) - 1]\right) \) where \( \lambda \) is the parameter of the Poisson variable \( N \) and \( Q(t) \) is the common PGF of the terms in the sum. Using the PGF of the logarithmic series distribution, and the particular values of the parameters, we have \[ P(t) = \exp \left[-k \ln(p) \left(\frac{\ln[1 - (1 - p)t]}{\ln(p)} - 1\right)\right], \quad \left|t\right| \lt \frac{1}{1 - p} \] Using properties of logarithms and simple algebra, this reduces to \[ P(t) = \left(\frac{p}{1 - (1 - p)t}\right)^k, \quad \left|t\right| \lt \frac{1}{1 - p} \] which is the PGF of the negative binomial distribution with parameters \( k \) and \( p \).

    As a special case (\( k = 1 \)), it follows that the geometric distribution on \( \N \) is also compound Poisson.
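
    A simulation sketch of the construction, using numpy's logarithmic series sampler (with shape parameter \( 1 - p \)) and its negative binomial sampler, whose failure-count convention matches the distribution on \( \N \) used here; the parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)
k, p, m = 2.5, 0.4, 200_000

n = rng.poisson(-k * np.log(p), m)                 # N ~ Poisson(-k ln p)
v = np.array([rng.logseries(1 - p, j).sum() for j in n])   # compound sum
w = rng.negative_binomial(k, p, m)                 # direct negative binomial

print(f"mean: compound {v.mean():.3f}, direct {w.mean():.3f}")
print(f"var:  compound {v.var():.3f}, direct {w.var():.3f}")
```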

