# 14.6: Non-homogeneous Poisson Processes


## Basic Theory

A non-homogeneous Poisson process is similar to an ordinary Poisson process, except that the average rate of arrivals is allowed to vary with time. Many applications that generate random points in time are modeled more faithfully with such non-homogeneous processes. The mathematical cost of this generalization, however, is that we lose the property of stationary increments.

Non-homogeneous Poisson processes are best described in measure-theoretic terms. Thus, you may need to review the sections on measure theory in the chapters on Foundations, Probability Measures, and Distributions. Our basic measure space in this section is \( [0, \infty) \) with the \( \sigma \)-algebra of Borel measurable subsets (named for Émile Borel). As usual, \( \lambda \) denotes Lebesgue measure on this space, named for Henri Lebesgue. Recall that the Borel \( \sigma \)-algebra is the one generated by the intervals, and \( \lambda \) is the generalization of length on intervals.

### Definition and Basic Properties

Of our various characterizations of the ordinary Poisson process, in terms of the inter-arrival times, the arrival times, and the counting process, the characterization involving the counting process leads to the most natural generalization to non-homogeneous processes. Thus, consider a process that generates random points in time, and as usual, let \( N_t \) denote the number of random points in the interval \( (0, t] \) for \( t \ge 0 \), so that \( \bs{N} = \{N_t: t \ge 0\} \) is the counting process. More generally, \( N(A) \) denotes the number of random points in a measurable set \( A \subseteq [0, \infty) \), so \( N \) is our random counting measure. As before, \( t \mapsto N_t \) is a (random) distribution function, and \( A \mapsto N(A) \) is the (random) measure associated with this distribution function.

Suppose now that \( r: [0, \infty) \to [0, \infty) \) is measurable, and define \( m: [0, \infty) \to [0, \infty) \) by \[ m(t) = \int_{(0, t]} r(s) \, d\lambda(s) \] From properties of the integral, \( m \) is increasing and right-continuous on \( [0, \infty) \) and hence is a distribution function. The positive measure on \( [0, \infty) \) associated with \( m \) (which we will also denote by \( m \)) is defined on a measurable \( A \subseteq [0, \infty) \) by \[ m(A) = \int_A r(s) \, d\lambda(s) \] Thus, \( m(t) = m(0, t] \), and for \( s, \, t \in [0, \infty) \) with \( s \lt t \), \( m(s, t] = m(t) - m(s) \). Finally, note that the measure \( m \) is absolutely continuous with respect to \( \lambda \), and \( r \) is the density function. Note the parallels between the *random* distribution function and measure \( N \) and the *deterministic* distribution function and measure \( m \). With the setup involving \( r \) and \( m \) complete, we are ready for our first definition.

A process that produces random points in time is a non-homogeneous Poisson process with rate function \( r \) if the counting process \( N \) satisfies the following properties:

- If \( \{A_i: i \in I\} \) is a countable, disjoint collection of measurable subsets of \( [0, \infty) \) then \( \{N(A_i): i \in I\} \) is a collection of independent random variables.
- If \( A \subseteq [0, \infty) \) is measurable then \( N(A) \) has the Poisson distribution with parameter \( m(A) \).

Property (a) is our usual property of independent increments, while property (b) is a natural generalization of the property of Poisson distributed increments. Clearly, if \( r \) is a positive constant, then \( m(t) = r t \) for \( t \in [0, \infty) \) and as a measure, \( m \) is proportional to Lebesgue measure \( \lambda \). In this case, the non-homogeneous process reduces to an ordinary, homogeneous Poisson process with rate \( r \). However, if \( r \) is not constant, then \( m \) is not linear, and as a measure, is not proportional to Lebesgue measure. In this case, the process does not have stationary increments with respect to \( \lambda \), but does, of course, have stationary increments with respect to \( m \). That is, if \( A, \, B \) are measurable subsets of \( [0, \infty) \) and \( \lambda(A) = \lambda(B) \) then \( N(A) \) and \( N(B) \) will not in general have the same distribution, but of course they will have the same distribution if \( m(A) = m(B) \).
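For a concrete non-constant rate, the mean function can be approximated numerically from its defining integral. A minimal Python sketch, using the hypothetical rate function \( r(t) = 2t \) (so that \( m(t) = t^2 \)):

```python
# Hypothetical rate function r(t) = 2t, so the mean function is m(t) = t^2
def r(t):
    return 2.0 * t

def m(t, steps=10000):
    """Approximate m(t) = integral of r over (0, t] by the trapezoidal rule."""
    h = t / steps
    total = 0.5 * (r(0.0) + r(t))
    for k in range(1, steps):
        total += r(k * h)
    return total * h

print(m(3.0))   # close to 3^2 = 9
```

Since \( r \) is linear here, the trapezoidal rule is essentially exact; for a general rate function, the same sketch gives a numerical approximation of the mean measure \( m(s, t] = m(t) - m(s) \).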

In particular, recall that the parameter of the Poisson distribution is both the mean and the variance, so \( \E\left[N(A)\right] = \var\left[N(A)\right] = m(A) \) for measurable \( A \subseteq [0, \infty) \), and in particular, \( \E(N_t) = \var(N_t) = m(t) \) for \( t \in [0, \infty) \). The function \( m \) is usually called the mean function. Since \( m^\prime(t) = r(t) \) (if \( r \) is continuous at \( t \)), it makes sense to refer to \( r \) as the rate function. Locally, at \( t \), the arrivals are occurring at an average rate of \( r(t) \) per unit time.
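The mean and variance identity can be checked directly from the Poisson probability mass function. A small sketch, assuming a hypothetical value \( m(A) = 2.5 \) and truncating the distribution at 100 terms (the neglected tail is vanishingly small):

```python
import math

# N(A) is Poisson with parameter m(A); here m(A) = 2.5 is a hypothetical value.
mean_measure = 2.5
pmf = [math.exp(-mean_measure) * mean_measure ** k / math.factorial(k)
       for k in range(100)]
ev = sum(k * p for k, p in enumerate(pmf))            # E[N(A)]
var = sum(k ** 2 * p for k, p in enumerate(pmf)) - ev ** 2   # var[N(A)]
print(sum(pmf), ev, var)   # total mass close to 1; mean and variance close to 2.5
```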

As before, from a modeling point of view, the property of independent increments can reasonably be evaluated. But we need something more primitive to replace the property of Poisson increments. Here is the main theorem.

A process that produces random points in time is a non-homogeneous Poisson process with rate function \( r \) if and only if the counting process \( \bs{N} \) satisfies the following properties:

- If \( \{A_i: i \in I\} \) is a countable, disjoint collection of measurable subsets of \( [0, \infty) \) then \( \{N(A_i): i \in I\} \) is a collection of independent random variables.
- For \( t \in [0, \infty) \), \begin{align} &\frac{\P\left[N(t, t + h] = 1\right]}{h} \to r(t) \text{ as } h \downarrow 0 \\ &\frac{\P\left[N(t, t + h] > 1\right]}{h} \to 0 \text{ as } h \downarrow 0 \end{align}

So if \( h \) is small, the probability of a single arrival in \( (t, t + h] \) is approximately \( r(t) h \), while the probability of more than one arrival in this interval is negligible.
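These limits can be checked numerically. Under the hypothetical rate \( r(t) = 2t \), with \( m(t) = t^2 \), the number of arrivals in \( (t, t + h] \) is Poisson with parameter \( m(t + h) - m(t) \), so at \( t = 1 \) the ratio \( \P[N(t, t + h] = 1] / h \) should approach \( r(1) = 2 \):

```python
import math

# N(t, t+h] is Poisson with parameter m(t+h) - m(t), where m(t) = t^2.
def prob_one_arrival(t, h):
    lam = (t + h) ** 2 - t ** 2          # m(t+h) - m(t)
    return lam * math.exp(-lam)          # P[Poisson(lam) = 1]

t = 1.0                                  # here r(t) = 2
for h in [0.1, 0.01, 0.001]:
    print(prob_one_arrival(t, h) / h)    # approaches 2 as h decreases
```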

### Arrival Times and Time Change

Suppose that we have a non-homogeneous Poisson process with rate function \( r \), as defined above. As usual, let \( T_n \) denote the time of the \( n \)th arrival for \( n \in \N \). As with the ordinary Poisson process, we have an inverse relation between the counting process \( \bs{N} = \{N_t: t \in [0, \infty)\} \) and the arrival time sequence \( \bs{T} = \{T_n: n \in \N\} \), namely \( T_n = \min\{t \in [0, \infty): N_t = n\}\), \(N_t = \#\{n \in \N: T_n \le t\} \), and \( \{T_n \le t\} = \{N_t \ge n\} \), since both events mean at least \( n \) random points in \( (0, t] \). The last relationship allows us to get the distribution of \( T_n \).

For \( n \in \N_+ \), \( T_n \) has probability density function \( f_n \) given by \[ f_n(t) = \frac{m^{n-1}(t)}{(n - 1)!} r(t) e^{-m(t)}, \quad t \in [0, \infty) \]

## Proof

Using the inverse relationship above and the Poisson distribution of \( N_t \), the distribution function of \( T_n \) is \[ \P(T_n \le t) = \P(N_t \ge n) = \sum_{k=n}^\infty e^{-m(t)} \frac{m^k(t)}{k!}, \quad t \in [0, \infty) \] Differentiating with respect to \( t \) gives \[ f_n(t) = \sum_{k=n}^\infty \left[-m^\prime(t) e^{-m(t)} \frac{m^k(t)}{k!} + e^{-m(t)} \frac{k m^{k-1}(t) m^\prime(t)}{k!}\right] = r(t) e^{-m(t)} \sum_{k=n}^\infty \left[\frac{m^{k-1}(t)}{(k - 1)!} - \frac{m^k(t)}{k!}\right] \] The last sum collapses to \( m^{n-1}(t) \big/ (n - 1)! \).

In particular, \( T_1 \) has probability density function \( f_1 \) given by \[ f_1(t) = r(t) e^{-m(t)}, \quad t \in [0, \infty) \] Recall that in reliability terms, \( r \) is the failure rate function, and that the reliability function is the right distribution function: \[ F_1^c(t) = \P(T_1 \gt t) = e^{-m(t)}, \quad t \in [0, \infty) \] In general, the functional form of \( f_n \) is clearly similar to the probability density function of the gamma distribution, and indeed, \( T_n \) can be transformed into a random variable with a gamma distribution. This amounts to a time change which will give us additional insight into the non-homogeneous Poisson process.
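As a sanity check, \( f_1 \) should integrate to 1. Continuing with the hypothetical rate \( r(t) = 2t \), so that \( f_1(t) = 2 t e^{-t^2} \), a simple Riemann sum over \( (0, 10] \) suffices (the tail beyond 10 is negligible):

```python
import math

# Density of the first arrival time for the hypothetical rate r(t) = 2t:
# f_1(t) = r(t) * exp(-m(t)) = 2 t exp(-t^2)
def f1(t):
    return 2.0 * t * math.exp(-t ** 2)

h = 1e-4
total = h * sum(f1(k * h) for k in range(1, 100000))  # Riemann sum over (0, 10]
print(total)   # close to 1
```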

Let \( U_n = m(T_n) \) for \( n \in \N_+ \). Then \( U_n \) has the gamma distribution with shape parameter \( n \) and rate parameter \( 1 \).

## Proof

Let \( g_n \) denote the PDF of \( U_n \). Since \( m \) is strictly increasing and differentiable, we can use the standard change of variables formula. So letting \( u = m(t) \), the relationship is \[ g_n(u) = f_n(t) \frac{dt}{du} \] Simplifying gives \( g_n(u) = u^{n-1} e^{-u} \big/(n - 1)! \) for \( u \in [0, \infty) \).

Thus, the time change \( u = m(t) \) transforms the non-homogeneous Poisson process into a standard (rate 1) Poisson process. Here is an equivalent way to look at the time change result.

For \( u \in [0, \infty) \), let \( M_u = N_t \) where \( t = m^{-1}(u) \) (assuming that \( m \) is strictly increasing, so that the inverse is well defined). Then \( \{M_u: u \in [0, \infty)\} \) is the counting process for a standard, rate 1 Poisson process.

## Proof

- Suppose that \( (u_1, u_2, \ldots) \) is a sequence of points in \( [0, \infty) \) with \( 0 \le u_1 \lt u_2 \lt \cdots \). Since \( m^{-1} \) is strictly increasing, we have \( 0 \le t_1 \lt t_2 \lt \cdots \), where of course \( t_i = m^{-1}(u_i) \). By assumption, the sequence of random variables \( \left(N_{t_1}, N_{t_2} - N_{t_1}, \ldots\right) \) is independent, but this is also the sequence \( \left(M_{u_1}, M_{u_2} - M_{u_1}, \ldots\right) \).
- Suppose that \( u, \, v \in [0, \infty) \) with \( u \lt v \), and let \( s = m^{-1}(u) \) and \( t = m^{-1}(v) \). Then \( s \lt t \) and so \( M_v - M_u = N_t - N_s \) has the Poisson distribution with parameter \( m(t) - m(s) = v - u \).

Equivalently, we can transform a standard (rate 1) Poisson process into a non-homogeneous Poisson process with a time change.

Suppose that \( \bs{M} = \{M_u: u \in [0, \infty)\} \) is the counting process for a standard Poisson process, and let \( N_t = M_{m(t)} \) for \( t \in [0, \infty) \). Then \( \{N_t: t \in [0, \infty)\} \) is the counting process for a non-homogeneous Poisson process with mean function \( m \) (and rate function \( r \)).

## Proof

- Let \( (t_1, t_2, \ldots) \) be a sequence of points in \( [0, \infty) \) with \( 0 \le t_1 \lt t_2 \lt \cdots \). Since \( m \) is strictly increasing, we have \( 0 \le m(t_1) \lt m(t_2) \lt \cdots \). Hence \( \left(M_{m(t_1)}, M_{m(t_2)} - M_{m(t_1)}, \ldots\right) \) is a sequence of independent variables. But this sequence is simply \( \left(N_{t_1}, N_{t_2} - N_{t_1}, \ldots\right) \).
- Suppose that \( s, \, t \in [0, \infty) \) with \( s \lt t \). Then \( N_t - N_s = M_{m(t)} - M_{m(s)} \) has the Poisson distribution with parameter \( m(t) - m(s) \).
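The time-change construction translates directly into a simulation scheme: generate the arrival times \( U_1, U_2, \ldots \) of a standard Poisson process as partial sums of independent, rate 1 exponential variables, and set \( T_n = m^{-1}(U_n) \). A Monte Carlo sketch with the hypothetical rate \( r(t) = 2t \), so that \( m(t) = t^2 \) and \( m^{-1}(u) = \sqrt{u} \):

```python
import math
import random

def simulate_arrivals(t_max, rng):
    """Arrival times T_n = m^{-1}(U_n) in (0, t_max], where the U_n are
    the arrival times of a standard (rate 1) Poisson process."""
    arrivals = []
    u = 0.0
    while True:
        u += rng.expovariate(1.0)   # next standard Poisson arrival time U_n
        t = math.sqrt(u)            # time change: t = m^{-1}(u)
        if t > t_max:
            return arrivals
        arrivals.append(t)

rng = random.Random(2024)
counts = [len(simulate_arrivals(2.0, rng)) for _ in range(20000)]
mean_count = sum(counts) / len(counts)
print(mean_count)   # close to m(2) = 4
```

The empirical mean of \( N_2 \) should be close to \( m(2) = 4 \), in line with the identity \( \E(N_t) = m(t) \).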