# 14.8: Poisson Processes on General Spaces



## Basic Theory

### The Process

So far, we have studied the Poisson process as a model for random points in time. However there is also a Poisson model for random points in space. Some specific examples of such random points are

• Defects in a sheet of material.
• Raisins in a cake.
• Stars in the sky.

The Poisson process for random points in space can be defined in a very general setting. All that is really needed is a measure space $$(S, \mathscr{S}, \mu)$$. Thus, $$S$$ is a set (the underlying space for our random points), $$\mathscr{S}$$ is a $$\sigma$$-algebra of subsets of $$S$$ (as always, the allowable sets), and $$\mu$$ is a positive measure on $$(S, \mathscr{S})$$ (a measure of the size of sets). The most important special case is when $$S$$ is a (Lebesgue) measurable subset of $$\R^d$$ for some $$d \in \N_+$$, $$\mathscr{S}$$ is the $$\sigma$$-algebra of measurable subsets of $$S$$, and $$\mu = \lambda_d$$ is $$d$$-dimensional Lebesgue measure. Specializing further, recall the lower dimensional spaces:

1. When $$d = 1$$, $$S \subseteq \R$$ and $$\lambda_1$$ is length measure.
2. When $$d = 2$$, $$S \subseteq \R^2$$ and $$\lambda_2$$ is area measure.
3. When $$d = 3$$, $$S \subseteq \R^3$$ and $$\lambda_3$$ is volume measure.

Of course, the characterizations of the Poisson process on $$[0, \infty)$$ in terms of the inter-arrival times and in terms of the arrival times do not generalize, because they depend critically on the order relation on $$[0, \infty)$$. However, the characterization in terms of the counting process generalizes perfectly to our new setting. Thus, consider a process that produces random points in $$S$$, and as usual, let $$N(A)$$ denote the number of random points in $$A \in \mathscr{S}$$. Thus $$N$$ is a random counting measure on $$(S, \mathscr{S})$$.

The random measure $$N$$ is a Poisson process or a Poisson random measure on $$S$$ with density parameter $$r \gt 0$$ if the following axioms are satisfied:

1. If $$A \in \mathscr{S}$$ then $$N(A)$$ has the Poisson distribution with parameter $$r \mu(A)$$.
2. If $$\{A_i: i \in I\}$$ is a countable, disjoint collection of sets in $$\mathscr{S}$$ then $$\{N(A_i): i \in I\}$$ is a set of independent random variables.

To draw parallels with the Poisson process on $$[0, \infty)$$, note that axiom (1) is the generalization of stationary, Poisson-distributed increments, and axiom (2) is the generalization of independent increments. By convention, if $$\mu(A) = 0$$ then $$N(A) = 0$$ with probability 1, and if $$\mu(A) = \infty$$ then $$N(A) = \infty$$ with probability 1. (These distributions are considered degenerate members of the Poisson family.) On the other hand, note that if $$0 \lt \mu(A) \lt \infty$$ then $$N(A)$$ has support $$\N$$.

In the two-dimensional Poisson process, vary the width $$w$$ and the rate $$r$$. Note the location and shape of the probability density function of $$N$$. For selected values of the parameters, run the simulation 1000 times and compare the empirical density function to the true probability density function.
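The experiment in the app can be sketched in code. The following is a minimal simulation (assuming NumPy; the density, rectangle size, and subregion are arbitrary choices): it draws the total count from the Poisson distribution with parameter $$r \mu(S)$$ and places the points uniformly, a placement justified by the conditional uniformity result derived later in this section.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_poisson_2d(r, width, height, rng):
    """Simulate a Poisson process with density r on [0, width] x [0, height].

    Draw the total count from the Poisson distribution with parameter
    r * area, then place that many points uniformly in the rectangle.
    """
    n = rng.poisson(r * width * height)
    x = rng.uniform(0.0, width, size=n)
    y = rng.uniform(0.0, height, size=n)
    return np.column_stack([x, y])

# Count the points in the subregion A = [0, 1] x [0, 1] over many runs and
# compare the empirical mean and variance to the theoretical value r * mu(A) = r.
r = 5.0
counts = []
for _ in range(1000):
    pts = simulate_poisson_2d(r, 2.0, 2.0, rng)
    in_A = (pts[:, 0] <= 1.0) & (pts[:, 1] <= 1.0)
    counts.append(in_A.sum())
counts = np.array(counts)
print(counts.mean(), counts.var())  # both should be near r = 5
```

The empirical mean and variance of the subregion counts should both be close to $$r \mu(A) = 5$$, in line with the moment results below.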

For $$A \in \mathscr{S}$$,

1. $$\E\left[N(A)\right] = r \mu(A)$$
2. $$\var\left[N(A)\right] = r \mu(A)$$
Proof

These results follow, of course, from our previous study of the Poisson distribution. Recall that the parameter of the Poisson distribution is both the mean and the variance.

In particular, $$r$$ can be interpreted as the expected density of the random points (that is, the expected number of points in a region of unit size), justifying the name of the parameter.

In the two-dimensional Poisson process, vary the width $$w$$ and the density parameter $$r$$. Note the size and location of the mean$$\pm$$standard deviation bar of $$N$$. For various values of the parameters, run the simulation 1000 times and compare the empirical mean and standard deviation to the true mean and standard deviation.

### The Distribution of the Random Points

As before, the Poisson model defines the most random way to distribute points in space, in a certain sense. Assume that we have a Poisson process $$N$$ on $$(S, \mathscr{S}, \mu)$$ with density parameter $$r \in (0, \infty)$$.

Given that $$A \in \mathscr{S}$$ contains exactly one random point, the position $$X$$ of the point is uniformly distributed on $$A$$.

Proof

For $$B \in \mathscr{S}$$ with $$B \subseteq A$$, $\P\left[N(B) = 1 \mid N(A) = 1\right] = \frac{\P\left[N(B) = 1, N(A) = 1\right]}{\P\left[N(A) = 1\right]} = \frac{\P\left[N(B) = 1, N(A \setminus B) = 0\right]}{\P\left[N(A) = 1\right]} = \frac{\P\left[N(B) = 1\right] \P\left[N(A \setminus B) = 0\right]}{\P\left[N(A) = 1\right]}$ Using the Poisson distributions we have $\P\left[N(B) = 1 \mid N(A) = 1\right] = \frac{\exp\left[-r \mu(B)\right] \left[r \mu(B)\right] \exp\left[-r \mu(A \setminus B)\right]} {\exp\left[-r \mu(A)\right] \left[r \mu(A)\right]} = \frac{\mu(B)}{\mu(A)}$ As a function of $$B$$, this is the uniform distribution on $$A$$ (with respect to $$\mu$$).

More generally, if $$A$$ contains $$n$$ points, then the positions of the points are independent and each is uniformly distributed in $$A$$.

Suppose that $$A, \, B \in \mathscr{S}$$ and $$B \subseteq A$$. For $$n \in \N_+$$, the conditional distribution of $$N(B)$$ given $$N(A) = n$$ is the binomial distribution with trial parameter $$n$$ and success parameter $$p = \mu(B) \big/ \mu(A)$$.

Proof

For $$k \in \{0, 1, \ldots, n\}$$, $\P\left[N(B) = k \mid N(A) = n\right] = \frac{\P\left[N(B) = k, N(A) = n\right]}{\P\left[N(A) = n\right]} = \frac{\P\left[N(B) = k, N(A \setminus B) = n - k\right]}{\P\left[N(A) = n\right]} = \frac{\P\left[N(B) = k\right] \P\left[N(A \setminus B) = n - k\right]}{\P\left[N(A) = n\right]}$ Using the Poisson distributions, $\P\left[N(B) = k \mid N(A) = n\right] = \frac{\exp\left[-r \mu(B)\right] \left(\left[r \mu(B)\right]^k \big/ k!\right) \exp\left[-r \mu(A \setminus B)\right] \left(\left[r \mu(A \setminus B)\right]^{n-k} \big/ (n - k)!\right)}{\exp\left[-r \mu(A)\right] \left[r \mu(A)\right]^n \big/ n!}$ Canceling factors and letting $$p = \mu(B) \big/ \mu(A)$$, we have $\P\left[N(B) = k \mid N(A) = n\right] = \frac{n!}{k! (n-k)!} p^k (1 - p)^{n-k}$

Thus, given $$N(A) = n$$, each of the $$n$$ random points falls into $$B$$, independently, with probability $$p = \mu(B) \big/ \mu(A)$$, regardless of the density parameter $$r$$.

More generally, suppose that $$A \in \mathscr{S}$$ and that $$A$$ is partitioned into $$k$$ subsets $$(B_1, B_2, \ldots, B_k)$$ in $$\mathscr{S}$$. Then the conditional distribution of $$\left(N(B_1), N(B_2), \ldots, N(B_k)\right)$$ given $$N(A) = n$$ is the multinomial distribution with parameters $$n$$ and $$(p_1, p_2, \ldots p_k)$$, where $$p_i = \mu(B_i) \big/ \mu(A)$$ for $$i \in \{1, 2, \ldots, k\}$$.
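As a quick numerical check of the binomial result (assuming SciPy; the values of $$r$$, $$\mu(B)$$, $$\mu(A)$$, and $$n$$ are arbitrary choices), the conditional probabilities assembled from Poisson probability mass functions agree with the binomial probability mass function:

```python
from scipy.stats import poisson, binom

# Verify numerically that P[N(B) = k | N(A) = n] matches the binomial pmf
# for an illustrative example: r = 2, mu(B) = 1.5, mu(A) = 4, n = 10.
r, mu_B, mu_A, n = 2.0, 1.5, 4.0, 10
p = mu_B / mu_A  # success parameter, 0.375
for k in range(n + 1):
    # P[N(B) = k] P[N(A \ B) = n - k] / P[N(A) = n], using independence
    # of the counts on the disjoint sets B and A \ B
    joint = poisson.pmf(k, r * mu_B) * poisson.pmf(n - k, r * (mu_A - mu_B))
    cond = joint / poisson.pmf(n, r * mu_A)
    assert abs(cond - binom.pmf(k, n, p)) < 1e-12
print("conditional distribution matches Binomial(n = 10, p = 0.375)")
```

Note that the density parameter $$r$$ cancels, as the result asserts: changing $$r$$ leaves the conditional probabilities unchanged.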

### Thinning and Combining

Suppose that $$N$$ is a Poisson random process on $$(S, \mathscr{S}, \mu)$$ with density parameter $$r \in (0, \infty)$$. Thinning (or splitting) this process works just like thinning the Poisson process on $$[0, \infty)$$. Specifically, suppose that each random point, independently of the others, is either type 1 with probability $$p$$ or type 0 with probability $$1 - p$$, where $$p \in (0, 1)$$ is a new parameter. Let $$N_1$$ and $$N_0$$ denote the random counting measures associated with the type 1 and type 0 points, respectively. That is, $$N_i(A)$$ is the number of type $$i$$ random points in $$A$$, for $$A \in \mathscr{S}$$ and $$i \in \{0, 1\}$$.

$$N_0$$ and $$N_1$$ are independent Poisson processes on $$(S, \mathscr{S}, \mu)$$ with density parameters $$(1 - p) r$$ and $$p r$$, respectively.

Proof

The proof is like the one for the Poisson process on $$[0, \infty)$$. For $$j, \; k \in \N$$, $\P\left[N_0(A) = j, N_1(A) = k\right] = \P\left[N_1(A) = k, N(A) = j + k\right] = \P\left[N(A) = j + k\right] \P\left[N_1(A) = k \mid N(A) = j + k\right]$ But given $$N(A) = n$$, the number of type 1 points $$N_1(A)$$ has the binomial distribution with parameters $$n$$ and $$p$$. Hence letting $$t = \mu(A)$$ to simplify the notation, we have $\P\left[N_0(A) = j, N_1(A) = k\right] = e^{-r t} \frac{(r t)^{j+k}}{(j + k)!} \frac{(j + k)!}{j! k!} p^k (1 - p)^j = e^{-p r t} \frac{(p r t)^k}{k!} e^{-(1 - p) r t} \frac{\left[(1 - p) r t\right]^j}{j!}$ It follows from the factorization theorem that $$N_0(A)$$ has the Poisson distribution with parameter $$(1 - p) r \mu(A)$$, $$N_1(A)$$ has the Poisson distribution with parameter $$p r \mu(A)$$, and $$N_0(A)$$ and $$N_1(A)$$ are independent. Next suppose that $$\{A_i: i \in I\}$$ is a countable, disjoint collection of sets in $$\mathscr{S}$$. Then $$\{N_0(A_i): i \in I\}$$ and $$\{N_1(A_i): i \in I\}$$ are each independent sets of random variables, and the two sets are independent of each other.
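Thinning is easy to check by simulation. The sketch below (assuming NumPy; the parameter values are arbitrary choices) draws the total count for a set $$A$$, thins it with Bernoulli trials, and compares the type counts to their theoretical Poisson means; the near-zero sample correlation reflects the independence of the thinned counts.

```python
import numpy as np

rng = np.random.default_rng(1)

# Thin a Poisson count: each of the N(A) points is type 1 with probability p.
r, mu_A, p, runs = 4.0, 2.0, 0.3, 20000
n = rng.poisson(r * mu_A, size=runs)  # N(A) ~ Poisson(r mu(A)) = Poisson(8)
n1 = rng.binomial(n, p)               # type-1 count, binomial given N(A)
n0 = n - n1                           # type-0 count

print(n1.mean())                  # near p * r * mu(A) = 2.4
print(n0.mean())                  # near (1 - p) * r * mu(A) = 5.6
print(np.corrcoef(n1, n0)[0, 1])  # near 0: the thinned counts are independent
```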

This result extends naturally to $$k \in \N_+$$ types. As in the standard case, combining independent Poisson processes produces a new Poisson process, and the density parameters add.

Suppose that $$N_0$$ and $$N_1$$ are independent Poisson processes on $$(S, \mathscr{S}, \mu)$$, with density parameters $$r_0$$ and $$r_1$$, respectively. Then the process obtained by combining the random points is also a Poisson process on $$(S, \mathscr{S}, \mu)$$ with density parameter $$r_0 + r_1$$.

Proof

The new random measure, of course, is simply $$N = N_0 + N_1$$. Thus for $$A \in \mathscr{S}$$, $$N(A) = N_0(A) + N_1(A)$$. But $$N_i(A)$$ has the Poisson distribution with parameter $$r_i \mu(A)$$ for $$i \in \{0, 1\}$$, and the variables are independent, so $$N(A)$$ has the Poisson distribution with parameter $$r_0 \mu(A) + r_1 \mu(A) = (r_0 + r_1)\mu(A)$$. Next suppose that $$\{A_i: i \in I\}$$ is a countable, disjoint collection of sets in $$\mathscr{S}$$. Then $$\{N(A_i): i \in I\} = \left\{N_0(A_i) + N_1(A_i): i \in I\right\}$$ is a set of independent random variables.
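The additivity of the density parameters can be checked the same way (assuming NumPy; the parameter values are arbitrary choices): superposing two independent Poisson counts should produce a count whose mean and variance both equal $$(r_0 + r_1)\mu(A)$$.

```python
import numpy as np

rng = np.random.default_rng(2)

# Superpose two independent Poisson counts on the same set A and check
# that the combined count behaves like Poisson with the summed density.
r0, r1, mu_A, runs = 1.5, 2.5, 3.0, 20000
n = rng.poisson(r0 * mu_A, size=runs) + rng.poisson(r1 * mu_A, size=runs)

print(n.mean(), n.var())  # both near (r0 + r1) * mu(A) = 12
```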

## Applications and Special Cases

### Non-homogeneous Poisson Processes

A non-homogeneous Poisson process on $$[0, \infty)$$ can be thought of simply as a Poisson process on $$[0, \infty)$$ with respect to a measure that is not the standard Lebesgue measure $$\lambda_1$$ on $$[0, \infty)$$. Thus suppose that $$r: [0, \infty) \to (0, \infty)$$ is piece-wise continuous with $$\int_0^\infty r(t) \, dt = \infty$$, and let $m(t) = \int_0^t r(s) \, ds, \quad t \in [0, \infty)$ Consider the non-homogeneous Poisson process with rate function $$r$$ (and hence mean function $$m$$). Recall that the Lebesgue-Stieltjes measure on $$[0, \infty)$$ associated with $$m$$ (which we also denote by $$m$$) is defined by the condition $m(a, b] = m(b) - m(a), \quad a, \, b \in [0, \infty), \; a \lt b$ Equivalently, $$m$$ is the measure that is absolutely continuous with respect to $$\lambda_1$$, with density function $$r$$. That is, if $$A$$ is a measurable subset of $$[0, \infty)$$ then $m(A) = \int_A r(t) \, dt$

The non-homogeneous Poisson process on $$[0, \infty)$$ with rate function $$r$$ is the Poisson process on $$[0, \infty)$$ with respect to the measure $$m$$.

Proof

This follows directly from the definitions. If $$N$$ denotes the counting process associated with the non-homogeneous Poisson process, then $$N$$ has independent increments, and for $$s, \, t \in [0, \infty)$$ with $$s \lt t$$, $$N(s, t]$$ has the Poisson distribution with parameter $$m(t) - m(s) = m(s, t]$$.
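One standard way to simulate such a process is the Lewis-Shedler thinning algorithm: generate a homogeneous process with a dominating rate and keep each point $$t$$ with probability $$r(t) / r_{\max}$$. A minimal sketch (assuming NumPy; the rate function $$r(t) = 2t$$ and the interval are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(3)

def nhpp_thinning(rate, r_max, T, rng):
    """Simulate a non-homogeneous Poisson process on [0, T] by thinning.

    Generate a homogeneous Poisson process with rate r_max, then keep
    each point t independently with probability rate(t) / r_max
    (Lewis-Shedler thinning; r_max must dominate rate on [0, T]).
    """
    n = rng.poisson(r_max * T)
    t = rng.uniform(0.0, T, size=n)          # homogeneous points on [0, T]
    keep = rng.uniform(size=n) < rate(t) / r_max
    return np.sort(t[keep])

# With r(t) = 2t on [0, 3], the mean function is m(t) = t^2, so the
# expected number of points in [0, 3] is m(3) = 9.
counts = [len(nhpp_thinning(lambda t: 2 * t, 6.0, 3.0, rng)) for _ in range(2000)]
print(np.mean(counts))  # near m(3) = 9
```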

### Nearest Points in $$\R^d$$

In this subsection, we consider a rather specialized topic, but one that is fun and interesting. Consider the Poisson process on $$\left(\R^d, \mathscr{R}_d, \lambda_d\right)$$ with density parameter $$r \gt 0$$, where as usual, $$\mathscr{R}_d$$ is the $$\sigma$$-algebra of Lebesgue measurable subsets of $$\R^d$$, and $$\lambda_d$$ is $$d$$-dimensional Lebesgue measure. We use the usual Euclidean norm on $$\R^d$$: $\|\bs{x}\|_d = \left(x_1^2 + x_2^2 + \cdots + x_d^2\right)^{1/2}, \quad \bs{x} = (x_1, x_2, \ldots, x_d) \in \R^d$ For $$t \gt 0$$, let $$B_t = \left\{\bs{x} \in \R^d: \|\bs{x}\|_d \le t\right\}$$ denote the ball of radius $$t$$ centered at the origin. Recall that $$\lambda_d(B_t) = c_d t^d$$ where $c_d = \frac{\pi^{d/2}}{\Gamma(d/2 + 1)}$ is the measure of the unit ball in $$\R^d$$, and where $$\Gamma$$ is the gamma function. Of course, $$c_1 = 2$$, $$c_2 = \pi$$, $$c_3 = 4 \pi / 3$$.

For $$t \ge 0$$, let $$M_t = N(B_t)$$, the number of random points in the ball $$B_t$$, or equivalently, the number of random points within distance $$t$$ of the origin. From our formula for the measure of $$B_t$$ above, it follows that $$M_t$$ has the Poisson distribution with parameter $$r c_d t^d$$.

Now let $$Z_0 = 0$$ and for $$n \in \N_+$$ let $$Z_n$$ denote the distance of the $$n$$th closest random point to the origin. Note that $$Z_n$$ is analogous to the $$n$$th arrival time for the Poisson process on $$[0, \infty)$$. Clearly the processes $$\bs{M} = (M_t: t \ge 0)$$ and $$\bs{Z} = (Z_0, Z_1, \ldots)$$ are inverses of each other in the sense that $$Z_n \le t$$ if and only if $$M_t \ge n$$. Both of these events mean that there are at least $$n$$ random points within distance $$t$$ of the origin.

For $$n \in \N_+$$:

1. $$c_d Z_n^d$$ has the gamma distribution with shape parameter $$n$$ and rate parameter $$r$$.
2. $$Z_n$$ has probability density function $$g_n$$ given by $g_n(z) = \frac{d \left(c_d r\right)^n z^{n d - 1}}{(n-1)!} \exp\left(-r c_d z^d\right), \quad 0 \le z \lt \infty$
Proof

Let $$T_n = c_d Z_n^d$$.

1. From the inverse relationship above, $\P(T_n \le t) = \P\left[Z_n \le \left(t / c_d\right)^{1/d}\right] = \P\left\{M\left[\left(t / c_d\right)^{1/d}\right] \ge n\right\}$ But $$M\left[\left(t / c_d\right)^{1/d}\right]$$ has the Poisson distribution with parameter $$r c_d \left[\left(t / c_d\right)^{1/d}\right]^d = r t$$, so $\P(T_n \le t) = \sum_{k=n}^\infty e^{-r t} \frac{(r t)^k}{k!}$ which we know is the gamma CDF with parameters $$n$$ and $$r$$.
2. Let $$f_n$$ denote the gamma PDF with parameters $$n$$ and $$r$$ and let $$t = c_d z^d$$. From the standard change of variables formula, $g_n(z) = f_n(t) \frac{dt}{dz}$ Substituting and simplifying gives the result.

The differences $$c_d Z_n^d - c_d Z_{n-1}^d$$ for $$n \in \N_+$$ are independent, and each has the exponential distribution with rate parameter $$r$$.
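The distribution of $$T_n = c_d Z_n^d$$ can be checked by simulation in the plane (assuming NumPy; the window size and parameter values are arbitrary choices, with the window taken large enough that the three nearest points essentially always lie inside it).

```python
import numpy as np

rng = np.random.default_rng(4)

# Poisson process with density r on the square [-4, 4]^2, which is large
# enough that the 3 nearest points to the origin are inside it in practice.
r, half_side, runs, k = 2.0, 4.0, 2000, 3
c2 = np.pi  # measure of the unit ball in R^2

t = []
for _ in range(runs):
    n = rng.poisson(r * (2 * half_side) ** 2)
    pts = rng.uniform(-half_side, half_side, size=(n, 2))
    z = np.sort(np.hypot(pts[:, 0], pts[:, 1]))[:k]  # Z_1 <= Z_2 <= Z_3
    t.append(c2 * z ** 2)                            # T_n = c_2 Z_n^2
t = np.array(t)

# T_n should have the gamma distribution with shape n and rate r,
# so its mean is n / r.
print(t.mean(axis=0))  # near [0.5, 1.0, 1.5] for r = 2
```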

## Computational Exercises

Suppose that defects in a sheet of material follow the Poisson model with an average of 1 defect per 2 square meters. Consider a 5 square meter sheet of material.

1. Find the probability that there will be at least 3 defects.
2. Find the mean and standard deviation of the number of defects.
Answer

1. 0.4562
2. 2.5, 1.581
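These answers can be reproduced numerically (assuming SciPy): the defect count has the Poisson distribution with parameter $$r \mu(A) = 0.5 \times 5 = 2.5$$.

```python
import math
from scipy.stats import poisson

# Defects: density 1 per 2 square meters over a 5 square meter sheet,
# so N has the Poisson distribution with parameter 0.5 * 5 = 2.5.
mean = 0.5 * 5
print(1 - poisson.cdf(2, mean))  # P(N >= 3), about 0.4562
print(mean, math.sqrt(mean))     # mean 2.5, sd about 1.581
```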

Suppose that raisins in a cake follow the Poisson model with an average of 2 raisins per cubic inch. Consider a slab of cake that measures 3 by 4 by 1 inches.

1. Find the probability that there will be no more than 20 raisins.
2. Find the mean and standard deviation of the number of raisins.
Answer

1. 0.2426
2. 24, 4.899

Suppose that the occurrence of trees of a certain type that exceed a certain critical size follows the Poisson model. In a one-half square mile region of the forest there are 40 trees that exceed the specified size.

1. Estimate the density parameter.
2. Using the estimated density parameter, find the probability of finding at least 100 trees that exceed the specified size in a square mile region of the forest.

Answer

1. $$r = 80$$ per square mile
2. 0.0171
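The tail probability can be verified numerically (assuming SciPy):

```python
from scipy.stats import poisson

# Trees: 40 in half a square mile gives the estimated density r = 80 per
# square mile, so the count in a one square mile region is Poisson(80).
print(1 - poisson.cdf(99, 80))  # P(N >= 100), about 0.0171
```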

Suppose that defects in a type of material follow the Poisson model. It is known that a square sheet with side length 2 meters contains one defect. Find the probability that the defect is in a circular region of the material with radius $$\frac{1}{4}$$ meter.

Answer

0.0491

Suppose that raisins in a cake follow the Poisson model. A 6 cubic inch piece of the cake with 20 raisins is divided into 3 equal parts. Find the probability that each piece has at least 6 raisins.

Answer

0.2146
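This answer can be computed exactly by summing the multinomial probability mass function over the qualifying outcomes (a sketch using only the standard library):

```python
from math import factorial

# Raisins: given 20 raisins in the slab, the counts in the 3 equal pieces
# are multinomial with n = 20 and probabilities (1/3, 1/3, 1/3). Sum the
# pmf over all outcomes with at least 6 raisins in every piece.
n = 20
prob = 0.0
for n1 in range(6, n + 1):
    for n2 in range(6, n - n1 + 1):
        n3 = n - n1 - n2
        if n3 >= 6:
            coef = factorial(n) // (factorial(n1) * factorial(n2) * factorial(n3))
            prob += coef * (1 / 3) ** n
print(round(prob, 4))  # 0.2146
```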

Suppose that defects in a sheet of material follow the Poisson model, with an average of 5 defects per square meter. Each defect, independently of the others, is mild with probability 0.5, moderate with probability 0.3, or severe with probability 0.2. Consider a circular piece of the material with radius 1 meter.

1. Give the mean and standard deviation of the number of defects of each type in the piece.
2. Find the probability that there will be at least 2 defects of each type in the piece.