Section 13 Poisson process with Poisson increments

  • Reminder: the Poisson distribution
  • The Poisson process has independent Poisson increments
  • Summed and marked Poisson processes

13.1 Poisson distribution

In the next three sections we’ll be considering the Poisson process, a continuous time discrete space process with the Markov property. Given its name, it’s not surprising to hear the Poisson process is related to the Poisson distribution. Let’s start with a reminder of that.

Recall that a discrete random variable \(X\) has a Poisson distribution with rate \(\lambda\), written \(X \sim \operatorname{Po}(\lambda)\), if its probability mass function is \[ \mathbb P(X = n) = \mathrm{e}^{-\lambda} \frac{\lambda^n}{n!} \qquad n = 0,1,2,\dots. \]

The Poisson distribution is often used to model the number of “arrivals” in a fixed amount of time – for example, the number of calls to a call centre in one hour, the number of claims to an insurance company in one year, or the number of particles decaying from a large amount of radioactive material in one second.

The Poisson distribution is named after the French mathematician and physicist Siméon Denis Poisson, who studied it in 1837, although it was used by another French mathematician, Abraham de Moivre, more than 100 years earlier.

Recall the following facts about the Poisson distribution:

  1. Its expectation is \(\mathbb EX = \lambda\) and the variance is \(\operatorname{Var}(X) = \lambda\).
  2. If \(X \sim \operatorname{Po}(\lambda)\) and \(Y \sim \operatorname{Po}(\mu)\) are independent, then \(X + Y \sim \operatorname{Po}(\lambda+\mu)\).
  3. Let \(X \sim \operatorname{Po}(\lambda)\) represent some arrivals, and independently “mark” each arrival with probability \(p\). Then the number of marked arrivals \(Y\) has distribution \(Y \sim \operatorname{Po}(p\lambda)\), the number of unmarked arrivals \(Z\) has distribution \(Z \sim \operatorname{Po}((1-p)\lambda)\), and \(Y\) and \(Z\) are independent.

(I say “recall”, but it’s OK if the third one is new to you; there’s a small simulation check of it in the sketch below.)
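If the thinning fact is new to you, a quick way to convince yourself of it is by simulation. The following Python sketch is not part of the notes, and the values \(\lambda = 10\) and \(p = 0.3\) are just illustrative choices; it assumes NumPy is available. It samples many \(\operatorname{Po}(\lambda)\) totals, marks each arrival independently with probability \(p\), and compares the marked and unmarked counts with what \(\operatorname{Po}(p\lambda)\) and \(\operatorname{Po}((1-p)\lambda)\) would predict.

```python
import numpy as np

rng = np.random.default_rng(42)
lam, p, reps = 10.0, 0.3, 100_000   # illustrative values, not from the notes

x = rng.poisson(lam, size=reps)     # X ~ Po(lam): total arrivals in each trial
y = rng.binomial(x, p)              # mark each arrival independently with probability p
z = x - y                           # the unmarked arrivals

print(y.mean(), p * lam)            # sample mean of Y vs p * lam = 3
print(z.mean(), (1 - p) * lam)      # sample mean of Z vs (1 - p) * lam = 7
print(np.corrcoef(y, z)[0, 1])      # close to 0, consistent with independence
```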

13.2 Definition 1: Poisson increments

Suppose that, instead of just modelling the number of arrivals in a fixed amount of time, we want to continually model the total number of arrivals as it changes over time. This will be a stochastic process with discrete state space \(\mathcal S = \mathbb Z_+ = \{0,1,2,\dots\}\) and continuous time \(\mathbb R_+ = [0,\infty)\). In continuous time, we will normally write stochastic processes as \((X(t))\), with the time variable being a \(t\) in brackets, rather than a subscript \(n\) as we had in discrete time.

Suppose calls arrive at a call centre at a rate of \(\lambda = 100\) an hour. The following assumptions seem reasonable:

  • We begin counting with \(X(0) = 0\) calls.
  • The number of calls in the first hour is \(X(1) \sim \operatorname{Po}(100)\). The number of calls in the second hour \(X(2) - X(1)\) will also be \(\operatorname{Po}(100)\), and will be independent of the number of calls in the first hour.
  • The number of calls in a two hour period will be \(X(t+2) - X(t) \sim \operatorname{Po}(200)\), while the number of calls in a half-hour period will be \(X(t+\frac12) - X(t) \sim \operatorname{Po}(50)\).

These properties will define the Poisson process.

Definition 13.1 The Poisson process with rate \(\lambda\) is defined as follows. It is a stochastic process \((X(t))\) with continuous time \(t \in [0,\infty)\) and discrete state space \(\mathcal S = \mathbb Z_+\) with the following properties:

  1. \(X(0) = 0\);
  2. Poisson increments: \(X(t+s) - X(t) \sim \operatorname{Po}(\lambda s)\) for all \(t \geq 0\) and \(s > 0\);
  3. independent increments: \(X(t_2) - X(t_1)\) and \(X(t_4) - X(t_3)\) are independent for all \(t_1 \leq t_2 \leq t_3 \leq t_4\).

Note that the condition \(t_1 \leq t_2 \leq t_3 \leq t_4\) means that the time interval from \(t_1\) to \(t_2\) and the time interval from \(t_3\) to \(t_4\) don’t overlap. (Overlapping time intervals will not have independent increments, as arrivals in the overlap will count for both.)
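To see what the definition means in practice, here is a rough simulation sketch (not part of the course material; it assumes NumPy, and the rate and step size are arbitrary choices). It builds a sample path of a Poisson process on a grid of time points by adding up independent \(\operatorname{Po}(\lambda\,\Delta t)\) increments, exactly as properties 1–3 allow.

```python
import numpy as np

rng = np.random.default_rng(1)
lam, dt, n_steps = 100.0, 0.01, 200   # rate per hour, steps of 0.01 hours, 2 hours in total

# Property 2: each increment X(t + dt) - X(t) ~ Po(lam * dt); property 3 says
# increments over disjoint steps are independent, so we can simply add them up.
increments = rng.poisson(lam * dt, size=n_steps)
path = np.concatenate(([0], np.cumsum(increments)))   # X(0) = 0, then running totals
times = np.arange(n_steps + 1) * dt

print(times[-1], path[-1])   # t = 2 and X(2): X(2) ~ Po(200), so typically near 200
```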

The Poisson process was discovered in the first decade of the 20th century, and the process was named after the distribution. Many people were working on similar things, so it’s difficult to say who discovered it first, but important work was done by the Swedish actuary and mathematician Filip Lundberg and the Danish engineer and mathematician AK Erlang.

Example 13.1 Claims arrive at an insurance company at a rate of \(\lambda = 8\) per hour, modelled as a Poisson process. What is the probability there are no claims in a given \(15\) minute period?

By property 2, the number of claims in \(15\) minutes has a Poisson distribution with mean \(\frac{15}{60}\lambda = 2\). The probability there are no claims is \[ \mathrm{e}^{-2} \frac{2^0}{0!} = \mathrm{e}^{-2} = 0.135 . \]
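If you prefer to check this on a computer, the calculation is a one-liner (a sketch assuming SciPy is available; not part of the notes):

```python
from scipy.stats import poisson

mean_claims = 8 * 15 / 60           # Po mean for a 15 minute period: lambda * s = 2
print(poisson.pmf(0, mean_claims))  # P(no claims) = e^(-2), approximately 0.135
```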

Example 13.2 A professor receives visitors to her office at a rate of \(\lambda = 2.5\) per day, modelled as a Poisson process. What is the probability she gets at least one visitor every day this (5-day) week?

The probability she gets at least one visitor on any given day is \[ 1 - \mathrm{e}^{-2.5} \frac{2.5^0}{0!} = 1 - \mathrm{e}^{-2.5} = 0.918. \] By property 3, the numbers of visitors on different days are independent, so the probability of getting at least one visitor each day this week is \(0.918^5 = 0.652\).
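The same calculation in SciPy, again just as an illustrative sketch rather than part of the notes:

```python
from scipy.stats import poisson

p_day = 1 - poisson.pmf(0, 2.5)   # P(at least one visitor on a given day), about 0.918
print(p_day, p_day ** 5)          # five independent days: about 0.652
```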

13.3 Summed and marked Poisson processes

The following theorem shows that the sum of two Poisson processes is itself a Poisson process.

Theorem 13.1 Let \((X(t))\) and \((Y(t))\) be independent Poisson processes with rates \(\lambda\) and \(\mu\) respectively. Then the process \((Z(t))\) given by \(Z(t) = X(t) + Y(t)\) is a Poisson process with rate \(\lambda+\mu\).

Proof. The proof of this is a question on Problem Sheet 7.
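While the proof is deferred, the theorem can at least be sanity-checked numerically. The sketch below is not part of the notes; it assumes NumPy and borrows the values \(\lambda = 4\), \(\mu = 2\) and \(t = \frac12\) from the example that follows. It simulates \(X(t)\) and \(Y(t)\) and checks that \(Z(t) = X(t) + Y(t)\) has mean and variance consistent with \(\operatorname{Po}((\lambda+\mu)t)\).

```python
import numpy as np

rng = np.random.default_rng(0)
lam, mu, t, reps = 4.0, 2.0, 0.5, 100_000   # illustrative values

x_t = rng.poisson(lam * t, size=reps)   # X(t) from the first process
y_t = rng.poisson(mu * t, size=reps)    # Y(t) from an independent second process
z_t = x_t + y_t                         # Z(t) = X(t) + Y(t)

# If (Z(t)) is a Poisson process with rate lam + mu, then Z(t) ~ Po((lam + mu) * t) = Po(3),
# so the sample mean and variance should both be close to 3.
print(z_t.mean(), z_t.var())
```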

Example 13.3 A student receives email to her university mail address at a rate of \(\lambda = 4\) emails per hour, and to her personal email address at a rate of \(\mu = 2\) per hour. Using a Poisson process model, what is the probability the student receives 3 or fewer emails in a 30 minute period?

The total number of emails is given by the sum of the two Poisson processes, which by Theorem 13.1 is a Poisson process with rate \(\lambda + \mu = 6\). The number of emails received in half an hour is therefore Poisson with mean \((\lambda + \mu)/2 = 3\). Thus the probability that 3 or fewer emails are received is \[ \mathrm{e}^{-3} \frac{3^0}{0!} + \mathrm{e}^{-3} \frac{3^1}{1!} + \mathrm{e}^{-3} \frac{3^2}{2!} + \mathrm{e}^{-3} \frac{3^3}{3!} = 13\mathrm{e}^{-3} = 0.647 . \]
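The sum above is just the cumulative distribution function of the Poisson distribution, so in SciPy (a sketch, not part of the notes) the calculation is:

```python
from scipy.stats import poisson

print(poisson.cdf(3, 3))   # P(3 or fewer emails in half an hour) = 13 e^(-3), about 0.647
```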

We also have the marked Poisson process, which can be thought of as the opposite to the summed process: the summed process combines two processes together, while the marked process splits one process into two.

Theorem 13.2 Let \((X(t))\) be a Poisson process with rate \(\lambda\). Each arrival is independently marked with probability \(p\). Then the marked process \((Y(t))\) is a Poisson process with rate \(p\lambda\), the unmarked process \((Z(t))\) is a Poisson process with rate \((1-p)\lambda\), and \((Y(t))\) and \((Z(t))\) are independent.

Proof. Given the third fact that you were “reminded” of, it’s easy to check the necessary properties.

Example 13.4 In the 2019/20 English Premier League football season, an average of \(\lambda = 2.72\) goals were scored per game, with a proportion \(p = 0.56\) of them scored by the home team. If we model this as a Poisson process, what is the probability a match ends in a 1–1 draw?

The number of home goals scored is Poisson with rate \(p\lambda = 0.56\times 2.72 = 1.52\), and the number of away goals scored is Poisson with rate \((1-p)\lambda = 1.20\). Under the Poisson process assumption, these are independent, so the probability the home and the away team both score \(1\) goal is \[ \mathrm{e}^{-1.52} \frac{1.52^1}{1!} \times \mathrm{e}^{-1.20} \frac{1.20^1}{1!} = 1.52\mathrm{e}^{-1.52} \times 1.20 \mathrm{e}^{-1.20} = 0.12, \] or 12%.
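The same calculation in SciPy (again an illustrative sketch, not part of the notes):

```python
from scipy.stats import poisson

lam, p = 2.72, 0.56
prob_1_1 = poisson.pmf(1, p * lam) * poisson.pmf(1, (1 - p) * lam)
print(prob_1_1)   # about 0.12
```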

In the next section, we look at the Poisson process in a different way: the times in between arrivals have an exponential distribution.