Problem Sheet 8
You should attempt all these questions and write up your solutions in advance of your workshop in week 9 (Monday 22 or Tuesday 23 March), where the answers will be discussed.
1. Let \((X(t))\) be a Poisson process with rate \(\lambda\).
(a) Fix \(n\). What is the expected time between the \(n\)th arrival and the \((n+1)\)th arrival?
Solution. The waiting time between consecutive arrivals is \(\operatorname{Exp}(\lambda)\), with mean \(1/\lambda\).
(b) Fix \(t\). What is the expected time between the previous arrival before \(t\) and the next arrival after \(t\)?
Solution. The time between the arrivals is \(S + T\), where \(S\) is the time backwards from \(t\) since the previous arrival, and \(T\) is the time forwards from \(t\) until the next arrival. By the memoryless property of the exponential distribution, \(T \sim \operatorname{Exp}(\lambda)\), with mean \(1/\lambda\). What about \(S\)? We have \[ \mathbb P(S > s) = \mathbb P\big(\text{no arrivals in }[t-s,t]\big) = \mathrm{e}^{-\lambda s} \frac{(\lambda s)^0}{0!} = \mathrm{e}^{-\lambda s} , \] which is itself the tail probability of an \(\operatorname{Exp}(\lambda)\) distribution, so \(S \sim \operatorname{Exp}(\lambda)\) with mean \(1/\lambda\) too. Hence \[ \mathbb E(S + T) = \frac1\lambda + \frac1\lambda = \frac2\lambda .\]
(c) Your answers to the previous two questions should be different. Explain why one should expect the second answer to be bigger than the first.
Solution. Imagine the arrivals being placed on the real line – some with big gaps between them, and some with small gaps between them. If we then place the time \(t\), it is more likely to be placed in one of the large gaps than one of the small ones, because each large gap fills up more of the line than a small one. Hence the size of the gap surrounding the given point \(t\) is “size-biased”, and is larger than a uniformly randomly chosen gap.
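This "inspection paradox" is easy to check numerically. The following Python sketch (using NumPy; the values \(\lambda = 2\) and \(t = 10\) are arbitrary choices, not from the question) simulates many copies of the process and compares the average gap straddling \(t\) with the average inter-arrival time \(1/\lambda\):

```python
import numpy as np

rng = np.random.default_rng(1)
lam, t, reps = 2.0, 10.0, 50_000

gaps = []
for _ in range(reps):
    # Arrival times as cumulative sums of Exp(lam) inter-arrival times;
    # 200 arrivals is comfortably enough to get past t = 10.
    arrivals = np.cumsum(rng.exponential(1 / lam, size=200))
    before = arrivals[arrivals <= t]
    prev = before[-1] if len(before) > 0 else 0.0  # negligible-probability fallback
    nxt = arrivals[arrivals > t][0]
    gaps.append(nxt - prev)

print("mean gap straddling t:", np.mean(gaps))  # ≈ 2/lam = 1.0
print("mean inter-arrival gap:", 1 / lam)       # = 0.5
```

The simulated straddling gap comes out close to \(2/\lambda\), the size-biased answer from (b), while a uniformly chosen gap has mean \(1/\lambda\).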
2. Let \((X(t))\) be a Poisson process with rate \(\lambda\), and mark each arrival independently with probability \(p\). Use the infinitesimals definition to show that the marked process is a Poisson process with rate \(p\lambda\).
Solution. Let \((Y(t))\) be the marked process. Consider an infinitesimal increment \(Y(t + \tau) - Y(t)\) as \(\tau \to 0\). We have one arrival in \((Y(t))\) if there was one arrival for \((X(t))\) and it was marked – all other possibilities rely on extra arrivals in \((X(t))\), so have lower order probability – giving \[ \mathbb P \big( Y(t + \tau) - Y(t) = 1\big) = p\lambda \tau + o (\tau) . \] We have no arrivals in \((Y(t))\) if there were no arrivals in \((X(t))\) or there was one arrival that was unmarked, up to lower order terms, giving \[ \mathbb P \big( Y(t + \tau) - Y(t) = 0 \big) = 1 - \lambda \tau + (1-p)\lambda \tau + o(\tau) = 1 - p\lambda \tau + o(\tau) . \] Larger increments in \((Y(t))\) require larger increments in \((X(t))\), so \[ \mathbb P \big( Y(t + \tau) - Y(t) \geq 2 \big) = o(\tau) . \] Since we have \(Y(0) = 0\) and independent increments, we have written down precisely the infinitesimal increments definition of a Poisson process with rate \(p\lambda\).
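A quick numerical check of this thinning result (a Python sketch; the parameters \(\lambda = 3\), \(p = 0.4\), \(t = 5\) are arbitrary): since the marking is independent, conditional on \(X(t) = n\) the marked count is Binomial\((n, p)\), so we can simulate it directly and compare against a Poisson with mean \(p\lambda t\).

```python
import numpy as np

rng = np.random.default_rng(2)
lam, p, t, reps = 3.0, 0.4, 5.0, 100_000

# X(t) ~ Poisson(lam * t); each arrival is marked independently with
# probability p, so the marked count is Binomial(X(t), p).
x = rng.poisson(lam * t, size=reps)
y = rng.binomial(x, p)

print("sample mean:", y.mean())      # ≈ p * lam * t = 6.0
print("sample variance:", y.var())   # ≈ 6.0 too, as befits a Poisson
```

Equal sample mean and variance is exactly what we expect if \(Y(t) \sim \text{Poisson}(p\lambda t)\).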
3. Let \((X(t))\) be a simple birth process with rates \(\lambda_j = \lambda j\) starting from \(X(0)=1\). Let \(p_j(t) = \mathbb P(X(t) = j)\).
(a) Write down the Kolmogorov forward equations for \(p_j(t)\). You should have separate equations for \(j = 1\) and \(j \geq 2\). Remember to include the initial conditions \(p_j(0)\).
Solution. From Section 16.1 of the notes, and putting \(\lambda_j = \lambda j\), the forward equations are \[\begin{align*} j &= 1\colon & p'_1(t) &= -\lambda p_1(t) & p_1(0) &= 1 \\ j &\geq 2 \colon & p'_j(t) &= -\lambda j p_j(t) + \lambda (j-1)p_{j-1}(t) & p_j(0) &= 0 \end{align*}\]
(b) Show that \(X(t)\) follows a geometric distribution \(X(t) \sim \text{Geom} (\mathrm{e}^{-\lambda t})\). That is, show that \[ p_j(t) = \big(1 - \mathrm{e}^{-\lambda t}\big)^{j-1}\mathrm{e}^{-\lambda t} \] satisfies the forward equations.
Solution. For \(j = 1\), we have \(p_1(t) = \mathrm{e}^{-\lambda t}\). This clearly has \(p_1(0) = \mathrm{e}^0 = 1\), and \(p'_1(t) = -\lambda \mathrm{e}^{-\lambda t} = -\lambda p_1(t)\), as required.
Now consider \(j \geq 2\). We indeed have \(p_j(0) = 0\). By the product rule, the derivative on the left-hand side is \[\begin{align*} p'_j(t) &= -\lambda \big(1 - \mathrm{e}^{-\lambda t}\big)^{j-1}\mathrm{e}^{-\lambda t} + \lambda \mathrm{e}^{-\lambda t} (j-1) \big(1 - \mathrm{e}^{-\lambda t}\big)^{j-2}\mathrm{e}^{-\lambda t} , \\ &= -\lambda \big(1 - \mathrm{e}^{-\lambda t}\big)^{j-2}\mathrm{e}^{-\lambda t} \big(1 - \mathrm{e}^{-\lambda t} -\mathrm{e}^{-\lambda t} (j-1) \big) \\ &=- \lambda \big(1 - \mathrm{e}^{-\lambda t}\big)^{j-2}\mathrm{e}^{-\lambda t} \big(1 - j \mathrm{e}^{-\lambda t} \big). \end{align*}\] The right-hand side is \[\begin{align*} -\lambda j p_j(t) + {}&\lambda (j-1)p_{j-1}(t) \\ &= - \lambda j \big(1 - \mathrm{e}^{-\lambda t}\big)^{j-1}\mathrm{e}^{-\lambda t} + \lambda (j-1) \big(1 - \mathrm{e}^{-\lambda t}\big)^{j-2}\mathrm{e}^{-\lambda t} \\ &= -\lambda\big(1 - \mathrm{e}^{-\lambda t}\big)^{j-2}\mathrm{e}^{-\lambda t}\big(j(1 - \mathrm{e}^{-\lambda t}) - (j-1)\big) \\ &= - \lambda \big(1 - \mathrm{e}^{-\lambda t}\big)^{j-2}\mathrm{e}^{-\lambda t} \big(1 - j \mathrm{e}^{-\lambda t} \big) . \end{align*}\] These are indeed equal.
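The verification in (b) can also be confirmed numerically by integrating the forward equations with Euler steps and comparing against the geometric formula. In the Python sketch below, \(\lambda = 1\), \(t = 2\), the step size and the truncation level \(j \leq 50\) are all arbitrary choices; truncating the system is harmless because \(p'_j\) involves only \(p_j\) and \(p_{j-1}\).

```python
import numpy as np

lam, jmax, dt, t_end = 1.0, 50, 1e-4, 2.0
j = np.arange(jmax + 1)  # states 0..jmax (state 0 is unreachable here)
p = np.zeros(jmax + 1)
p[1] = 1.0               # initial condition p_1(0) = 1

for _ in range(round(t_end / dt)):    # forward Euler on the forward equations
    dp = -lam * j * p                 # outflow:  -lam * j * p_j
    dp[1:] += lam * j[:-1] * p[:-1]   # inflow:   +lam * (j-1) * p_{j-1}
    p += dt * dp

theta = np.exp(-lam * t_end)
geom = np.zeros(jmax + 1)
geom[1:] = (1 - theta) ** (j[1:] - 1) * theta  # Geom(e^{-lam t}) pmf

print("max abs error:", np.abs(p - geom).max())  # small (of order dt)
```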
(c) Hence, calculate \(\mathbb E X(t)\), the expected population size at time \(t\).
Solution. The expectation of a \(\text{Geom}(\theta)\) random variable is \(1/\theta\), so \(\mathbb E X(t) = 1/\mathrm{e}^{-\lambda t} = \mathrm{e}^{\lambda t}\).
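The exponential growth rate \(\mathrm{e}^{\lambda t}\) can likewise be checked by simulating the process directly from its holding times (a Python sketch; \(\lambda = 1\) and \(t = 2\) are arbitrary, giving \(\mathbb E X(t) = \mathrm e^{2} \approx 7.39\)):

```python
import numpy as np

rng = np.random.default_rng(3)
lam, t, reps = 1.0, 2.0, 50_000

sizes = []
for _ in range(reps):
    n, clock = 1, 0.0   # start from a single individual at time 0
    while True:
        clock += rng.exponential(1 / (lam * n))  # holding time Exp(lam * n)
        if clock > t:
            break
        n += 1          # a birth occurred before time t
    sizes.append(n)

print("sample mean of X(t):", np.mean(sizes))  # ≈ e^{lam * t} ≈ 7.39
```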
4. Let \((X(t))\) be a simple birth process with rates \(\lambda_n = \lambda n\) starting from \(X(0)=1\). Let \(T_n \sim \operatorname{Exp}(\lambda n)\) be the \(n\)th holding time, and let \(J_n = T_1 + T_2 +\cdots+T_n\) be the time of the \(n\)th birth.
(a) Write down \(\mathbb E T_n\) and \(\operatorname{Var}(T_n)\).
Solution. By standard results about the exponential distribution, \(\mathbb E T_n = 1/(\lambda n)\) and \(\operatorname{Var}(T_n) = 1/(\lambda n)^2 = 1/(\lambda^2 n^2)\).
(b) Show that, as \(n \to \infty\), the expectation \(\mathbb E J_n\) tends to infinity, but the variance \(\operatorname{Var}(J_n)\) is bounded.
Solution. By linearity of expectation, \[ \mathbb E J_n = \sum_{j=1}^n \mathbb E T_j = \sum_{j=1}^n \frac{1}{\lambda j} = \frac{1}{\lambda} \sum_{j=1}^n \frac{1}j . \] Since the holding times are independent, \[ \operatorname{Var}(J_n) = \sum_{j=1}^n \operatorname{Var}(T_j) = \sum_{j=1}^n \frac{1}{\lambda^2 j^2} = \frac{1}{\lambda^2} \sum_{j=1}^n \frac{1}{j^2} . \]
The harmonic series \(\sum_j 1/j\) diverges, so \(\mathbb E J_n\) tends to infinity. The series \(\sum_j 1/j^2\), on the other hand, converges (to \(\pi^2/6\), although you don’t need to know that), so \(\operatorname{Var}(J_n)\) is bounded (by \(\pi^2/(6\lambda^2)\)).
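The contrast is visible even at modest \(n\): with \(\lambda = 1\) (an arbitrary choice), the partial sums for \(\mathbb E J_n\) keep growing like \(\log n\), while those for \(\operatorname{Var}(J_n)\) settle down near \(\pi^2/6 \approx 1.645\). A short Python sketch:

```python
import numpy as np

lam = 1.0
for n in (10, 100, 1_000, 10_000):
    j = np.arange(1, n + 1)
    mean_Jn = np.sum(1 / (lam * j))       # E J_n: grows like log(n) / lam
    var_Jn = np.sum(1 / (lam * j) ** 2)   # Var J_n: bounded, converging sum
    print(f"n={n:>6}  E J_n ≈ {mean_Jn:.3f}  Var J_n ≈ {var_Jn:.4f}")

print("pi^2 / 6 =", np.pi ** 2 / 6)       # ≈ 1.6449, the limiting variance
```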
5. The number of phonecalls my office receives in a three-hour period is modelled as a time-inhomogeneous Poisson process with rate function \[ \lambda(t) = \begin{cases} 3t & 0 \leq t < 1 \\ 3 & 1 \leq t < 2 \\ 9 - 3t & 2 \leq t \leq 3 \end{cases}. \]
(a) Calculate the probability I receive exactly one phonecall (i) in the first hour; (ii) in the second hour; (iii) in the third hour.
Solution. (i) The number of phonecalls in the first hour is Poisson with rate \[ \int_0^1 3t\,\mathrm dt = \left[\tfrac32 t^2 \right]_0^1 = \tfrac32 . \] The probability there is exactly one phonecall in this time is \(\frac32 \mathrm{e}^{-3/2} \approx 0.335\).
(ii) The number of phonecalls in the second hour is Poisson with rate \(3\). The probability there is exactly one phonecall in this time is \(3 \mathrm{e}^{-3} \approx 0.149\).
(iii) The number of phonecalls in the third hour is Poisson with rate \[ \int_2^3 (9-3t)\,\mathrm dt = \left[9t - \tfrac32 t^2 \right]_2^3 = \tfrac{27}{2} - 12 = \tfrac32 . \] The probability there is exactly one phonecall in this time is \(\frac32 \mathrm{e}^{-3/2} \approx 0.335\).
(b) Calculate the probability I receive exactly \(3\) phonecalls over the three-hour period.
Solution. The number of phonecalls over the three-hour period is Poisson with rate \[ \int_0^3 \lambda(t)\,\mathrm dt = \tfrac32 + 3 + \tfrac32 = 6 . \] The probability there are exactly three calls is \[ \frac{6^3}{3!} \mathrm{e}^{-6} = 36 \mathrm{e}^{-6} \approx 0.089. \]
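Both parts can be sanity-checked with the standard thinning construction for inhomogeneous processes: simulate a homogeneous process at the bounding rate \(\lambda_{\max} = 3 \geq \lambda(t)\), then keep an arrival at time \(s\) independently with probability \(\lambda(s)/3\). A Python sketch (Monte Carlo estimates, so the numbers match the answers above only approximately):

```python
import numpy as np

rng = np.random.default_rng(4)
reps, lam_max = 100_000, 3.0

def rate(s):
    """The piecewise rate function lambda(s) from the question."""
    if s < 1:
        return 3 * s
    if s < 2:
        return 3.0
    return 9 - 3 * s

one_per_hour = np.zeros(3)  # running counts of {exactly one call in hour k}
three_total = 0             # running count of {exactly three calls overall}

for _ in range(reps):
    # Homogeneous rate-3 process on [0, 3]: Poisson count, uniform arrival times.
    n = rng.poisson(lam_max * 3)
    times = rng.uniform(0, 3, size=n)
    # Thinning: keep the arrival at time s with probability rate(s) / lam_max.
    keep = rng.uniform(0, lam_max, size=n) < np.array([rate(s) for s in times])
    counts, _ = np.histogram(times[keep], bins=[0, 1, 2, 3])
    one_per_hour += (counts == 1)
    three_total += (counts.sum() == 3)

print("P(exactly one call in each hour):", one_per_hour / reps)  # ≈ 0.335, 0.149, 0.335
print("P(exactly three calls overall): ", three_total / reps)    # ≈ 0.089
```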