Poisson Process

Experiment 1: Emergence of the Poisson Process from Bernoulli Trials

1.1 Overview

The Poisson distribution arises naturally as the limit of the Binomial distribution when the number of trials becomes large and the success probability becomes small, with the expected number of successes held constant. This limit is a key idea in stochastic processes and underlies models of real-world event streams such as radioactive decays, incoming telephone calls, and arrivals to a queue.

1.2 Formal Setting

Let $X_n \sim \text{Bin}(n, p)$ denote a Binomial random variable, where:

  • $n$ is the number of independent Bernoulli trials.
  • $p$ is the probability of success in each trial.
  • $\lambda = np$ is the expected number of successes.

We are interested in the behavior of $X_n$ as $n \to \infty$ and $p \to 0$, with $\lambda = np$ held constant.

1.3 Poisson Limit Theorem (Rigorous Proof)

Let $X_n \sim \text{Bin}(n, \lambda/n)$. Then:

\lim_{n \to \infty} \mathbb{P}(X_n = k) = \frac{e^{-\lambda} \lambda^k}{k!}, \quad \text{for all } k \in \mathbb{N}_0.

Proof:

\begin{aligned} \mathbb{P}(X_n = k) &= \binom{n}{k} \left(\frac{\lambda}{n}\right)^k \left(1 - \frac{\lambda}{n}\right)^{n - k} \\ &= \frac{n(n-1)\dots(n-k+1)}{k!} \cdot \left(\frac{\lambda^k}{n^k}\right) \cdot \left(1 - \frac{\lambda}{n}\right)^n \cdot \left(1 - \frac{\lambda}{n}\right)^{-k}. \end{aligned}

Using the limits:

  • $\frac{n(n-1)\dots(n-k+1)}{n^k} \to 1$,
  • $\left(1 - \frac{\lambda}{n}\right)^n \to e^{-\lambda}$,
  • $\left(1 - \frac{\lambda}{n}\right)^{-k} \to 1$,

the expression converges to:

\frac{\lambda^k e^{-\lambda}}{k!}.

This proves convergence in distribution of $X_n$ to the $\text{Poisson}(\lambda)$ distribution.
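The limit can be checked numerically. The following Python sketch (parameter choices are illustrative, not from the text) compares the $\text{Bin}(n, \lambda/n)$ PMF to the $\text{Poisson}(\lambda)$ PMF as $n$ grows:

```python
import math

def binom_pmf(n, p, k):
    # P(X = k) for X ~ Bin(n, p)
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(lam, k):
    # P(X = k) for X ~ Poisson(lam)
    return math.exp(-lam) * lam**k / math.factorial(k)

lam = 3.0
for n in (10, 100, 10_000):
    # Largest pointwise gap between Bin(n, lam/n) and Poisson(lam) over k = 0..19
    gap = max(abs(binom_pmf(n, lam / n, k) - poisson_pmf(lam, k)) for k in range(20))
    print(f"n = {n:6d}: max pointwise gap = {gap:.2e}")
```

The printed gap shrinks as $n$ increases, in line with the theorem.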


Experiment 2: Merging and Splitting of Poisson Processes

2.1 Merging of Independent Poisson Processes

Let $N_1(t) \sim \text{Poisson}(\lambda_1 t)$ and $N_2(t) \sim \text{Poisson}(\lambda_2 t)$, and assume independence. Define $N(t) = N_1(t) + N_2(t)$.

Theorem: Superposition

$N(t) \sim \text{Poisson}((\lambda_1 + \lambda_2)t)$.

Proof: Let $X \sim \text{Poisson}(\lambda_1 t)$ and $Y \sim \text{Poisson}(\lambda_2 t)$ be independent. Then:

\mathbb{P}(X + Y = k) = \sum_{i=0}^{k} \mathbb{P}(X = i)\, \mathbb{P}(Y = k - i)

= e^{-(\lambda_1 + \lambda_2)t} \sum_{i=0}^{k} \frac{(\lambda_1 t)^i}{i!} \cdot \frac{(\lambda_2 t)^{k - i}}{(k - i)!}

= \frac{e^{-(\lambda_1 + \lambda_2)t}}{k!} \sum_{i=0}^{k} \binom{k}{i} (\lambda_1 t)^i (\lambda_2 t)^{k - i} = \frac{e^{-(\lambda_1 + \lambda_2)t}}{k!} (\lambda_1 t + \lambda_2 t)^k

= \frac{(\lambda_1 + \lambda_2)^k t^k}{k!} e^{-(\lambda_1 + \lambda_2)t}.

This is the PMF of a $\text{Poisson}((\lambda_1 + \lambda_2)t)$ random variable, so the merged process is a Poisson process with rate $\lambda_1 + \lambda_2$.
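The convolution identity at the heart of this proof can be verified directly. A short Python sketch (the rates and time horizon are arbitrary choices):

```python
import math

def poisson_pmf(mu, k):
    # P(X = k) for X ~ Poisson(mu)
    return math.exp(-mu) * mu**k / math.factorial(k)

lam1, lam2, t = 2.0, 5.0, 1.5
for k in range(12):
    # Convolution of the two Poisson PMFs, exactly as in the proof above
    conv = sum(poisson_pmf(lam1 * t, i) * poisson_pmf(lam2 * t, k - i)
               for i in range(k + 1))
    merged = poisson_pmf((lam1 + lam2) * t, k)
    assert abs(conv - merged) < 1e-12, (k, conv, merged)
print("superposition identity holds for k = 0..11")
```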

2.2 Splitting (Thinning) of a Poisson Process

Let $N(t) \sim \text{Poisson}(\lambda t)$, and suppose each event is assigned independently to stream 1 with probability $p$ and to stream 2 with probability $1 - p$.

Define $N_1(t)$ and $N_2(t)$ to be the number of events in each stream.

Theorem: Thinning

$N_1(t) \sim \text{Poisson}(p \lambda t)$ and $N_2(t) \sim \text{Poisson}((1 - p)\lambda t)$, and $N_1, N_2$ are independent.

Proof: Given $N(t) = n$, the number of events in stream 1 is $\text{Bin}(n, p)$.

By total probability:

\mathbb{P}(N_1(t) = k) = \sum_{n = k}^{\infty} \mathbb{P}(N(t) = n) \binom{n}{k} p^k (1 - p)^{n - k}

= \sum_{n = k}^{\infty} \frac{(\lambda t)^n e^{-\lambda t}}{n!} \cdot \binom{n}{k} p^k (1 - p)^{n - k}.

Rewriting:

= \frac{(p \lambda t)^k e^{-\lambda t}}{k!} \sum_{n = k}^{\infty} \frac{((1 - p) \lambda t)^{n - k}}{(n - k)!} = \frac{(p \lambda t)^k e^{-\lambda t}}{k!} \, e^{(1 - p) \lambda t} = \frac{(p \lambda t)^k e^{-p \lambda t}}{k!}

This proves $N_1(t) \sim \text{Poisson}(p \lambda t)$; the symmetric argument gives $N_2(t) \sim \text{Poisson}((1 - p)\lambda t)$. For independence, the analogous joint computation conditioning on $N(t) = k + m$ shows that $\mathbb{P}(N_1(t) = k,\, N_2(t) = m)$ factorizes into the product of the two marginal Poisson PMFs.
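Thinning is easy to check by simulation. The sketch below draws $N(t)$ with Knuth's product-of-uniforms Poisson sampler, splits each event with probability $p$, and compares the empirical stream means to $p\lambda t$ and $(1-p)\lambda t$ (all parameter values and the seed are arbitrary choices):

```python
import math
import random

random.seed(42)

def sample_poisson(mu):
    # Knuth's algorithm: count uniforms until their product drops below e^{-mu}
    limit, k, prod = math.exp(-mu), 0, 1.0
    while True:
        prod *= random.random()
        if prod < limit:
            return k
        k += 1

lam, t, p, trials = 4.0, 2.0, 0.3, 100_000
s1 = s2 = 0
for _ in range(trials):
    n = sample_poisson(lam * t)                     # total events in [0, t]
    k = sum(random.random() < p for _ in range(n))  # thin each event with prob p
    s1 += k
    s2 += n - k

print("stream 1 mean:", s1 / trials)  # should be close to p * lam * t = 2.4
print("stream 2 mean:", s2 / trials)  # should be close to (1-p) * lam * t = 5.6
```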

Note that in the experiments that follow, we will modify only $\lambda_1 + \lambda_2$.


Experiment 3: Interarrival Times in a Poisson Process

Let $T_1, T_2, \dots$ be the arrival times of a homogeneous Poisson process with rate $\lambda$. Define the interarrival times $S_n = T_n - T_{n - 1}$ (with $T_0 = 0$).

Theorem: Exponential Interarrival Times

Each $S_n \sim \text{Exp}(\lambda)$, and the $S_n$ are i.i.d.

Proof: The probability that no event occurs in $[0, t]$ is:

\mathbb{P}(S_1 > t) = \mathbb{P}(N(t) = 0) = e^{-\lambda t}

So the CDF of $S_1$ is:

F_{S_1}(t) = 1 - e^{-\lambda t} \;\Rightarrow\; f_{S_1}(t) = \lambda e^{-\lambda t}, \quad t \ge 0

By the strong Markov property, the process restarted at the first arrival time is again a Poisson process with rate $\lambda$, independent of $S_1$ (this is the memoryless property). Hence $S_2, S_3, \dots \sim \text{Exp}(\lambda)$, i.i.d.
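Conversely, the exponential characterization gives a direct way to simulate the process: keep adding $\text{Exp}(\lambda)$ gaps until the horizon $t$ is passed. The Python sketch below (parameters and seed are arbitrary) checks two consequences: the fraction of runs with no arrival in $[0, t]$ matches $e^{-\lambda t}$, and the mean count matches $\lambda t$:

```python
import math
import random

random.seed(7)
lam, t, trials = 2.0, 1.0, 100_000

zero_runs = 0
total_count = 0
for _ in range(trials):
    # Build one realization on [0, t] from i.i.d. Exp(lam) interarrival times
    arrival, count = 0.0, 0
    while True:
        arrival += random.expovariate(lam)  # S_n ~ Exp(lam)
        if arrival > t:
            break
        count += 1
    zero_runs += (count == 0)
    total_count += count

print("P(N(t)=0) empirical:", zero_runs / trials, " theory:", math.exp(-lam * t))
print("E[N(t)]   empirical:", total_count / trials, " theory:", lam * t)
```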