Poisson processes model random events occurring over time or space, and they're one of the most important tools in actuarial mathematics. They give actuaries a rigorous framework for predicting claim frequencies, calculating premiums, and managing reserves. By understanding how events arrive and how long you wait between them, you can build models that drive real pricing and risk decisions.
Poisson process fundamentals
A Poisson process is a counting process that tracks how many events occur over a given interval of time (or space). It's built on a small set of assumptions that make the math tractable while still capturing the behavior of many real-world phenomena.
Definition of Poisson process
The process is defined by a single rate parameter $\lambda$, which represents the average number of events per unit time. If you observe the process over an interval of length $t$, the number of events $N(t)$ follows a Poisson distribution with mean $\lambda t$:
$$P(N(t) = k) = \frac{(\lambda t)^k e^{-\lambda t}}{k!}, \qquad k = 0, 1, 2, \ldots$$
A key structural feature: the number of events in non-overlapping intervals are independent random variables. So what happens between time 0 and time 5 tells you nothing about what happens between time 5 and time 10.
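A minimal numeric sketch of this counting distribution (the rate of 2 events per day and the 3-day window are hypothetical):

```python
from math import exp, factorial

def poisson_pmf(k: int, lam: float, t: float = 1.0) -> float:
    """P(N(t) = k) for a homogeneous Poisson process with rate lam."""
    mean = lam * t
    return mean ** k * exp(-mean) / factorial(k)

# Hypothetical: 2 events per day observed over 3 days (mean = 6).
p_exactly_4 = poisson_pmf(4, lam=2.0, t=3.0)
p_at_most_2 = sum(poisson_pmf(k, lam=2.0, t=3.0) for k in range(3))
```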
Assumptions and properties
A counting process qualifies as a Poisson process when it satisfies three conditions:
- No simultaneous events. Events occur one at a time. The probability of two or more events in an infinitesimally small interval is essentially zero.
- Independent increments. The number of events in non-overlapping time intervals are independent of each other.
- Rate proportionality. The probability of exactly one event in a tiny interval of length $h$ is approximately $\lambda h$.
The process also has stationary increments: the distribution of the number of events in any interval depends only on the interval's length, not on where it sits on the time axis. An interval from time $s$ to $s + t$ has the same distribution as one from $u$ to $u + t$.
Memoryless property
The memoryless property means the future of the process doesn't depend on its past. Formally, if $T$ is the waiting time until the next event:
$$P(T > s + t \mid T > s) = P(T > t)$$
In practical terms, if you're waiting for the next insurance claim and 3 days have already passed, the expected additional waiting time is the same as it was at the start. The process doesn't "remember" how long it's been. This property flows directly from the independent and stationary increments assumptions.
Relationship to exponential distribution
The time between consecutive events (the inter-arrival time) in a Poisson process follows an exponential distribution with rate $\lambda$. Its density is:
$$f(t) = \lambda e^{-\lambda t}, \qquad t \ge 0$$
This connection is fundamental. The Poisson process and the exponential distribution are two sides of the same coin: one describes the count of events in an interval, the other describes the waiting time between events. Many derivations in actuarial math exploit this duality.
Arrival times in Poisson processes
Understanding when events occur, not just how many occur, is central to actuarial modeling. Arrival time analysis lets you estimate expected waiting times between claims and forecast when future events are likely.
Inter-arrival times
Inter-arrival times are the gaps between consecutive events. In a homogeneous Poisson process with rate $\lambda$:
- The inter-arrival times are independent and identically distributed (i.i.d.) exponential random variables with rate $\lambda$.
- The mean inter-arrival time is $1/\lambda$. If claims arrive at a rate of 5 per month, the average time between claims is $1/5$ of a month, or about 6 days.
- The variance of each inter-arrival time is $1/\lambda^2$.
Exponential distribution of inter-arrival times
The full distributional details for an inter-arrival time $T$:
- PDF: $f(t) = \lambda e^{-\lambda t}$ for $t \ge 0$
- CDF: $F(t) = P(T \le t) = 1 - e^{-\lambda t}$ for $t \ge 0$
- Survival function: $S(t) = P(T > t) = e^{-\lambda t}$
The memoryless property of the exponential distribution states that $P(T > s + t \mid T > s) = P(T > t)$. If you've already waited $s$ units without an event, the remaining wait time has the same distribution as if you'd just started waiting. The exponential distribution is the only continuous distribution with this property.
Probability of arrivals in time intervals
The number of events $N$ in any interval of length $t$ follows the Poisson PMF:
$$P(N = k) = \frac{(\lambda t)^k e^{-\lambda t}}{k!}, \qquad k = 0, 1, 2, \ldots$$
- Expected count: $E[N] = \lambda t$
- Variance: $\mathrm{Var}(N) = \lambda t$
The fact that the mean equals the variance is a signature property of the Poisson distribution. If you observe data where the variance significantly exceeds the mean (overdispersion), a standard Poisson process may not be the right model.
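A quick dispersion check along these lines, using made-up monthly claim counts:

```python
from statistics import mean, variance

def dispersion_index(counts):
    """Sample variance-to-mean ratio; values near 1 are consistent with Poisson."""
    return variance(counts) / mean(counts)

# Hypothetical monthly claim counts.
steady_counts = [4, 6, 5, 7, 5, 4, 6, 5]
bursty_counts = [1, 12, 0, 15, 2, 14, 1, 13]

# A ratio well above 1 signals overdispersion, suggesting a mixed Poisson
# (e.g. negative binomial) model instead of a plain Poisson.
```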
Conditional probabilities of arrivals
Because of independent increments, conditioning on the past doesn't change the distribution of future counts:
$$P\big(N(s + t) - N(s) = k \mid N(u),\ u \le s\big) = \frac{(\lambda t)^k e^{-\lambda t}}{k!}$$
The number of events in $(s, s + t]$ is independent of whatever happened in $[0, s]$. This simplifies conditional calculations enormously and is one reason Poisson processes are so analytically convenient.
Poisson process applications
Modeling rare events
Poisson processes are a natural fit for rare events: natural disasters, industrial accidents, extreme financial losses. The key requirements are met when events occur independently, one at a time, and at a roughly constant rate. For example, if a region experiences an average of 2.3 significant earthquakes per year, you can model the count of earthquakes in any time window using a Poisson distribution with $\lambda = 2.3$ per year.
Insurance claims modeling
This is the bread-and-butter actuarial application. The standard approach separates frequency from severity:
- Frequency: The number of claims per period is modeled as a Poisson random variable. The rate $\lambda$ is estimated from historical claims data.
- Severity: The size of each individual claim is modeled separately, often using lognormal, Pareto, or gamma distributions.
For example, a large auto insurer might observe an average of 850 claims per month across a portfolio. The monthly claim count would be modeled as $N \sim \mathrm{Poisson}(850)$, with individual claim amounts modeled independently.
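A sketch of the frequency-severity split; the rate of 3 claims per month and the lognormal parameters are invented, and Knuth's sampling method below suits modest rates rather than an 850-claim portfolio:

```python
import math
import random

def poisson_draw(rng: random.Random, lam: float) -> int:
    """Knuth's method: count uniforms until their product drops below exp(-lam)."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        k += 1
        p *= rng.random()
        if p <= limit:
            return k - 1

def monthly_aggregate_loss(rng, lam, mu_log, sigma_log):
    """One month's loss: Poisson frequency, lognormal severity."""
    n_claims = poisson_draw(rng, lam)
    return sum(rng.lognormvariate(mu_log, sigma_log) for _ in range(n_claims))

rng = random.Random(7)
losses = [monthly_aggregate_loss(rng, 3.0, 8.0, 0.5) for _ in range(1000)]
```

The average of `losses` should land near $\lambda\, E[X] = 3 \exp(8 + 0.5^2/2) \approx 10{,}100$ per month.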
Queueing theory applications
Poisson arrivals are a foundational assumption in queueing theory. The classic M/M/1 queue assumes Poisson arrivals (rate $\lambda$) and exponential service times (rate $\mu$). From these assumptions, you can derive:
- Average number of customers in the system: $L = \dfrac{\lambda}{\mu - \lambda}$
- Average time a customer spends in the system: $W = \dfrac{1}{\mu - \lambda}$
These results apply to call centers, hospital emergency departments, and any system where customers arrive randomly and wait for service.
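The two M/M/1 formulas in code, taking the waiting time as the average time a customer spends in the system so that Little's law $L = \lambda W$ holds; the call-center rates are hypothetical:

```python
def mm1_metrics(lam: float, mu: float):
    """Steady-state M/M/1 averages; requires lam < mu for stability."""
    if lam >= mu:
        raise ValueError("unstable queue: arrival rate must be below service rate")
    L = lam / (mu - lam)   # average number of customers in the system
    W = 1.0 / (mu - lam)   # average time in the system (Little's law: L = lam * W)
    return L, W

# Hypothetical call center: 8 arrivals/hour, service capacity 10/hour.
L, W = mm1_metrics(lam=8.0, mu=10.0)  # L = 4.0 customers, W = 0.5 hours
```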

Reliability engineering
When modeling component failures, the time between failures is often assumed to follow an exponential distribution, which corresponds to a Poisson process for the failure count. The mean time between failures (MTBF) is $1/\lambda$. If a machine fails on average once every 500 hours, $\lambda = 1/500 = 0.002$ per hour, and you can calculate the probability of surviving any given operating period using the exponential survival function.
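The survival calculation as a one-liner, using the 500-hour MTBF from the example:

```python
import math

def survival_prob(rate: float, t: float) -> float:
    """P(no failure in [0, t]) = exp(-rate * t) for exponential lifetimes."""
    return math.exp(-rate * t)

# Failure on average every 500 hours: rate = 1/500 = 0.002 per hour.
p_1000h = survival_prob(0.002, 1000.0)  # exp(-2), about 0.135
```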
Poisson process variations
The standard homogeneous Poisson process is elegant but restrictive. Several generalizations relax its assumptions to handle more realistic scenarios.
Non-homogeneous Poisson processes
A non-homogeneous Poisson process (NHPP) allows the rate to vary over time through an intensity function $\lambda(t)$. The expected number of events in the interval $[a, b]$ is:
$$E[N(a, b)] = \int_a^b \lambda(t)\, dt$$
NHPPs are useful when event rates change predictably. A retail store might see customer arrivals at a rate of 20 per hour during peak times and 5 per hour during off-peak times. The process still has independent increments, but the increments are no longer stationary.
Compound Poisson processes
A compound Poisson process attaches a random "size" to each event. If $N(t)$ is a Poisson process and $X_1, X_2, \ldots$ are i.i.d. random variables representing event sizes, the aggregate process is:
$$S(t) = \sum_{i=1}^{N(t)} X_i$$
This is the standard model for aggregate claims in insurance. The number of claims follows a Poisson process, and each claim has a random dollar amount. The mean and variance of $S(t)$ are:
$$E[S(t)] = \lambda t\, E[X], \qquad \mathrm{Var}(S(t)) = \lambda t\, E[X^2]$$
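The moment formulas as a small helper; the claim-size moments below are assumed for illustration:

```python
def compound_poisson_moments(lam: float, t: float, ex: float, ex2: float):
    """E[S(t)] = lam*t*E[X] and Var(S(t)) = lam*t*E[X^2] for a compound Poisson sum."""
    return lam * t * ex, lam * t * ex2

# Hypothetical: 5 claims/month over 12 months, E[X] = 2000, E[X^2] = 6e6.
mean_s, var_s = compound_poisson_moments(lam=5.0, t=12.0, ex=2000.0, ex2=6.0e6)
# mean_s = 120000.0, var_s = 360000000.0
```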
Mixed Poisson processes
In a mixed Poisson process, the rate $\lambda$ is itself a random variable drawn from some mixing distribution. This captures heterogeneity across a population. For auto insurance, different drivers have genuinely different risk levels. If $\lambda$ follows a gamma distribution, the resulting marginal distribution of claim counts is negative binomial, which naturally accommodates overdispersion.
Doubly stochastic Poisson processes
Also called Cox processes, these allow $\lambda(t)$ to be a stochastic process rather than a deterministic function or a single random variable. The event process is Poisson conditional on the realized path of $\lambda(t)$. Cox processes are used in credit risk modeling, where default intensities fluctuate with economic conditions, and in seismology, where earthquake rates vary unpredictably.
Estimating Poisson process parameters
Maximum likelihood estimation
For a homogeneous Poisson process, the MLE of $\lambda$ is straightforward:
$$\hat{\lambda} = \frac{n}{T}$$
where $n$ is the number of observed events and $T$ is the total observation time. If you observe 47 claims over 12 months, $\hat{\lambda} = 47/12 \approx 3.92$ claims per month. MLE is asymptotically unbiased and achieves the lowest possible variance among consistent estimators (it's asymptotically efficient).
Method of moments
The method of moments equates sample moments to theoretical moments and solves for the parameters. For a Poisson process, since both the mean and variance equal $\lambda$, the estimator is the sample mean of event counts per unit time. This gives the same result as MLE for the homogeneous case, but method of moments can be less efficient for more complex models or small samples.
Bayesian estimation
Bayesian estimation combines a prior distribution for $\lambda$ with observed data to produce a posterior distribution. A common choice is a gamma prior, because it's conjugate to the Poisson likelihood. If the prior is $\mathrm{Gamma}(\alpha, \beta)$ and you observe $n$ events in time $T$, the posterior is:
$$\lambda \mid \text{data} \sim \mathrm{Gamma}(\alpha + n,\ \beta + T)$$
The posterior mean is $\dfrac{\alpha + n}{\beta + T}$, which is a weighted average of the prior mean $\alpha/\beta$ and the MLE $n/T$. Bayesian estimation is especially valuable when data is sparse and you have credible prior information.
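The conjugate update in code; the prior parameters and observed counts are hypothetical:

```python
def gamma_poisson_update(alpha: float, beta: float, n_events: int, t: float):
    """Conjugate update: Gamma(alpha, beta) prior -> Gamma(alpha + n, beta + t)."""
    return alpha + n_events, beta + t

# Hypothetical Gamma(2, 1) prior (prior mean 2 events/year), then 9 events in 4 years.
a_post, b_post = gamma_poisson_update(2.0, 1.0, n_events=9, t=4.0)
post_mean = a_post / b_post  # (2 + 9) / (1 + 4) = 2.2
```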
Confidence intervals for parameters
For large samples, an approximate confidence interval for $\lambda$ uses the normal approximation:
$$\hat{\lambda} \pm z_{\alpha/2} \sqrt{\frac{\hat{\lambda}}{T}}$$
where $T$ is the total observation time. For exact intervals, you can use the relationship between the Poisson distribution and the chi-square distribution. If $n$ events are observed, an exact $100(1 - \alpha)\%$ confidence interval for $\lambda T$ is:
$$\left[\tfrac{1}{2}\chi^2_{2n,\ \alpha/2},\ \ \tfrac{1}{2}\chi^2_{2n + 2,\ 1 - \alpha/2}\right]$$
where $\chi^2_{d,\, p}$ denotes the $p$-quantile of the chi-square distribution with $d$ degrees of freedom. Divide by $T$ to get the interval for $\lambda$.
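One way to compute the exact interval without a statistics library is to invert the Poisson CDF by bisection; this sketch reuses the 47-claims-in-12-months example:

```python
import math

def poisson_cdf(k: int, mean: float) -> float:
    """P(X <= k) for X ~ Poisson(mean), by direct summation."""
    term = math.exp(-mean)
    total = term
    for i in range(1, k + 1):
        term *= mean / i
        total += term
    return total

def poisson_exact_ci(n: int, t: float, conf: float = 0.95):
    """Exact (Garwood-style) CI for lambda, bisecting the Poisson CDF in its mean."""
    alpha = 1.0 - conf

    def solve_mean(k: int, target: float) -> float:
        # Find mean m with P(X <= k | m) = target; the CDF is decreasing in m.
        lo_m, hi_m = 0.0, 10.0 * (n + 10)
        for _ in range(200):
            mid = 0.5 * (lo_m + hi_m)
            if poisson_cdf(k, mid) > target:
                lo_m = mid
            else:
                hi_m = mid
        return 0.5 * (lo_m + hi_m)

    # Lower: P(X >= n | m) = alpha/2, i.e. P(X <= n-1 | m) = 1 - alpha/2.
    lower = 0.0 if n == 0 else solve_mean(n - 1, 1.0 - alpha / 2)
    upper = solve_mean(n, alpha / 2)
    return lower / t, upper / t

lo, hi = poisson_exact_ci(47, t=12.0)  # exact 95% CI for the monthly rate
```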
Poisson process simulation
Simulation is essential for studying process behavior, testing statistical methods, and generating scenarios for risk analysis.
Generating Poisson random variables
To generate a Poisson random variable with mean $\mu$:
- Inverse transform method: Generate uniform random numbers and use the Poisson CDF to map them to counts. Accumulate probabilities until the cumulative probability exceeds the uniform draw.
- For large $\mu$: Use the normal approximation $N \approx \mu + \sqrt{\mu}\, Z$ where $Z \sim N(0, 1)$ (rounded to a non-negative integer), or use more sophisticated algorithms like the one by Ahrens and Dieter.

Simulating arrival times
The most direct way to simulate a Poisson process is through its inter-arrival times:
- Generate i.i.d. exponential random variables $T_1, T_2, \ldots$ with rate $\lambda$. Each can be generated as $T_i = -\ln(U_i)/\lambda$ where $U_i \sim \mathrm{Uniform}(0, 1)$.
- Compute arrival times as cumulative sums: $S_n = T_1 + T_2 + \cdots + T_n$.
- Stop when $S_n$ exceeds the desired time horizon.
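The three steps above as a sketch (the rate and horizon are arbitrary choices):

```python
import math
import random

def simulate_poisson_arrivals(lam: float, horizon: float, seed: int = 1):
    """Arrival times of a rate-lam Poisson process on (0, horizon]."""
    rng = random.Random(seed)
    arrivals, t = [], 0.0
    while True:
        t += -math.log(1.0 - rng.random()) / lam  # exponential inter-arrival gap
        if t > horizon:
            return arrivals
        arrivals.append(t)

arrivals = simulate_poisson_arrivals(lam=5.0, horizon=100.0)
# Expect roughly lam * horizon = 500 arrivals on average.
```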
For non-homogeneous processes, the thinning algorithm is commonly used:
- Find an upper bound $\bar{\lambda} \ge \lambda(t)$ for all $t$ in the simulation window.
- Simulate a homogeneous Poisson process with rate $\bar{\lambda}$.
- Accept each event at time $t$ with probability $\lambda(t)/\bar{\lambda}$; otherwise discard it.
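A sketch of the thinning algorithm with an assumed sinusoidal intensity:

```python
import math
import random

def simulate_nhpp_thinning(intensity, lam_bar, horizon, seed=2):
    """Thinning: simulate at rate lam_bar, keep events with prob intensity(t)/lam_bar."""
    rng = random.Random(seed)
    events, t = [], 0.0
    while True:
        t += -math.log(1.0 - rng.random()) / lam_bar
        if t > horizon:
            return events
        if rng.random() < intensity(t) / lam_bar:
            events.append(t)

# Assumed sinusoidal intensity peaking mid-window; its maximum 18 bounds it.
rate = lambda t: 10.0 + 8.0 * math.sin(math.pi * t / 24.0)
events = simulate_nhpp_thinning(rate, lam_bar=18.0, horizon=24.0)
```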
Monte Carlo methods
Monte Carlo estimation works by simulating many independent realizations of the process and averaging the results. To estimate the probability that total claims exceed a threshold:
- Simulate $n$ realizations of the claim process (frequency and severity).
- For each realization, compute the aggregate loss.
- The proportion of realizations exceeding the threshold estimates the probability.
Accuracy improves with more simulations, with the standard error decreasing proportionally to $1/\sqrt{n}$.
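A minimal Monte Carlo tail estimate, assuming (for illustration only) Poisson frequency and exponential severities:

```python
import math
import random

def tail_probability(n_sims, lam, claim_mean, threshold, seed=3):
    """Estimate P(aggregate loss > threshold) by simulating the compound process."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_sims):
        # Claim count in one period: accumulate exponential inter-arrival gaps.
        t, n = 0.0, 0
        while True:
            t += -math.log(1.0 - rng.random()) / lam
            if t > 1.0:
                break
            n += 1
        total = sum(rng.expovariate(1.0 / claim_mean) for _ in range(n))
        hits += total > threshold
    p_hat = hits / n_sims
    std_err = math.sqrt(p_hat * (1.0 - p_hat) / n_sims)
    return p_hat, std_err

p_hat, std_err = tail_probability(2000, lam=4.0, claim_mean=1000.0, threshold=8000.0)
```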
Variance reduction techniques
Standard Monte Carlo can require a very large number of simulations for precise estimates. Variance reduction techniques improve efficiency:
- Antithetic variates: If uniforms $U_i$ generate one realization, use $1 - U_i$ to generate a second. The negative correlation between the pair reduces overall variance.
- Control variates: Use a related quantity with a known expected value to adjust estimates.
- Importance sampling: Oversample from regions of the distribution that matter most (e.g., the tail) and reweight accordingly.
- Stratified sampling: Divide the probability space into strata and sample from each, ensuring better coverage.
Advanced topics in Poisson processes
Superposition of Poisson processes
If you combine (superpose) independent Poisson processes with rates $\lambda_1, \lambda_2, \ldots, \lambda_n$, the result is a Poisson process with rate:
$$\lambda = \lambda_1 + \lambda_2 + \cdots + \lambda_n$$
This is useful when events come from multiple independent sources. An insurance company receiving claims from three independent product lines with rates 10, 15, and 8 per month can model the combined stream as a single Poisson process with rate 33 per month.
Thinning of Poisson processes
Thinning is the reverse of superposition. Starting from a Poisson process with rate $\lambda$, independently keep each event with probability $p$ and discard it with probability $1 - p$. The kept events form a Poisson process with rate $p\lambda$, and the discarded events form an independent Poisson process with rate $(1 - p)\lambda$.
For example, if defective items occur at rate $\lambda$ and each defective item is caught by inspection with probability $p$, the detected defects follow a Poisson process with rate $p\lambda$.
Poisson process transformations
Poisson processes can be transformed in several ways:
- Time scaling: Replacing $t$ with $ct$ in a Poisson process with rate $\lambda$ yields a Poisson process with rate $c\lambda$. This is useful for converting between time units.
- Random time change: Substituting a random process for the deterministic time variable creates a new process. If $N(t)$ is Poisson and you replace $t$ with a subordinator (a non-decreasing random process), the result can model systems with random operating times.
Marked Poisson processes
A marked Poisson process associates a random mark with each event. The marks can be continuous (claim amounts), discrete (claim types), or even multivariate (location and severity together). The event times and marks are typically assumed independent.
Marked Poisson processes generalize compound Poisson processes and are the natural framework for modeling heterogeneous event streams. In catastrophe modeling, each event might carry marks for location, magnitude, and insured loss.
Poisson processes in actuarial applications
Pricing insurance contracts
The standard actuarial pricing framework uses the frequency-severity approach:
- Model claim frequency with a Poisson process (rate $\lambda$).
- Model claim severity with a separate distribution (lognormal, Pareto, gamma, etc.).
- Compute the expected aggregate loss: $E[S] = \lambda\, E[X]$ per unit time.
- Add loadings for expenses, profit margin, and risk to arrive at the premium.
For a motor insurance portfolio with claim frequency $\lambda$ per policy per year and average claim size of $4,200, the pure premium per policy is $4{,}200\,\lambda$ per year.
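Plugging in an assumed frequency (the 0.05 claims per policy-year below is illustrative, not from the text):

```python
def pure_premium(lam: float, avg_claim: float) -> float:
    """Expected aggregate loss per policy per period: lambda * E[X]."""
    return lam * avg_claim

# Assumed frequency of 0.05 claims per policy-year, $4,200 average claim.
premium = pure_premium(0.05, 4200.0)  # 210.0 per policy-year before loadings
```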
Ruin theory and Poisson processes
The Cramér-Lundberg model is the classical ruin theory framework. An insurer starts with initial surplus $u$, collects premiums at rate $c$ per unit time, and pays claims that arrive as a compound Poisson process. The surplus at time $t$ is:
$$U(t) = u + ct - S(t)$$
where $S(t)$ is the aggregate claims process. Ruin occurs if $U(t) < 0$ for any $t > 0$. The probability of ruin depends on the initial surplus, the premium rate, the claim frequency $\lambda$, and the claim size distribution. For exponentially distributed claims with mean $\mu$, the ruin probability has a closed-form solution:
$$\psi(u) = \frac{\lambda \mu}{c} \exp\!\left[-\left(\frac{1}{\mu} - \frac{\lambda}{c}\right) u\right]$$
provided $c > \lambda \mu$ (premiums exceed expected claims).
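The closed-form ruin probability as a small function; the portfolio numbers are hypothetical:

```python
import math

def ruin_probability(u: float, lam: float, mu: float, c: float) -> float:
    """Cramer-Lundberg ruin probability for exponential(mean mu) claim sizes.

    psi(u) = (lam*mu/c) * exp(-(1/mu - lam/c) * u), valid when c > lam*mu.
    """
    if c <= lam * mu:
        return 1.0  # premiums do not exceed expected claims: eventual ruin is certain
    return (lam * mu / c) * math.exp(-(1.0 / mu - lam / c) * u)

# Hypothetical: 10 claims/year, mean claim 1.0, premium rate 12 (20% loading).
psi_0 = ruin_probability(0.0, lam=10.0, mu=1.0, c=12.0)  # = 10/12
```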
Reinsurance modeling
Reinsurance transfers part of the risk from a primary insurer (cedent) to a reinsurer. Poisson process models help evaluate different treaty structures:
- Quota share: The reinsurer takes a fixed proportion $a$ of every claim. The cedent's retained process is a compound Poisson process with the same frequency but scaled severity $(1 - a) X_i$.
- Excess-of-loss: The reinsurer pays the portion of each claim exceeding a retention $d$. The reinsurer's process involves only claims where $X_i > d$, which by thinning is itself a Poisson process.
- Stop-loss: The reinsurer covers aggregate losses exceeding a threshold. This requires the full aggregate loss distribution, typically computed via the compound Poisson model.
Risk management with Poisson processes
Actuaries use Poisson-based aggregate loss models to compute key risk measures:
- Value-at-Risk (VaR): The loss amount at a specified quantile (e.g., 99.5%). If the aggregate loss distribution gives $P(S \le s) = 0.995$, then $\mathrm{VaR}_{99.5\%} = s$.
- Tail Value-at-Risk (TVaR): The expected loss given that the loss exceeds VaR. TVaR captures the severity of tail events, not just their threshold.
These measures drive capital requirements under regulatory frameworks like Solvency II. The aggregate loss distribution is typically computed using Panjer's recursion, FFT methods, or Monte Carlo simulation from the underlying compound Poisson model.