The Poisson Distribution
The Poisson distribution models how many times an event occurs within a fixed interval of time or space, given a known average rate. It's one of the most widely used discrete distributions because so many real-world counting problems fit its structure: customer arrivals per hour, typos per page, car accidents per month at an intersection.
Where the Binomial distribution counts successes in a fixed number of trials, the Poisson distribution counts events in a fixed window with no predetermined upper limit. That single difference changes when you reach for each tool.
Definition and Probability Mass Function
A random variable X follows a Poisson distribution with parameter λ if its probability mass function (PMF) is:

P(X = k) = (λ^k · e^(−λ)) / k!

where:
- λ (lambda) is the average rate of occurrence over the interval
- k is the number of events you're asking about (0, 1, 2, 3, …)
- e is Euler's number (approximately 2.718)

The PMF is defined for all non-negative integers k. There's no upper cap on k, though probabilities become vanishingly small for values far above λ.
For the PMF to be valid, three conditions must hold:
- Events occur independently of one another.
- The average rate is constant across the interval.
- Two events cannot happen at exactly the same instant (no simultaneous occurrences).
Worked example: A call center receives an average of 3 calls per minute (λ = 3). What's the probability of receiving exactly 5 calls in a given minute?

P(X = 5) = (3^5 · e^(−3)) / 5! = (243 × 0.0498) / 120 ≈ 0.1008
So there's roughly a 10.1% chance of exactly 5 calls in that minute.
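The calculation above can be reproduced in a few lines of Python using only the standard library (a minimal sketch; the helper name `poisson_pmf` is just illustrative):

```python
from math import exp, factorial

def poisson_pmf(k, lam):
    """P(X = k) for a Poisson random variable with rate lam."""
    return lam ** k * exp(-lam) / factorial(k)

p5 = poisson_pmf(5, 3)  # λ = 3 calls per minute, exactly 5 calls
print(f"P(X = 5) = {p5:.4f}")  # about 0.1008
```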
One useful additive property: if X and Y are independent Poisson random variables with parameters λ₁ and λ₂, then X + Y is also Poisson with parameter λ₁ + λ₂. This makes it easy to combine or rescale intervals. If you get 3 calls per minute on average, you'd use λ = 15 for a 5-minute window.
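The additive property can be verified numerically by convolving two Poisson PMFs and comparing against a single Poisson with the summed rate (a sketch with arbitrary example rates λ₁ = 2 and λ₂ = 3):

```python
from math import exp, factorial

def poisson_pmf(k, lam):
    return lam ** k * exp(-lam) / factorial(k)

lam1, lam2 = 2.0, 3.0   # arbitrary example rates
k = 4

# P(X + Y = k) computed by convolving the two individual PMFs
convolved = sum(poisson_pmf(i, lam1) * poisson_pmf(k - i, lam2)
                for i in range(k + 1))
# The additive property says this equals a single Poisson(λ1 + λ2) PMF
direct = poisson_pmf(k, lam1 + lam2)
print(convolved, direct)  # both about 0.1755
```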
Applications
- Queuing systems: customers arriving at a store, calls to a help desk
- Spatial distributions: defects per square meter of fabric, potholes per mile of road
- Rare events: mutations per DNA strand, insurance claims per year
- Natural phenomena: radioactive decays per second, meteor sightings per hour
Parameters of the Poisson Distribution

Mean, Variance, and Shape
The Poisson distribution has a single parameter λ, and a remarkable feature: the mean and variance are both equal to λ.
- Expected value: E[X] = λ
- Variance: Var(X) = λ
- Standard deviation: SD(X) = √λ
This mean-equals-variance property is a signature of the Poisson. If you're looking at count data and the sample variance is much larger or smaller than the sample mean, a Poisson model may not be appropriate.
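A quick dispersion check on count data might look like this (the counts below are made-up illustrative data, not from any real dataset):

```python
from statistics import mean, variance

# Hypothetical observed counts (illustrative data only)
counts = [2, 3, 1, 4, 2, 0, 3, 2, 5, 1, 2, 3]
m = mean(counts)
v = variance(counts)  # sample variance
dispersion = v / m    # near 1 is consistent with Poisson; far from 1 is a red flag
print(f"mean={m:.2f}  variance={v:.2f}  dispersion={dispersion:.2f}")
```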
The shape of the distribution depends on λ:
- Small λ (say, 1 or 2): the distribution is noticeably right-skewed, with a peak at or near 0.
- Large λ (say, 20+): the distribution becomes roughly symmetric and starts to resemble a normal distribution.
The mode (most probable value) is the largest integer less than or equal to λ. When λ is itself an integer, both λ and λ − 1 are modes.
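The mode rule can be confirmed by scanning the PMF (a small sketch; λ = 3 is an arbitrary integer choice, for which both 2 and 3 should tie as modes):

```python
from math import exp, factorial

def poisson_pmf(k, lam):
    return lam ** k * exp(-lam) / factorial(k)

lam = 3  # integer rate, so two modes are expected: λ − 1 and λ
probs = [poisson_pmf(k, lam) for k in range(20)]
peak = max(probs)
modes = [k for k, prob in enumerate(probs) if abs(prob - peak) < 1e-12]
print(modes)  # [2, 3]
```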
Interpreting Lambda
Think of λ as the answer to: "On average, how many times does this event happen per interval?" That interval could be one hour, one page, one square meter, or whatever unit fits your problem.
A few things to keep in mind:
- λ must match the interval you're analyzing. If the rate is 6 emails per hour but you're looking at a 10-minute window, use λ = 6 × (10/60) = 1.
- You can estimate λ from data by taking the sample mean of observed counts.
- Changing λ shifts and stretches the entire distribution, so getting it right is critical for accurate predictions.
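Rescaling λ to the interval of interest might look like this in Python (a sketch of the 6-emails-per-hour case):

```python
from math import exp

# Rate: 6 emails per hour; window: 10 minutes
rate_per_hour = 6
window_minutes = 10
lam = rate_per_hour * window_minutes / 60  # rescaled rate: λ = 1

p_zero = exp(-lam)  # P(X = 0) = e^(−λ), about 0.368
print(lam, p_zero)
```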
Probabilities and Moments of the Poisson Distribution

Probability Calculations
For a single value, plug k directly into the PMF:

P(X = k) = (λ^k · e^(−λ)) / k!

For a range of values, use the cumulative distribution function (CDF):

P(X ≤ k) = P(X = 0) + P(X = 1) + … + P(X = k)
A common trick for "at least" problems: use the complement.

P(X ≥ 1) = 1 − P(X = 0)
Worked example: A website averages 2 server errors per day (λ = 2). What's the probability of no errors on a given day?

P(X = 0) = (2^0 · e^(−2)) / 0! = e^(−2) ≈ 0.1353

About a 13.5% chance. The probability of at least one error is 1 − 0.1353 ≈ 0.8647, or roughly 86.5%.
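Both quantities, plus a simple CDF, can be computed directly (a minimal sketch; `poisson_cdf` is an illustrative helper, not a library function):

```python
from math import exp, factorial

def poisson_pmf(k, lam):
    """P(X = k) for a Poisson(lam) random variable."""
    return lam ** k * exp(-lam) / factorial(k)

def poisson_cdf(k, lam):
    """P(X <= k): sum the PMF from 0 through k."""
    return sum(poisson_pmf(i, lam) for i in range(k + 1))

lam = 2  # average server errors per day
p_none = poisson_pmf(0, lam)       # e^(-2), about 0.135
p_at_least_one = 1 - p_none        # complement, about 0.865
p_at_most_two = poisson_cdf(2, lam)
print(p_none, p_at_least_one, p_at_most_two)
```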
Higher Moments
The moment generating function (MGF) is:

M(t) = e^(λ(e^t − 1))
From the MGF (or by direct calculation), you can derive:
- Second moment: E[X²] = λ² + λ
- Skewness: 1/√λ (decreases as λ grows, confirming the distribution becomes more symmetric)
- Excess kurtosis: 1/λ (approaches 0 for large λ, matching the normal distribution)
For large λ or k values, calculating the PMF by hand gets tedious. Statistical software, calculator functions, or Poisson tables are the practical way to go.
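In Python, for instance, the moment identity above can be checked numerically with the standard library alone (a sketch; λ = 4 is an arbitrary choice):

```python
from math import exp, factorial

def poisson_pmf(k, lam):
    return lam ** k * exp(-lam) / factorial(k)

lam = 4.0  # arbitrary rate
# E[X^2] should equal λ^2 + λ; sum enough terms for convergence
second_moment = sum(k ** 2 * poisson_pmf(k, lam) for k in range(100))
print(second_moment, lam ** 2 + lam)  # both about 20.0
```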
Poisson vs. Binomial Distributions
When to Use Which
| Feature | Binomial | Poisson |
|---|---|---|
| What it counts | Successes in fixed trials | Events in a fixed interval |
| Upper limit on count | Yes (at most n) | No theoretical upper limit |
| Parameters | n (trials) and p (success probability) | λ (average rate) |
| Mean | np | λ |
| Variance | np(1 − p) | λ |
| Mean vs. variance | Variance < mean (since 1 − p < 1) | Variance = mean, always |

Quick rule of thumb: if you can count the number of trials, think Binomial. If you're counting events in a continuous interval and there's no natural cap, think Poisson.
The Poisson Approximation to the Binomial
The Poisson distribution is actually the limiting case of the Binomial as n → ∞ and p → 0 while the product np = λ stays constant. In practice, the approximation works well when:
- n is large (commonly n ≥ 20)
- p is small (commonly p ≤ 0.05)
- The product np is moderate (commonly np ≤ 10)
To apply the approximation, set λ = np and use the Poisson PMF instead of the Binomial PMF.
Worked example: A factory produces 1,000 items per day, each with a 0.002 probability of being defective. What's the probability of exactly 3 defective items?
Using the Binomial directly would require computing C(1000, 3) × (0.002)^3 × (0.998)^997. With the Poisson approximation, set λ = np = 1000 × 0.002 = 2:

P(X = 3) = (2^3 · e^(−2)) / 3! = (8 × 0.1353) / 6 ≈ 0.1804
This is sometimes called the Law of Rare Events: when you have many trials each with a tiny probability of success, the total count of successes follows an approximately Poisson distribution.
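The exact Binomial probability and its Poisson approximation can be compared directly (a sketch using Python's standard library; `math.comb` requires Python 3.8+):

```python
from math import comb, exp, factorial

n, p = 1000, 0.002
k = 3
lam = n * p  # λ = np = 2

binom_exact = comb(n, k) * p ** k * (1 - p) ** (n - k)
poisson_approx = lam ** k * exp(-lam) / factorial(k)
print(f"Binomial: {binom_exact:.5f}  Poisson: {poisson_approx:.5f}")
```

The two values agree to about three decimal places, which is typical when n is large and p is small.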