🎲 Intro to Probability Unit 8 Review

8.3 Poisson distribution

Written by the Fiveable Content Team • Last updated August 2025

The Poisson Distribution

The Poisson distribution models how many times an event occurs within a fixed interval of time or space, given a known average rate. It's one of the most widely used discrete distributions because so many real-world counting problems fit its structure: customer arrivals per hour, typos per page, car accidents per month at an intersection.

Where the Binomial distribution counts successes in a fixed number of trials, the Poisson distribution counts events in a fixed window with no predetermined upper limit. That single difference changes when you reach for each tool.

Definition and Probability Mass Function

A random variable X follows a Poisson distribution with parameter λ if its probability mass function (PMF) is:

P(X = k) = \frac{\lambda^k \cdot e^{-\lambda}}{k!}

  • λ (lambda) is the average rate of occurrence over the interval
  • k is the number of events you're asking about (0, 1, 2, 3, …)
  • e is Euler's number (approximately 2.718)

The PMF is defined for all non-negative integers k = 0, 1, 2, … There's no upper cap on k, though probabilities become vanishingly small for values far above λ.

For the PMF to be valid, three conditions must hold:

  1. Events occur independently of one another.
  2. The average rate λ is constant across the interval.
  3. Two events cannot happen at exactly the same instant (no simultaneous occurrences).

Worked example: A call center receives an average of 3 calls per minute (λ = 3). What's the probability of receiving exactly 5 calls in a given minute?

P(X = 5) = \frac{3^5 \cdot e^{-3}}{5!} = \frac{243 \cdot 0.0498}{120} \approx 0.1008

So there's roughly a 10.1% chance of exactly 5 calls in that minute.
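This calculation is easy to verify by coding the PMF directly. The sketch below uses only Python's standard library; the function name `poisson_pmf` is just for illustration:

```python
import math

def poisson_pmf(k, lam):
    """P(X = k) for a Poisson random variable with rate lam."""
    return lam**k * math.exp(-lam) / math.factorial(k)

# Call-center example: lambda = 3 calls per minute, exactly 5 calls
print(round(poisson_pmf(5, 3), 4))  # ≈ 0.1008
```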

One useful additive property: if X₁ and X₂ are independent Poisson random variables with parameters λ₁ and λ₂, then X₁ + X₂ is also Poisson with parameter λ₁ + λ₂. This makes it easy to combine or rescale intervals. If you get 3 calls per minute on average, you'd use λ = 15 for a 5-minute window.
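The additive property can be checked numerically: convolving the PMFs of two independent Poisson variables should reproduce the PMF of a single Poisson with the summed rate. A stdlib-only sketch (the rates 2 and 3 and the value k = 4 are arbitrary choices):

```python
import math

def poisson_pmf(k, lam):
    """P(X = k) for a Poisson random variable with rate lam."""
    return lam**k * math.exp(-lam) / math.factorial(k)

lam1, lam2, k = 2.0, 3.0, 4

# P(X1 + X2 = k) via convolution of the two PMFs
conv = sum(poisson_pmf(i, lam1) * poisson_pmf(k - i, lam2) for i in range(k + 1))

# Direct PMF of Poisson(lam1 + lam2)
direct = poisson_pmf(k, lam1 + lam2)

print(abs(conv - direct) < 1e-12)  # True: the two agree
```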

Applications

  • Queuing systems: customers arriving at a store, calls to a help desk
  • Spatial distributions: defects per square meter of fabric, potholes per mile of road
  • Rare events: mutations per DNA strand, insurance claims per year
  • Natural phenomena: radioactive decays per second, meteor sightings per hour

Parameters of the Poisson Distribution


Mean, Variance, and Shape

The Poisson distribution has a single parameter λ, and a remarkable feature: the mean and variance are both equal to λ.

  • Expected value: E[X] = λ
  • Variance: Var(X) = λ
  • Standard deviation: σ = √λ

This mean-equals-variance property is a signature of the Poisson. If you're looking at count data and the sample variance is much larger or smaller than the sample mean, a Poisson model may not be appropriate.

The shape of the distribution depends on λ:

  • Small λ (say, 1 or 2): the distribution is noticeably right-skewed, with a peak at or near 0.
  • Large λ (say, 20+): the distribution becomes roughly symmetric and starts to resemble a normal distribution.

The mode (most probable value) is the largest integer less than or equal to λ. When λ is itself an integer, both λ and λ − 1 are modes.

Interpreting Lambda

Think of λ as the answer to: "On average, how many times does this event happen per interval?" That interval could be one hour, one page, one square meter, or whatever unit fits your problem.

A few things to keep in mind:

  • λ must match the interval you're analyzing. If the rate is 6 emails per hour but you're looking at a 10-minute window, use λ = 1.
  • You can estimate λ from data by taking the sample mean of observed counts.
  • Changing λ shifts and stretches the entire distribution, so getting it right is critical for accurate predictions.

Probabilities and Moments of the Poisson Distribution


Probability Calculations

For a single value, plug directly into the PMF:

P(X = k) = \frac{\lambda^k \cdot e^{-\lambda}}{k!}

For a range of values, use the cumulative distribution function (CDF):

P(X \leq k) = \sum_{i=0}^{k} \frac{\lambda^i \cdot e^{-\lambda}}{i!}
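The CDF is just a partial sum of PMF terms, which is straightforward to compute directly. A stdlib-only sketch (the helper name `poisson_cdf` is illustrative):

```python
import math

def poisson_cdf(k, lam):
    """P(X <= k): partial sum of the Poisson PMF."""
    return sum(lam**i * math.exp(-lam) / math.factorial(i) for i in range(k + 1))

# P(X <= 4) for lambda = 3
print(round(poisson_cdf(4, 3), 4))  # ≈ 0.8153
```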

A common trick for "at least" problems: use the complement.

P(X \geq 1) = 1 - P(X = 0) = 1 - e^{-\lambda}

Worked example: A website averages 2 server errors per day (λ = 2). What's the probability of no errors on a given day?

P(X = 0) = \frac{2^0 \cdot e^{-2}}{0!} = e^{-2} \approx 0.1353

About a 13.5% chance. The probability of at least one error is 1 − 0.1353 = 0.8647, or roughly 86.5%.
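The complement trick is a one-liner in code; a minimal sketch of the server-error example using only the standard library:

```python
import math

lam = 2  # average server errors per day
p_zero = math.exp(-lam)       # P(X = 0) = e^(-lambda)
p_at_least_one = 1 - p_zero   # complement rule

print(round(p_zero, 4))          # 0.1353
print(round(p_at_least_one, 4))  # 0.8647
```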

Higher Moments

The moment generating function (MGF) is:

M(t) = e^{\lambda(e^t - 1)}

From the MGF (or by direct calculation), you can derive:

  • Second moment: E[X²] = λ² + λ
  • Skewness: 1/√λ (decreases as λ grows, confirming the distribution becomes more symmetric)
  • Excess kurtosis: 1/λ (approaches 0 for large λ, matching the normal distribution)

For large λ or k values, calculating the PMF by hand gets tedious. Statistical software, calculator functions, or Poisson tables are the practical way to go.

Poisson vs. Binomial Distributions

When to Use Which

| Feature | Binomial | Poisson |
| --- | --- | --- |
| What it counts | Successes in n fixed trials | Events in a fixed interval |
| Upper limit on count | Yes (at most n) | No theoretical upper limit |
| Parameters | n (trials) and p (success probability) | λ (average rate) |
| Mean | np | λ |
| Variance | np(1 − p) | λ |
| Mean vs. variance | Variance ≤ mean (since 1 − p ≤ 1) | Variance = mean always |

Quick rule of thumb: if you can count the number of trials, think Binomial. If you're counting events in a continuous interval and there's no natural cap, think Poisson.

The Poisson Approximation to the Binomial

The Poisson distribution is actually the limiting case of the Binomial as n → ∞ and p → 0 while the product np stays constant. In practice, the approximation works well when:

  • n is large (commonly n ≥ 20)
  • p is small (commonly p ≤ 0.05)
  • The product np ≤ 10

To apply the approximation, set λ = np and use the Poisson PMF instead of the Binomial PMF.

Worked example: A factory produces 1,000 items per day, each with a 0.002 probability of being defective. What's the probability of exactly 3 defective items?

Using the Binomial directly would require computing \binom{1000}{3}(0.002)^3(0.998)^{997}. With the Poisson approximation, set λ = 1000 × 0.002 = 2:

P(X = 3) = \frac{2^3 \cdot e^{-2}}{3!} = \frac{8 \cdot 0.1353}{6} \approx 0.1804

This is sometimes called the Law of Rare Events: when you have many trials each with a tiny probability of success, the total count of successes follows an approximately Poisson distribution.
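It's easy to see how good the approximation is by computing both probabilities for the factory example; a stdlib-only sketch (the helper names are illustrative):

```python
import math

def binom_pmf(k, n, p):
    """Exact Binomial P(X = k)."""
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    """Poisson P(X = k)."""
    return lam**k * math.exp(-lam) / math.factorial(k)

n, p, k = 1000, 0.002, 3
exact = binom_pmf(k, n, p)        # exact Binomial answer
approx = poisson_pmf(k, n * p)    # Poisson approximation with lambda = np = 2

# The two agree to about three decimal places
print(round(exact, 4), round(approx, 4))
```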