Geometric Distribution
The geometric distribution models how many trials you need before getting your first success in a series of independent experiments. Whether it's coin flips, product inspections, or job interviews, any time you're asking "how many attempts until it finally works?", the geometric distribution is your tool.
Geometric Distribution Probability Calculations
The geometric distribution applies when you have Bernoulli trials: repeated independent trials where each trial has only two outcomes (success or failure) and the probability of success stays the same every time.
The random variable X represents the number of trials until the first success. The probability mass function (PMF) is:
P(X = k) = (1 − p)^(k − 1) · p
where p is the probability of success on each trial and k = 1, 2, 3, ...
The logic behind this formula: to get your first success on trial k, you need k − 1 failures in a row (each with probability 1 − p), followed by one success (probability p).
To calculate a geometric probability:
- Identify p, the probability of success for each trial.
- Determine k, the specific trial number you're calculating the probability for.
- Plug p and k into the PMF and simplify.
Example: A weighted coin has a 0.4 probability of landing heads. What's the probability that the first heads occurs on the 3rd flip?
P(X = 3) = (1 − 0.4)^(3 − 1) · 0.4 = (0.6)^2 · 0.4 = 0.144
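The PMF above can be sketched as a small Python function; the function name is my own, and the coin example uses the values from the text (p = 0.4, first heads on flip 3).

```python
# Geometric PMF (number of trials until the first success),
# written directly from the formula P(X = k) = (1 - p)^(k - 1) * p.

def geometric_pmf(k: int, p: float) -> float:
    """Probability that the first success occurs on trial k."""
    if k < 1:
        raise ValueError("k must be a positive integer")
    return (1 - p) ** (k - 1) * p

# Weighted coin: P(first heads on the 3rd flip) with p = 0.4
print(round(geometric_pmf(3, 0.4), 3))  # 0.6 * 0.6 * 0.4 ≈ 0.144
```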
Interpreting Geometric Distribution Parameters
Mean (expected value): μ = 1/p
This tells you the average number of trials needed to get the first success. For example, since the probability of rolling a 6 on a fair die is p = 1/6, you'd expect on average 1/(1/6) = 6 rolls to get your first 6.
Notice the intuition: a lower probability of success means you'll need more trials on average.
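The die-roll intuition can be checked with a quick simulation; this is an illustrative sketch (the seed and trial count are arbitrary choices of mine), and the sample average should land close to the theoretical mean of 6.

```python
import random

# Simulate "rolls until the first 6" many times and compare the
# sample average to the theoretical mean 1/p = 6.

def trials_until_success(p: float, rng: random.Random) -> int:
    """Count Bernoulli(p) trials until the first success, inclusive."""
    count = 1
    while rng.random() >= p:
        count += 1
    return count

rng = random.Random(42)  # fixed seed for reproducibility
n = 100_000
average = sum(trials_until_success(1 / 6, rng) for _ in range(n)) / n
print(round(average, 2))  # close to the theoretical mean of 6
```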
Standard deviation: σ = √(1 − p) / p
This measures how much variability there is in the number of trials needed. A higher standard deviation means the actual number of attempts could differ a lot from the mean.
When applying these to real-world problems, always interpret in context. If p = 0.2 for getting hired at each job interview, the mean is 1/0.2 = 5 interviews, and the standard deviation is √0.8 / 0.2 ≈ 4.47 interviews. You'd say: "On average, a person needs 5 interviews to get hired, with a standard deviation of about 4.47 interviews, indicating quite a bit of variability."
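The interview numbers above follow directly from the two formulas; a minimal sketch, using the p = 0.2 value from the text:

```python
import math

# Mean and standard deviation of a geometric distribution,
# from mu = 1/p and sigma = sqrt(1 - p) / p.

p = 0.2  # probability of getting hired at each interview
mean = 1 / p
sd = math.sqrt(1 - p) / p

print(mean)          # 5.0 interviews on average
print(round(sd, 2))  # about 4.47 interviews
```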

Two Cases of Geometric Distributions
Your textbook or exam may define the geometric distribution in one of two ways. Read problems carefully to figure out which version is being used.
- Case 1: X = number of trials until the first success (includes the success trial)
  - x can be 1, 2, 3, ...
  - PMF: P(X = x) = (1 − p)^(x − 1) · p, where p is the probability of success
  - Mean: 1/p
- Case 2: Y = number of failures before the first success (does not include the success trial)
  - y can be 0, 1, 2, ...
  - PMF: P(Y = y) = (1 − p)^y · p, where p is the probability of success
  - Mean: (1 − p)/p
The key difference: Case 1 starts at x = 1 with exponent x − 1, while Case 2 starts at y = 0 with exponent y. The two are related by Y = X − 1. Most intro stats courses use Case 1.
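The two cases and their Y = X − 1 relationship can be put side by side in code; this is a pure-Python sketch with function names of my own choosing.

```python
# Case 1: trials until the first success (x = 1, 2, 3, ...)
def pmf_trials(x: int, p: float) -> float:
    """P(X = x) = (1 - p)^(x - 1) * p."""
    return (1 - p) ** (x - 1) * p

# Case 2: failures before the first success (y = 0, 1, 2, ...)
def pmf_failures(y: int, p: float) -> float:
    """P(Y = y) = (1 - p)^y * p."""
    return (1 - p) ** y * p

p = 0.4
for y in range(5):
    # Since Y = X - 1, P(Y = y) equals P(X = y + 1)
    assert pmf_failures(y, p) == pmf_trials(y + 1, p)
```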
Related Concepts
The cumulative distribution function (CDF) gives the probability that the first success occurs within k trials or fewer:
P(X ≤ k) = 1 − (1 − p)^k
This is useful when a problem asks something like "what's the probability of getting at least one success in 5 tries?" rather than asking about a specific trial number.
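An "at least one success in 5 tries" question maps straight onto the CDF; the example values below (p = 0.4, k = 5) are my own illustration, not from the text.

```python
# Geometric CDF: P(first success within k trials) = 1 - (1 - p)^k

def geometric_cdf(k: int, p: float) -> float:
    """Probability of at least one success in the first k trials."""
    return 1 - (1 - p) ** k

# At least one heads in 5 flips of a coin with p = 0.4
print(round(geometric_cdf(5, 0.4), 5))  # 1 - 0.6**5 = 0.92224
```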
The negative binomial distribution extends the geometric distribution to model the number of trials needed to achieve a specified number of successes (not just the first). The geometric distribution is the special case where you need exactly one success.
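The special-case relationship can be verified numerically. Below is a sketch of the negative binomial PMF in the "number of trials until the r-th success" convention (matching Case 1 above); the convention choice and function names are mine.

```python
from math import comb

# Negative binomial PMF: P(r-th success occurs on trial x),
# for x = r, r + 1, r + 2, ...
def neg_binomial_pmf(x: int, r: int, p: float) -> float:
    """Choose which r - 1 of the first x - 1 trials were successes,
    then multiply the success and failure probabilities."""
    return comb(x - 1, r - 1) * p ** r * (1 - p) ** (x - r)

p = 0.4
for x in range(1, 6):
    # With r = 1 this reduces to the geometric PMF (1 - p)^(x - 1) * p
    assert neg_binomial_pmf(x, 1, p) == (1 - p) ** (x - 1) * p
```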