Conditional probability and independence are the tools you use to update beliefs when new information arrives and to determine whether events influence each other. In stochastic processes, nearly everything builds on these ideas: Markov chains rely on conditional independence, Bayesian inference is a direct application of Bayes' theorem, and the multiplication rule shows up constantly when you work with joint distributions.
Definition of conditional probability
Conditional probability quantifies how likely an event is once you know something else has happened. Instead of looking at the entire sample space, you zoom in on just the outcomes where the given event occurred and ask how probable your target event is within that restricted space.
Probability of an event given another
The notation $P(A \mid B)$ is read as "the probability of A given B." It represents the probability that event A occurs when you already know event B has occurred. You're essentially restricting your attention to the subset of outcomes where B happens, then measuring how much of that subset also contains A.
Notation and formula
The conditional probability of A given B is:

$$P(A \mid B) = \frac{P(A \cap B)}{P(B)}$$

- $P(A \cap B)$ is the joint probability of A and B both occurring.
- $P(B)$ is the marginal probability of B.
- This formula is only defined when $P(B) > 0$, since division by zero is undefined.
The intuition: you take the probability that both events happen and normalize it by the probability of the event you're conditioning on.
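This normalization can be seen concretely by enumerating a small sample space. The sketch below uses two fair dice (a hypothetical example, not from the text above), with A = "the sum is 8" and B = "the first die shows 3":

```python
from fractions import Fraction

# Enumerate the sample space of two fair six-sided dice.
space = [(d1, d2) for d1 in range(1, 7) for d2 in range(1, 7)]

# Event A: the sum is 8; event B: the first die shows 3.
A = {s for s in space if s[0] + s[1] == 8}
B = {s for s in space if s[0] == 3}

p_B = Fraction(len(B), len(space))            # P(B) = 6/36
p_A_and_B = Fraction(len(A & B), len(space))  # P(A ∩ B) = 1/36

# Conditional probability: the joint normalized by P(B).
p_A_given_B = p_A_and_B / p_B
print(p_A_given_B)  # 1/6
```

Restricting to the six outcomes where the first die is 3, exactly one (namely (3, 5)) gives a sum of 8, which matches the 1/6.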
Calculating conditional probabilities
There are several practical methods for computing conditional probabilities, depending on what information you have.
From a joint probability distribution
When you have the full joint distribution of two (or more) random variables, you can read off conditional probabilities directly. Find $P(A \cap B)$ from the joint table, then divide by the marginal $P(B)$. The marginal is obtained by summing across the appropriate row or column.
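As a minimal sketch, consider a hypothetical joint distribution over two variables stored as a dictionary; the marginal is the column sum and the conditional is the joint entry divided by that sum:

```python
# Hypothetical joint distribution of X (rows) and Y (columns).
joint = {
    ("x1", "y1"): 0.10, ("x1", "y2"): 0.20,
    ("x2", "y1"): 0.30, ("x2", "y2"): 0.40,
}

def marginal_Y(y):
    # Sum the column for Y = y across all X values.
    return sum(p for (x, yy), p in joint.items() if yy == y)

def cond_X_given_Y(x, y):
    # P(X = x | Y = y) = P(X = x, Y = y) / P(Y = y)
    return joint[(x, y)] / marginal_Y(y)

print(cond_X_given_Y("x1", "y1"))  # 0.10 / 0.40 = 0.25
```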
Using a tree diagram
Tree diagrams break a problem into sequential stages. Each branch represents a possible outcome, and the probability on each branch is a conditional probability given the preceding branches.
- Draw the first set of branches for the initial event (e.g., B or $B^c$), labeling each with its probability.
- From each first-stage branch, draw branches for the second event (e.g., A or $A^c$), labeling each with the appropriate conditional probability.
- To find $P(A \cap B)$, multiply the probabilities along the path that leads to both A and B.
- To find $P(A \mid B)$, take that product and divide by $P(B)$.
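The steps above can be sketched with hypothetical branch probabilities (exact fractions used to keep the arithmetic transparent):

```python
from fractions import Fraction

# Hypothetical two-stage tree: the first branch chooses B or its complement,
# the second branch is A or not-A with the given conditional probabilities.
p_B = Fraction(3, 5)           # P(B)
p_A_given_B = Fraction(3, 10)  # P(A | B)
p_A_given_Bc = Fraction(4, 5)  # P(A | B^c)

# Multiply along a path to get a joint probability:
p_A_and_B = p_A_given_B * p_B
# Sum over all paths ending at A (law of total probability):
p_A = p_A_and_B + p_A_given_Bc * (1 - p_B)

# Recover the conditional by dividing the path product by P(B):
print(p_A_and_B / p_B == p_A_given_B)  # True
print(p_A)  # 1/2
```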
With a contingency table
A contingency table (two-way table) displays frequencies or probabilities for two categorical variables. To calculate $P(A \mid B)$:
- Locate the cell where A and B intersect. This gives you the joint count (or probability).
- Find the row or column total for B.
- Divide the joint value by the B total.
For example, if 30 out of 200 people are in both category A and category B, and 80 people total are in category B, then $P(A \mid B) = 30/80 = 0.375$.
Multiplication rule
The multiplication rule connects joint probabilities to conditional probabilities. It's the workhorse formula for computing the probability that multiple events all occur.
Deriving the general formula
Rearranging the conditional probability definition gives:

$$P(A \cap B) = P(A \mid B)\,P(B)$$

Equivalently, $P(A \cap B) = P(B \mid A)\,P(A)$. Both forms are useful depending on which conditional probability is easier to find.
For three events, the rule extends via the chain rule:

$$P(A \cap B \cap C) = P(A)\,P(B \mid A)\,P(C \mid A \cap B)$$
This pattern generalizes to any finite number of events and is heavily used when working with stochastic processes.
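A minimal sketch of the general chain rule, where the conditionals are supplied as hypothetical inputs (`conds[k]` standing for $P(A_{k+1} \mid A_1 \cap \cdots \cap A_k)$, with `conds[0]` being $P(A_1)$):

```python
from math import prod

def chain_rule(conds):
    # P(A1 ∩ ... ∩ An) is the product of successive conditional probabilities.
    return prod(conds)

# Example: P(A) = 0.5, P(B | A) = 0.4, P(C | A ∩ B) = 0.25
print(chain_rule([0.5, 0.4, 0.25]))  # 0.05
```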
For independent events
When A and B are independent, knowing one occurred tells you nothing about the other. The conditional probability collapses: $P(A \mid B) = P(A)$. The multiplication rule simplifies to:

$$P(A \cap B) = P(A)\,P(B)$$
For dependent events
When A and B are dependent, the occurrence of one changes the probability of the other. You must use the full multiplication rule with the conditional probability:

$$P(A \cap B) = P(A \mid B)\,P(B)$$
A classic example: drawing cards without replacement. The probability of drawing two aces is $\frac{4}{52} \times \frac{3}{51} = \frac{1}{221}$.
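The card calculation, done with exact fractions:

```python
from fractions import Fraction

# Drawing without replacement: the second factor is a conditional
# probability, P(second ace | first ace), since one ace is already gone.
p_two_aces = Fraction(4, 52) * Fraction(3, 51)
print(p_two_aces)  # 1/221
```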
Law of total probability
The law of total probability lets you compute $P(A)$ by breaking the sample space into simpler pieces and summing up contributions from each piece.
Partitioning the sample space
A partition of the sample space is a collection of events that are:
- Mutually exclusive: no two can occur at the same time ($B_i \cap B_j = \emptyset$ for $i \neq j$).
- Exhaustive: together they cover every possible outcome ($B_1 \cup B_2 \cup \cdots \cup B_n = S$).
Applying the law of total probability
Given a partition $B_1, B_2, \ldots, B_n$:

$$P(A) = \sum_{i=1}^{n} P(A \mid B_i)\,P(B_i)$$

Each term $P(A \mid B_i)\,P(B_i)$ represents the contribution to $P(A)$ from the scenario where $B_i$ occurs. You weight each conditional probability by how likely that scenario is, then add them up.
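A minimal sketch with a hypothetical three-part partition (the priors and conditionals are made-up numbers for illustration):

```python
# Law of total probability over a partition B1, B2, B3.
priors = [0.5, 0.3, 0.2]          # P(B_i); must sum to 1
likelihoods = [0.10, 0.40, 0.90]  # P(A | B_i)

# Weight each conditional by its scenario's probability, then sum.
p_A = sum(p_ab * p_b for p_ab, p_b in zip(likelihoods, priors))
print(round(p_A, 2))  # 0.35
```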
With a tree diagram
On a tree diagram, the law of total probability corresponds to summing all paths that lead to event A:
- The first-level branches represent the partition $B_1, \ldots, B_n$.
- The second-level branches represent A occurring (or not) given each $B_i$.
- Multiply along each path that ends at A.
- Sum all those products to get $P(A)$.
Bayes' theorem
Bayes' theorem lets you "reverse" a conditional probability. If you know $P(B \mid A)$ but need $P(A \mid B)$, Bayes' theorem is the bridge.
Derivation using conditional probability
Start from two expressions for the joint probability:

$$P(A \cap B) = P(A \mid B)\,P(B) = P(B \mid A)\,P(A)$$

Setting these equal and solving for $P(A \mid B)$:

$$P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B)}$$

The denominator $P(B)$ is often computed using the law of total probability.
Updating probabilities with new information
Bayes' theorem has a natural interpretation in terms of updating beliefs:
- Prior $P(A)$: your belief about A before seeing any new data.
- Likelihood $P(B \mid A)$: how probable the observed data B is if A were true.
- Posterior $P(A \mid B)$: your updated belief about A after observing B.
- Evidence $P(B)$: the total probability of observing B (acts as a normalizing constant).
The posterior is proportional to the prior times the likelihood: $P(A \mid B) \propto P(B \mid A)\,P(A)$.
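A minimal Bayes update, with the evidence expanded via the law of total probability over A and its complement (the input numbers are hypothetical):

```python
def bayes(prior, likelihood, likelihood_complement):
    """P(A | B) from P(A), P(B | A), and P(B | not A)."""
    # Evidence P(B) via the law of total probability.
    evidence = likelihood * prior + likelihood_complement * (1 - prior)
    # Posterior = likelihood * prior / evidence.
    return likelihood * prior / evidence

# Hypothetical numbers: P(A) = 0.01, P(B | A) = 0.9, P(B | A^c) = 0.05
print(round(bayes(0.01, 0.9, 0.05), 3))  # 0.154
```

Even with a likelihood of 0.9, the small prior keeps the posterior modest; the evidence term is dominated by the complement scenario.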
Applications in decision making
Bayes' theorem appears wherever you need to update probabilities with evidence:
- Medical diagnosis: A test has sensitivity $P(\text{positive} \mid \text{disease})$ and specificity $P(\text{negative} \mid \text{no disease})$. Bayes' theorem converts these into the clinically useful quantity $P(\text{disease} \mid \text{positive})$, the positive predictive value.
- Spam filtering: Given features of an email, Bayes' theorem updates the probability that it's spam.
- Machine learning: Model parameters are updated as new data arrives, treating the parameter as the "hypothesis" and the data as the "evidence."
Independence of events
Independence means that learning one event occurred gives you zero information about whether another event occurred. This property dramatically simplifies calculations.
Definition of independence
Two events A and B are independent if and only if:

$$P(A \cap B) = P(A)\,P(B)$$

Equivalently, $P(A \mid B) = P(A)$ (assuming $P(B) > 0$). The joint probability factors into the product of the marginals.
Checking for independence
To test whether A and B are independent:
- Compute $P(A)$, $P(B)$, and $P(A \cap B)$.
- Check whether $P(A \cap B) = P(A)\,P(B)$.
- If equality holds, the events are independent. If not, they're dependent.
Alternatively, check whether $P(A \mid B) = P(A)$. Both tests are equivalent when $P(B) > 0$.
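The product test can be run directly on an enumerated sample space. Here, with two fair dice (a hypothetical example), "first die shows 3" and "second die is even" should pass the test, since the dice don't influence each other:

```python
from fractions import Fraction

space = [(d1, d2) for d1 in range(1, 7) for d2 in range(1, 7)]
A = {s for s in space if s[0] == 3}       # first die shows 3
B = {s for s in space if s[1] % 2 == 0}   # second die is even

def prob(event):
    return Fraction(len(event), len(space))

# Product test: P(A ∩ B) == P(A) * P(B)?
independent = prob(A & B) == prob(A) * prob(B)
print(independent)  # True
```

Exact fractions are used so the equality check isn't muddied by floating-point rounding.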
Properties of independent events
- If A and B are independent, then A and $B^c$ are also independent (and likewise $A^c$ and B, and $A^c$ and $B^c$).
- Pairwise independence does not imply mutual independence. Three events A, B, C can each be pairwise independent while still being mutually dependent. Mutual independence requires that every subset satisfies the product rule, including the triple: $P(A \cap B \cap C) = P(A)\,P(B)\,P(C)$.
- For mutually independent events, any sub-collection is also independent.
Conditional independence
Conditional independence extends the idea of independence to situations where a third event provides context. Two events that are dependent overall might become independent once you condition on a third event (or vice versa).
Definition and notation
Events A and B are conditionally independent given C if:

$$P(A \cap B \mid C) = P(A \mid C)\,P(B \mid C)$$

This is denoted $A \perp B \mid C$. Intuitively, once you know C has occurred, learning A gives you no additional information about B.
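Both the definition and the contrast with marginal independence can be verified numerically. The sketch below builds a hypothetical joint model where A and B are independent given C by construction, yet marginally dependent (both lean the same way as C, so each carries information about the other):

```python
from itertools import product

# Hypothetical model: C is a fair coin; given C, A and B are independent
# with P(A=1 | C) = P(B=1 | C) = 0.9 if C = 1, else 0.1.
def p(a, b, c):
    q = 0.9 if c == 1 else 0.1
    pa = q if a == 1 else 1 - q
    pb = q if b == 1 else 1 - q
    return 0.5 * pa * pb

joint = {(a, b, c): p(a, b, c) for a, b, c in product([0, 1], repeat=3)}

def marg(pred):
    # Probability of the event defined by the predicate pred(a, b, c).
    return sum(v for k, v in joint.items() if pred(*k))

# Conditional independence given C = 1: P(A,B | C) == P(A | C) * P(B | C)?
pc = marg(lambda a, b, c: c == 1)
lhs = marg(lambda a, b, c: a == 1 and b == 1 and c == 1) / pc
rhs = (marg(lambda a, b, c: a == 1 and c == 1) / pc) * \
      (marg(lambda a, b, c: b == 1 and c == 1) / pc)
print(abs(lhs - rhs) < 1e-12)  # True: A ⊥ B | C by construction

# Marginal independence fails: P(A,B) != P(A) * P(B).
p_ab = marg(lambda a, b, c: a == 1 and b == 1)
p_a = marg(lambda a, b, c: a == 1)
p_b = marg(lambda a, b, c: b == 1)
print(abs(p_ab - p_a * p_b) > 1e-3)  # True: marginally dependent
```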
Conditional independence vs marginal independence
These are genuinely different properties. Neither one implies the other:
- A and B can be marginally dependent but conditionally independent given C. (Example: two diseases that share a common symptom become independent once you condition on the symptom.)
- A and B can be marginally independent but conditionally dependent given C. (This is known as Berkson's paradox or "explaining away.")
Confusing these two types of independence is a common source of modeling errors.
Markov chains and conditional independence
Markov chains are stochastic processes where the Markov property holds: the future state depends only on the present state, not on the full history. Formally, for states $X_0, X_1, \ldots, X_{n+1}$:

$$P(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \ldots, X_0 = x_0) = P(X_{n+1} = x \mid X_n = x_n)$$

This is a conditional independence statement: $X_{n+1} \perp (X_0, \ldots, X_{n-1}) \mid X_n$. The Markov property is what makes these processes tractable, since you only need to track the current state rather than the entire past.
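A minimal simulation of a two-state chain makes the tractability concrete: the step function takes only the current state, never the path that led there. The transition matrix is a hypothetical example:

```python
import random

# Hypothetical transition matrix: P[i][j] = P(next = j | current = i).
P = [[0.9, 0.1],
     [0.5, 0.5]]

def step(state, rng):
    # The next state depends only on the current state (Markov property),
    # not on the history of how we got here.
    return 0 if rng.random() < P[state][0] else 1

rng = random.Random(42)  # seeded for reproducibility
state, path = 0, [0]
for _ in range(10):
    state = step(state, rng)
    path.append(state)
print(path)
```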
Applications of conditional probability
In medical testing and diagnosis
Medical tests are characterized by two conditional probabilities:
- Sensitivity: $P(\text{positive} \mid \text{disease})$, the true positive rate.
- Specificity: $P(\text{negative} \mid \text{no disease})$, the true negative rate.
What patients and doctors actually want is the positive predictive value $P(\text{disease} \mid \text{positive})$. Bayes' theorem, combined with the disease prevalence (the prior), converts sensitivity and specificity into this clinically relevant number. When prevalence is low, even a highly specific test can have a surprisingly low positive predictive value.
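The low-prevalence effect is easy to demonstrate with hypothetical test characteristics:

```python
def ppv(prevalence, sensitivity, specificity):
    """Positive predictive value P(disease | positive) via Bayes' theorem."""
    true_pos = sensitivity * prevalence              # P(+ | D) * P(D)
    false_pos = (1 - specificity) * (1 - prevalence) # P(+ | no D) * P(no D)
    return true_pos / (true_pos + false_pos)

# Hypothetical test: 99% sensitive, 95% specific, 1% prevalence.
print(round(ppv(0.01, 0.99, 0.95), 3))  # 0.167
```

Despite the strong-sounding test, only about one in six positive results indicates actual disease, because false positives from the large healthy population swamp the true positives.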
In machine learning and data science
- Naive Bayes classifiers predict class labels by applying Bayes' theorem and assuming conditional independence among features given the class. Despite this strong assumption, they often perform well in practice (e.g., text classification).
- Bayesian networks represent probabilistic relationships among variables as directed acyclic graphs. Each node stores a conditional probability distribution given its parents, and the full joint distribution factors according to the graph structure.
In genetics and probability
Conditional probability underlies genetic inheritance calculations. Punnett squares, for instance, display the conditional probabilities of offspring genotypes given parental genotypes. In population genetics, conditional probability is used to study how allele frequencies shift under processes like natural selection, genetic drift, and migration.
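A Punnett square for a standard monohybrid cross (Aa x Aa) can be enumerated directly; each offspring genotype is conditioned on the parental genotypes, with each allele combination equally likely:

```python
from fractions import Fraction
from itertools import product

# Monohybrid cross of two heterozygous parents (Aa x Aa).
parent1, parent2 = ("A", "a"), ("A", "a")

# Each offspring draws one allele from each parent, all pairs equally likely.
offspring = ["".join(sorted(pair)) for pair in product(parent1, parent2)]

def genotype_prob(g):
    # P(genotype | parents Aa x Aa) as a fraction of the Punnett square.
    return Fraction(offspring.count(g), len(offspring))

print(genotype_prob("AA"), genotype_prob("Aa"), genotype_prob("aa"))
# 1/4 1/2 1/4
```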