🧩 Discrete Mathematics Unit 12 – Discrete Probability

Discrete probability lays the foundation for understanding uncertainty in various fields. It introduces key concepts like sample spaces, events, and probability axioms, providing tools to quantify and analyze random phenomena. This unit covers essential topics such as counting techniques, conditional probability, and independence. It also explores discrete probability distributions, their properties, and applications in real-world problem-solving, preparing students for advanced statistical analysis and decision-making.

Key Concepts and Definitions

  • Probability measures the likelihood of an event occurring and is expressed as a value between 0 and 1
    • 0 indicates an impossible event, while 1 represents a certain event
  • Sample space (S) consists of all possible outcomes of an experiment or random process
  • An event (E) is a subset of the sample space containing one or more outcomes
    • Events can be simple (single outcome) or compound (multiple outcomes)
  • Mutually exclusive events cannot occur simultaneously in a single trial
  • Collectively exhaustive events cover all possible outcomes in the sample space
  • Complementary events (E and E') are mutually exclusive and collectively exhaustive, satisfying P(E) + P(E') = 1
  • Probability mass function (PMF) assigns probabilities to each possible outcome in a discrete probability distribution
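To make these definitions concrete, here is a minimal Python sketch (the fair-die PMF is an illustrative choice, not an example from the unit text) that stores a PMF as a dictionary and checks its defining properties:

```python
from fractions import Fraction

# PMF of a fair six-sided die (illustrative example): each outcome
# in the sample space S = {1, ..., 6} has probability 1/6.
pmf = {outcome: Fraction(1, 6) for outcome in range(1, 7)}

# Every probability lies between 0 and 1 ...
assert all(0 <= p <= 1 for p in pmf.values())
# ... and the probabilities over the whole sample space sum to 1.
assert sum(pmf.values()) == 1

# The probability of an event is the sum of its outcomes' probabilities.
even = {2, 4, 6}
print(sum(pmf[o] for o in even))  # 1/2
```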

Sample Spaces and Events

  • Defining the sample space is crucial for calculating probabilities and analyzing events
    • Example: Rolling a fair six-sided die has a sample space S = {1, 2, 3, 4, 5, 6}
  • Events are described using set notation and can be represented using Venn diagrams
    • Example: The event of rolling an even number is E = {2, 4, 6}
  • The union of two events (A ∪ B) contains all outcomes that belong to either event A, event B, or both
  • The intersection of two events (A ∩ B) contains outcomes that belong to both event A and event B
  • The complement of an event (E') consists of all outcomes in the sample space that are not in event E
  • The empty set (∅) represents an impossible event with no outcomes
  • The probability of an event, P(E), is the sum of the probabilities of all outcomes in that event
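Because events are sets, Python's built-in set operations model them directly. The sketch below reuses the fair-die events from this list; the prob helper assumes equally likely outcomes:

```python
from fractions import Fraction

S = set(range(1, 7))  # sample space for one fair six-sided die

def prob(event):
    """P(E) under equally likely outcomes: |E| / |S|."""
    return Fraction(len(event), len(S))

A = {2, 4, 6}   # event: roll an even number
B = {1, 2, 3}   # event: roll a number less than 4

print(prob(A | B))    # union:        P(A ∪ B) = 5/6
print(prob(A & B))    # intersection: P(A ∩ B) = 1/6
print(prob(S - A))    # complement:   P(A') = 1/2
print(prob(set()))    # empty set:    P(∅) = 0
```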

Probability Axioms and Rules

  • Axiom 1: Non-negativity - The probability of any event E is non-negative, i.e., P(E) ≥ 0
  • Axiom 2: Normalization - The probability of the entire sample space S is equal to 1, i.e., P(S) = 1
  • Axiom 3: Additivity - For any two mutually exclusive events A and B, P(A ∪ B) = P(A) + P(B)
  • Multiplication rule: For any two events A and B, P(A ∩ B) = P(A) × P(B|A), where P(B|A) is the conditional probability of B given A
  • Addition rule: For any two events A and B, P(A ∪ B) = P(A) + P(B) - P(A ∩ B)
    • If A and B are mutually exclusive, P(A ∩ B) = 0, simplifying the addition rule to P(A ∪ B) = P(A) + P(B)
  • Complement rule: For any event E, P(E') = 1 - P(E), where E' is the complement of E
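These rules can be verified by brute-force enumeration. A minimal sketch, assuming two fair dice and two illustrative events of my own choosing (the sum is 7, and doubles):

```python
from fractions import Fraction
from itertools import product

# Sample space for two fair dice: 36 equally likely ordered pairs.
S = set(product(range(1, 7), repeat=2))

def prob(event):
    return Fraction(len(event), len(S))

A = {(i, j) for (i, j) in S if i + j == 7}   # event: the sum is 7
B = {(i, j) for (i, j) in S if i == j}       # event: doubles

# Addition rule: P(A ∪ B) = P(A) + P(B) - P(A ∩ B).
assert prob(A | B) == prob(A) + prob(B) - prob(A & B)
# A and B happen to be mutually exclusive (a double can't sum to 7),
# so the rule simplifies to P(A ∪ B) = P(A) + P(B).
assert prob(A & B) == 0 and prob(A | B) == prob(A) + prob(B)
# Complement rule: P(A') = 1 - P(A).
assert prob(S - A) == 1 - prob(A)
print(prob(A), prob(B), prob(A | B))  # 1/6 1/6 1/3
```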

Counting Techniques for Probability

  • The fundamental counting principle (multiplication rule) states that if an experiment consists of n independent stages with m₁, m₂, ..., mₙ possible outcomes respectively, the total number of possible outcomes is m₁ × m₂ × ... × mₙ
    • Example: The number of possible outcomes when rolling two fair dice is 6 × 6 = 36
  • Permutations count the number of ways to arrange r objects chosen from n distinct objects in a specific order
    • The formula for permutations is P(n, r) = n! / (n - r)!, where n is the total number of objects and r is the number of objects being arranged
  • Combinations count the number of ways to select r objects from a set of n distinct objects, disregarding order
    • The formula for combinations is C(n, r) = n! / (r!(n - r)!)
  • Permutations with repetition count the number of ways to fill r positions when each position can hold any of n objects, with repetition allowed
    • The formula for permutations with repetition is n^r
  • Combinations with repetition count the number of ways to select r objects from a set of n distinct objects, allowing repetition and disregarding order
    • The formula for combinations with repetition is C(n + r - 1, r)
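Python's math module implements these counts directly as math.perm and math.comb (available since Python 3.8). A short sketch cross-checking each formula, with n = 5 and r = 3 chosen arbitrarily:

```python
from math import comb, factorial, perm

n, r = 5, 3  # arbitrary values chosen for illustration

# Permutations: P(n, r) = n! / (n - r)!
assert perm(n, r) == factorial(n) // factorial(n - r) == 60
# Combinations: C(n, r) = n! / (r!(n - r)!)
assert comb(n, r) == factorial(n) // (factorial(r) * factorial(n - r)) == 10
# Permutations with repetition: n^r ways to fill r positions
# when each position can independently hold any of n objects.
assert n**r == 125
# Combinations with repetition: C(n + r - 1, r)
assert comb(n + r - 1, r) == 35

print(perm(n, r), comb(n, r), n**r, comb(n + r - 1, r))  # 60 10 125 35
```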

Conditional Probability

  • Conditional probability P(A|B) measures the probability of event A occurring given that event B has already occurred
    • The formula for conditional probability is P(A|B) = P(A ∩ B) / P(B), where P(B) ≠ 0
  • Bayes' theorem relates conditional probabilities and can be used to update probabilities based on new information
    • Bayes' theorem states that P(A|B) = P(B|A) × P(A) / P(B), where P(B) ≠ 0
  • The law of total probability states that for a partition of the sample space {B₁, B₂, ..., Bₙ}, P(A) = P(A|B₁)P(B₁) + P(A|B₂)P(B₂) + ... + P(A|Bₙ)P(Bₙ)
    • This law is useful when the probability of an event A is unknown, but conditional probabilities given a partition of the sample space are known
  • The multiplication rule for conditional probability states that P(A ∩ B) = P(A|B) × P(B) = P(B|A) × P(A)
  • Conditional probability is used in various applications, such as medical diagnosis, machine learning, and decision-making under uncertainty
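A standard illustration of Bayes' theorem combined with the law of total probability is updating a medical diagnosis after a positive test. The prevalence, sensitivity, and false-positive rate below are hypothetical numbers chosen only to make the sketch concrete:

```python
from fractions import Fraction

# Hypothetical test parameters (illustrative, not from the unit):
p_disease = Fraction(1, 100)           # P(D): prevalence, 1%
p_pos_given_d = Fraction(95, 100)      # P(+|D): sensitivity, 95%
p_pos_given_not_d = Fraction(10, 100)  # P(+|D'): false-positive rate, 10%

# Law of total probability over the partition {D, D'}:
# P(+) = P(+|D) P(D) + P(+|D') P(D')
p_pos = p_pos_given_d * p_disease + p_pos_given_not_d * (1 - p_disease)

# Bayes' theorem: P(D|+) = P(+|D) P(D) / P(+)
p_d_given_pos = p_pos_given_d * p_disease / p_pos
print(p_d_given_pos, float(p_d_given_pos))  # 19/217 ≈ 0.0876
```

Even with a fairly accurate test, the posterior probability stays below 9% because the disease is rare; this base-rate effect is exactly what Bayes' theorem makes explicit.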

Independence and Dependence

  • Two events A and B are independent if the occurrence of one event does not affect the probability of the other event
    • Mathematically, A and B are independent if P(A ∩ B) = P(A) × P(B), or equivalently, P(A|B) = P(A) and P(B|A) = P(B)
  • If events A and B are independent, then their complements A' and B' are also independent
  • Pairwise independence does not imply mutual independence for three or more events
    • Example: Flip a fair coin twice. Let A be the event that the first flip is heads, B the event that the second flip is heads, and C the event that both flips show the same face. Each pair of these events is independent (every pairwise intersection has probability 1/4 = 1/2 × 1/2), but P(A ∩ B ∩ C) = 1/4 ≠ 1/8 = P(A) × P(B) × P(C), so the three events are pairwise independent without being mutually independent (verified in the sketch after this list)
  • Conditional independence: Events A and B are conditionally independent given event C if P(A ∩ B|C) = P(A|C) × P(B|C)
  • Independence is a crucial concept in probability theory and is used in various applications, such as random variable generation, hypothesis testing, and Bayesian networks
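The two-coin example above can be checked by enumerating the four equally likely outcomes; a minimal sketch:

```python
from fractions import Fraction
from itertools import product

S = set(product("HT", repeat=2))  # {HH, HT, TH, TT}, equally likely

def prob(event):
    return Fraction(len(event), len(S))

A = {o for o in S if o[0] == "H"}    # first flip is heads
B = {o for o in S if o[1] == "H"}    # second flip is heads
C = {o for o in S if o[0] == o[1]}   # both flips show the same face

# Pairwise independent: every pair factors.
for X, Y in [(A, B), (A, C), (B, C)]:
    assert prob(X & Y) == prob(X) * prob(Y)  # 1/4 = 1/2 × 1/2

# ... but not mutually independent: the triple does not factor.
assert prob(A & B & C) == Fraction(1, 4)             # A ∩ B ∩ C = {HH}
assert prob(A) * prob(B) * prob(C) == Fraction(1, 8)  # ≠ 1/4
```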

Discrete Probability Distributions

  • A discrete probability distribution assigns probabilities to each possible outcome in a discrete sample space
  • The probability mass function (PMF) p(x) gives the probability that a discrete random variable X takes on a specific value x
    • Properties of a PMF: 0 ≤ p(x) ≤ 1 for all x, and Σₓ p(x) = 1
  • The cumulative distribution function (CDF) F(x) gives the probability that a random variable X takes a value less than or equal to x
    • F(x) = P(X ≤ x) = Σ_{t ≤ x} p(t)
  • Common discrete probability distributions include:
    • Bernoulli distribution: Models a single trial with two possible outcomes (success or failure)
    • Binomial distribution: Models the number of successes in a fixed number of independent Bernoulli trials
    • Geometric distribution: Models the number of trials needed to achieve the first success in a series of independent Bernoulli trials
    • Poisson distribution: Models the number of events occurring in a fixed interval of time or space, given a known average rate of occurrence
  • The expected value (mean) of a discrete random variable X is E(X) = Σₓ x × p(x)
  • The variance of a discrete random variable X is Var(X) = E(X²) - [E(X)]², where E(X²) = Σₓ x² × p(x)
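As a sketch, the PMF, CDF, expected value, and variance of a binomial random variable can all be computed from first principles with math.comb; the parameters n = 10 and p = 0.3 are arbitrary illustrative values:

```python
from math import comb

n, p = 10, 0.3  # illustrative: 10 Bernoulli trials, P(success) = 0.3

def pmf(k):
    """Binomial PMF: P(X = k) = C(n, k) p^k (1 - p)^(n - k)."""
    return comb(n, k) * p**k * (1 - p) ** (n - k)

def cdf(x):
    """Binomial CDF: F(x) = P(X <= x) = sum of pmf(t) for t <= x."""
    return sum(pmf(t) for t in range(0, x + 1))

mean = sum(k * pmf(k) for k in range(n + 1))              # E(X)
var = sum(k**2 * pmf(k) for k in range(n + 1)) - mean**2  # E(X²) - [E(X)]²

print(round(pmf(3), 4))  # 0.2668
print(round(cdf(3), 4))  # 0.6496
print(round(mean, 4))    # 3.0
print(round(var, 4))     # 2.1
```

The printed mean and variance match the known closed forms for the binomial distribution, np = 3 and np(1 - p) = 2.1.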

Applications and Problem Solving

  • Probability theory has numerous real-world applications in various fields, such as:
    • Finance (portfolio optimization, risk management)
    • Engineering (reliability analysis, quality control)
    • Computer science (algorithm analysis, cryptography)
    • Biology (genetics, population dynamics)
  • When solving probability problems, it is essential to:
    • Clearly define the sample space and events of interest
    • Identify the appropriate probability rules, axioms, or distributions to apply
    • Use counting techniques (permutations, combinations) when necessary
    • Employ conditional probability and Bayes' theorem when dealing with dependent events or updating probabilities based on new information
  • Example: In a group of 10 people, 4 have brown eyes, and 6 have blue eyes. If 3 people are randomly selected from the group, what is the probability that exactly 2 of them have blue eyes?
    • Solution: Use the hypergeometric distribution, which models the number of successes in a fixed number of draws from a population without replacement
    • The probability is P(X = 2) = [C(6, 2) × C(4, 1)] / C(10, 3) = (15 × 4) / 120 = 0.5
  • Simulation techniques, such as Monte Carlo methods, can be used to estimate probabilities in complex scenarios or when analytical solutions are difficult to obtain (see the simulation sketch after this list)
  • Probability theory provides a foundation for statistical inference, hypothesis testing, and decision-making under uncertainty
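Tying together the worked example and the simulation point above, here is a minimal Monte Carlo sketch (the seed and trial count are arbitrary) that estimates the same probability by repeatedly drawing 3 people without replacement and compares it to the exact hypergeometric value:

```python
import random
from math import comb

# Population from the worked example: 6 blue-eyed, 4 brown-eyed people.
group = ["blue"] * 6 + ["brown"] * 4

# Exact answer via the hypergeometric formula.
exact = comb(6, 2) * comb(4, 1) / comb(10, 3)  # 0.5

# Monte Carlo estimate: sample 3 people without replacement many times
# and count how often exactly 2 of them have blue eyes.
random.seed(42)  # fixed seed so the run is reproducible
trials = 100_000
hits = sum(random.sample(group, 3).count("blue") == 2 for _ in range(trials))

print(exact)          # 0.5
print(hits / trials)  # ≈ 0.5; the estimate improves as trials grows
```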

