🍬Honors Algebra II Unit 13 Review

13.1 Counting Principles and Probability

Written by the Fiveable Content Team • Last updated August 2025

Counting principles and probability give you the tools to figure out how many outcomes are possible in a situation and how likely any particular outcome is. These ideas show up constantly in Honors Algebra II, and they connect directly to the binomial theorem work you've already done. This section covers permutations, combinations, probability rules, conditional probability, and how the binomial theorem applies to probability problems.

Permutations and Combinations for Counting

Arranging Objects in a Specific Order

A permutation counts the number of ways to arrange objects when order matters. Choosing president, then vice president, then treasurer from a group is a permutation problem because the same three people in a different order produce a different result.

The formula is:

P(n, r) = \frac{n!}{(n - r)!}

  • n = total number of objects
  • r = number of objects being arranged

Example: How many ways can you arrange 5 books on a shelf?

P(5, 5) = \frac{5!}{(5-5)!} = \frac{120}{1} = 120 \text{ ways}
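A quick sanity check of the formula in Python's standard library (`math.perm` is built in from Python 3.8 on):

```python
from math import factorial, perm

# P(n, r) = n! / (n - r)!
def permutations_count(n: int, r: int) -> int:
    """Number of ordered arrangements of r objects chosen from n."""
    return factorial(n) // factorial(n - r)

# Arranging all 5 books on a shelf
print(permutations_count(5, 5))  # → 120
print(perm(5, 5))                # math.perm agrees → 120
```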

Selecting Subsets of Objects

A combination counts the number of ways to select objects when order does not matter. Choosing 3 committee members from a group of 10 is a combination problem because the same three people in any order form the same committee.

The formula is:

C(n, r) = \frac{n!}{r!(n - r)!}

  • n = total number of objects
  • r = number of objects being selected

Example: How many ways can you choose 3 toppings from a list of 10?

C(10, 3) = \frac{10!}{3! \cdot 7!} = \frac{720}{6} = 120 \text{ ways}
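The same check works for combinations (`math.comb` is also standard library):

```python
from math import comb, factorial

# C(n, r) = n! / (r! (n - r)!)
def combinations_count(n: int, r: int) -> int:
    """Number of unordered selections of r objects from n."""
    return factorial(n) // (factorial(r) * factorial(n - r))

# Choosing 3 toppings from 10
print(combinations_count(10, 3))  # → 120
print(comb(10, 3))                # → 120
```

Notice the toppings answer matches the bookshelf answer (120) by coincidence, not by rule: the two formulas count different things.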

The quick way to remember the difference: if rearranging your selection would create a different outcome, use permutations. If rearranging changes nothing, use combinations.

Applying the Fundamental Counting Principle

The Fundamental Counting Principle says that if one event can happen in m ways and a second independent event can happen in n ways, then the two events together can happen in m \times n ways. This extends to any number of events.

Example: You have 4 shirts and 3 pairs of pants. The number of possible outfits is 4 \times 3 = 12.

This principle is the backbone of most counting problems. Permutations and combinations are really just specialized applications of it.

Permutations and Combinations with Repetition

Sometimes objects can be selected more than once.

Permutations with repetition use the formula n^r, where n is the number of choices for each position and r is the number of positions.

  • Example: A 4-digit PIN using digits 0–9 allows repetition. That gives 10^4 = 10{,}000 possible codes.

Combinations with repetition (sometimes called "stars and bars" problems) use:

C(n + r - 1, r) = \frac{(n + r - 1)!}{r!(n - 1)!}

  • Example: Selecting 3 scoops from 5 ice cream flavors (repeats allowed): C(5 + 3 - 1, 3) = C(7, 3) = 35 ways.
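You can check the stars-and-bars formula against a brute-force enumeration; `itertools.combinations_with_replacement` generates exactly the multisets being counted:

```python
from itertools import combinations_with_replacement
from math import comb

flavors = 5   # n: number of choices
scoops = 3    # r: number of selections, repeats allowed

# Stars-and-bars formula: C(n + r - 1, r)
formula = comb(flavors + scoops - 1, scoops)

# Brute force: enumerate every unordered selection of 3 scoops with repeats
brute = sum(1 for _ in combinations_with_replacement(range(flavors), scoops))

print(formula, brute)  # → 35 35
```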

Probabilities of Events

Probability Basics

Probability measures how likely an event is to occur, on a scale from 0 (impossible) to 1 (certain). The probability of event A is written P(A).

The complement of event A, written A', is the event "not A." Their probabilities always add to 1:

P(A') = 1 - P(A)

This is surprisingly useful. When a problem asks for the probability that at least one thing happens, it's often easier to calculate the complement (nothing happens) and subtract from 1.

Example: If P(\text{rain}) = 0.3, then P(\text{no rain}) = 1 - 0.3 = 0.7.

Mutually Exclusive and Independent Events

Mutually exclusive events cannot happen at the same time. For mutually exclusive events A and B:

P(A \text{ or } B) = P(A) + P(B)

Example: Rolling a 1 or a 6 on a fair die: P(1 \text{ or } 6) = \frac{1}{6} + \frac{1}{6} = \frac{1}{3}

If events are not mutually exclusive (they can overlap), you need the general addition rule: P(A \text{ or } B) = P(A) + P(B) - P(A \text{ and } B). You subtract the overlap to avoid counting it twice.

Independent events are events where the outcome of one has no effect on the other. For independent events A and B:

P(A \text{ and } B) = P(A) \times P(B)

Example: Flipping two fair coins and getting heads on both: P(HH) = \frac{1}{2} \times \frac{1}{2} = \frac{1}{4}

Don't confuse these two concepts. Mutually exclusive is about "or" (can they both happen?). Independence is about "and" (does one affect the other?). In fact, if two events both have nonzero probability, they cannot be both mutually exclusive and independent.

Law of Large Numbers

The Law of Large Numbers states that as you increase the number of trials, the experimental probability gets closer and closer to the theoretical probability.

Example: Flip a fair coin 10 times and you might get 7 heads (70%). Flip it 10,000 times and your heads percentage will be very close to 50%. The coin doesn't "remember" past flips; it's just that random variation averages out over many trials.
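A short simulation makes the Law of Large Numbers concrete. This is a sketch, not tied to any specific example from the text; the seed is fixed only so the demo is reproducible:

```python
import random

random.seed(42)  # reproducible demo run

# Estimate P(heads) from increasingly many fair-coin flips
for trials in (10, 100, 10_000):
    heads = sum(random.random() < 0.5 for _ in range(trials))
    print(f"{trials:>6} flips: proportion of heads = {heads / trials:.3f}")
```

With only 10 flips the proportion can land far from 0.5; by 10,000 flips it is typically within a percentage point or two.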

Conditional Probability and Independence

Conditional Probability

Conditional probability is the probability of event A happening given that event B has already happened:

P(A|B) = \frac{P(A \text{ and } B)}{P(B)}

Read P(A|B) as "the probability of A given B." The key idea is that knowing B occurred changes your sample space.

The multiplication rule rearranges this formula:

P(A \text{ and } B) = P(B) \times P(A|B)

Example: From a standard 52-card deck, what's the probability that a card is a king given that it's a heart? There are 13 hearts, and 1 of them is a king, so P(\text{King}|\text{Heart}) = \frac{1}{13}.
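The card example can be verified by enumerating the deck and applying the definition directly, using exact fractions:

```python
from fractions import Fraction

# Build a 52-card deck as (rank, suit) pairs
ranks = ["A", "2", "3", "4", "5", "6", "7", "8", "9", "10", "J", "Q", "K"]
suits = ["hearts", "diamonds", "clubs", "spades"]
deck = [(r, s) for r in ranks for s in suits]

# Conditioning on "heart" shrinks the sample space to 13 cards
hearts = [card for card in deck if card[1] == "hearts"]
kings_among_hearts = [card for card in hearts if card[0] == "K"]

# P(King | Heart) = |King and Heart| / |Heart|
p_king_given_heart = Fraction(len(kings_among_hearts), len(hearts))
print(p_king_given_heart)  # → 1/13
```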

Independence

Two events A and B are independent if knowing one occurred gives you no information about the other. Formally:

P(A|B) = P(A) \quad \text{and} \quad P(B|A) = P(B)

To test whether events are independent, check if P(A \text{ and } B) = P(A) \times P(B). If that equation holds, they're independent. If not, they're dependent.

Example: Rolling a die and flipping a coin are independent. The die result tells you nothing about the coin.

Bayes' Theorem and Tree Diagrams

Bayes' Theorem lets you "reverse" a conditional probability. If you know P(B|A) but need P(A|B), Bayes' gives you:

P(A|B) = \frac{P(B|A) \times P(A)}{P(B)}

A common application: a medical test is 95% accurate, and 1% of the population has a disease. If someone tests positive, what's the probability they actually have the disease? Bayes' Theorem handles this, and the answer is often much lower than students expect because of false positives in the large healthy population.

Tree diagrams are a visual way to organize these problems:

  1. Draw branches for the first event (e.g., "has disease" vs. "no disease") and label each branch with its probability.
  2. From each first-event branch, draw branches for the second event (e.g., "tests positive" vs. "tests negative") and label with conditional probabilities.
  3. Multiply along a path to get the joint probability for that sequence of outcomes.
  4. Add the joint probabilities of all paths that lead to the outcome you want.

Tree diagrams make Bayes' Theorem problems much more manageable because you can see every possible path laid out.
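The tree-diagram steps above can be sketched as a short computation for the medical-test example. The text only says "95% accurate," so assume both the true-positive rate (sensitivity) and true-negative rate (specificity) are 95%:

```python
# Assumed numbers: 1% prevalence, 95% sensitivity, 95% specificity
p_disease = 0.01
p_pos_given_disease = 0.95   # branch: has disease → tests positive
p_pos_given_healthy = 0.05   # branch: no disease → tests positive (false positive)

# Multiply along each path, then add the paths ending in "positive"
p_positive = (p_disease * p_pos_given_disease
              + (1 - p_disease) * p_pos_given_healthy)

# Bayes: P(disease | positive)
p_disease_given_pos = p_disease * p_pos_given_disease / p_positive
print(f"{p_disease_given_pos:.3f}")  # → 0.161
```

Even with a 95%-accurate test, a positive result means only about a 16% chance of disease, because false positives from the large healthy population dominate.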

Binomial Theorem in Probability

Binomial Expansion

The Binomial Theorem expands expressions of the form (x + y)^n:

(x + y)^n = \sum_{k=0}^{n} C(n, k) \cdot x^{n-k} \cdot y^k

Each term uses a binomial coefficient C(n, k), which you already know from the combinations formula. Pascal's Triangle gives you these same coefficients.

Example: Expand (2x - 3)^4. Substitute 2x for the first term, -3 for the second term, and n = 4. The first few terms:

  • k = 0: C(4,0)(2x)^4(-3)^0 = 1 \cdot 16x^4 \cdot 1 = 16x^4
  • k = 1: C(4,1)(2x)^3(-3)^1 = 4 \cdot 8x^3 \cdot (-3) = -96x^3
  • Continue for k = 2, 3, 4 to get the full expansion.
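The remaining coefficients are easy to generate with `math.comb`, mirroring the term-by-term pattern above:

```python
from math import comb

# Expand (2x - 3)^4: each term is C(4, k) * (2x)^(4-k) * (-3)^k
n, a, b = 4, 2, -3
coeffs = [comb(n, k) * a**(n - k) * b**k for k in range(n + 1)]

# Coefficients of x^4, x^3, x^2, x^1, x^0
print(coeffs)  # → [16, -96, 216, -216, 81]
```

So the full expansion is 16x^4 - 96x^3 + 216x^2 - 216x + 81.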

Binomial Probability Formula

This is where the binomial theorem connects to probability. If you repeat an experiment n times, each trial is independent, and each trial has success probability p, the probability of exactly k successes is:

P(X = k) = C(n, k) \cdot p^k \cdot (1 - p)^{n - k}

  • n = number of trials
  • k = number of successes you want
  • p = probability of success on a single trial
  • (1 - p) = probability of failure on a single trial

Example: What's the probability of getting exactly 3 heads in 5 coin flips?

P(X = 3) = C(5, 3) \cdot (0.5)^3 \cdot (0.5)^2 = 10 \cdot 0.125 \cdot 0.25 = 0.3125

The C(n, k) piece counts how many different arrangements of 3 heads and 2 tails exist. The p^k(1-p)^{n-k} piece gives the probability of any single such arrangement.
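The formula translates directly into a one-line function:

```python
from math import comb

def binomial_pmf(n: int, k: int, p: float) -> float:
    """P(exactly k successes in n independent trials with success prob p)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Exactly 3 heads in 5 fair coin flips
print(binomial_pmf(5, 3, 0.5))  # → 0.3125
```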

Binomial Distribution Properties

For a binomial distribution with n trials and success probability p:

  • Mean (expected value): \mu = n \cdot p
  • Standard deviation: \sigma = \sqrt{n \cdot p \cdot (1 - p)}

The mean tells you the average number of successes you'd expect. The standard deviation tells you how much variation to expect around that average.

Cumulative probability is the probability of getting at most k successes:

P(X \leq k) = \sum_{i=0}^{k} C(n, i) \cdot p^i \cdot (1 - p)^{n - i}

You calculate each individual P(X = i) for i = 0, 1, 2, \ldots, k and add them up.

Example: A factory has a 10% defect rate. In a batch of 10 items, what's the probability of at most 2 defective items?

P(X \leq 2) = P(X=0) + P(X=1) + P(X=2)

  • P(X=0) = C(10,0)(0.1)^0(0.9)^{10} \approx 0.3487
  • P(X=1) = C(10,1)(0.1)^1(0.9)^9 \approx 0.3874
  • P(X=2) = C(10,2)(0.1)^2(0.9)^8 \approx 0.1937

P(X \leq 2) \approx 0.9298

So there's about a 93% chance of 2 or fewer defective items in the batch.
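The whole factory calculation, plus the mean and standard deviation formulas from this section, fits in a few lines:

```python
from math import comb, sqrt

def binomial_pmf(n: int, k: int, p: float) -> float:
    """P(exactly k successes in n independent trials)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

n, p = 10, 0.10  # batch size, defect rate

# Cumulative: P(X <= 2) = P(X=0) + P(X=1) + P(X=2)
p_at_most_2 = sum(binomial_pmf(n, i, p) for i in range(3))
print(round(p_at_most_2, 4))  # → 0.9298

# Distribution properties: mean n*p and std dev sqrt(n*p*(1-p))
mean = n * p
std = sqrt(n * p * (1 - p))
print(mean, round(std, 4))  # → 1.0 0.9487
```

The mean of 1 defective item per batch matches intuition for a 10% defect rate, and the cumulative sum confirms the hand calculation above.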