📊Honors Statistics Unit 11 Review

11.8 Lab 2: Chi-Square Test of Independence

Written by the Fiveable Content Team • Last updated August 2025
Chi-Square Test of Independence

The chi-square test of independence determines whether two categorical variables are associated with each other. For example, you might ask: Is there a relationship between gender and product preference? The test works by comparing the frequencies you actually observe in your data to the frequencies you'd expect if the two variables were completely unrelated.

This lab builds on the chi-square concepts from earlier in the unit, so make sure you're comfortable with observed vs. expected frequencies and how the chi-square statistic works before diving in.


Chi-square test for independence

This test evaluates whether a significant association exists between two categorical variables.

  • Null hypothesis ($H_0$): No association between the two categorical variables (they are independent)
  • Alternative hypothesis ($H_a$): An association exists between the two categorical variables (they are not independent)

Conducting the test involves these steps:

  1. Build a contingency table with observed frequencies for each combination of the two categorical variables.
  2. Calculate expected frequencies for each cell, assuming no association between the variables:

$$\text{Expected frequency} = \frac{\text{row total} \times \text{column total}}{\text{grand total}}$$

  3. Calculate the chi-square test statistic:

$$\chi^2 = \sum \frac{(O - E)^2}{E}$$

where $O$ = observed frequency and $E$ = expected frequency for each cell.

  4. Find the degrees of freedom:

$$df = (r - 1)(c - 1)$$

where $r$ = number of rows and $c$ = number of columns.

  5. Compare the calculated $\chi^2$ value to the critical value from the chi-square distribution table at your chosen significance level ($\alpha$, typically 0.05). Alternatively, compare the p-value directly to $\alpha$.
  6. Make your decision: If the calculated $\chi^2$ exceeds the critical value (or if $p < \alpha$), reject $H_0$ and conclude that a significant association exists.
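The six steps above can be sketched in Python with NumPy and SciPy. The counts below are hypothetical (a made-up gender × product-preference table, not data from the lab); `scipy.stats.chi2` supplies the critical value and p-value.

```python
import numpy as np
from scipy import stats

# Step 1: hypothetical observed counts, gender (rows) x product preference (columns)
observed = np.array([[30, 10, 20],
                     [20, 25, 15]])

# Step 2: expected frequencies under independence
row_totals = observed.sum(axis=1, keepdims=True)
col_totals = observed.sum(axis=0, keepdims=True)
grand_total = observed.sum()
expected = row_totals * col_totals / grand_total

# Step 3: chi-square statistic, summed over every cell
chi2 = ((observed - expected) ** 2 / expected).sum()

# Step 4: degrees of freedom, (r - 1)(c - 1)
df = (observed.shape[0] - 1) * (observed.shape[1] - 1)

# Steps 5-6: p-value, critical value at alpha = 0.05, and the decision
p_value = stats.chi2.sf(chi2, df)        # survival function = 1 - CDF
critical = stats.chi2.ppf(0.95, df)      # critical value at alpha = 0.05
reject = chi2 > critical
```

In practice `scipy.stats.chi2_contingency(observed)` performs steps 2 through 5 in a single call; spelling the steps out as above mirrors the hand calculation the lab asks for.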

This test is a form of contingency analysis, which broadly refers to examining relationships between categorical variables arranged in a table.


Interpretation of chi-square results

The p-value is the probability of getting a chi-square statistic as extreme as (or more extreme than) the one you calculated, assuming $H_0$ is true.

  • A small p-value (typically < 0.05) provides strong evidence against $H_0$, suggesting the two variables are associated. For instance, you might find a significant association between gender and preference for a particular product.
  • A large p-value (≥ 0.05) means you don't have enough evidence to reject $H_0$. The data are consistent with the variables being independent. For example, a study might find no significant association between age group and political affiliation.

Degrees of freedom ($df$) reflect the size of your contingency table. A 2×2 table has $df = 1$; a 3×4 table has $df = 6$. The $df$ value determines which chi-square distribution you use to find your critical value. Higher $df$ means a larger table and a potentially more complex relationship to evaluate.
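As a quick check, SciPy's chi-square quantile function reproduces the familiar table critical values for those two cases (a sketch; `ppf` is the inverse CDF, so `ppf(0.95, df)` gives the critical value at $\alpha = 0.05$):

```python
from scipy import stats

# Critical values at alpha = 0.05 for the two table sizes mentioned above
for rows, cols in [(2, 2), (3, 4)]:
    df = (rows - 1) * (cols - 1)
    critical = stats.chi2.ppf(0.95, df)   # inverse CDF at 1 - alpha
    print(df, round(critical, 3))         # df = 1 -> 3.841, df = 6 -> 12.592
```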

Limitations of chi-square tests

Assumptions you need to meet:

  • Random sampling: Data should come from a random sample of the population of interest.
  • Independence of observations: Each observation must be independent of the others (no repeated measures on the same subject).
  • Minimum expected frequency: Every cell in the contingency table should have an expected frequency of at least 5. If this is violated, you can combine categories (e.g., merge similar age groups) or use an alternative test like Fisher's exact test.
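The expected-frequency check is easy to automate. The sketch below uses a hypothetical small-count 2×2 table; `scipy.stats.chi2_contingency` returns the expected frequencies, and when any falls below 5 the code falls back to Fisher's exact test as suggested above.

```python
import numpy as np
from scipy import stats

# Hypothetical 2x2 table with small counts
observed = np.array([[3, 7],
                     [8, 2]])

# chi2_contingency also returns the expected frequencies under independence
chi2, p, df, expected = stats.chi2_contingency(observed, correction=False)

if (expected < 5).any():
    # Minimum-expected-frequency assumption violated:
    # fall back to Fisher's exact test (valid for 2x2 tables)
    odds_ratio, p = stats.fisher_exact(observed)
```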

Key limitations to keep in mind:

  • The test only tells you whether an association exists. It does not tell you how strong or in what direction the relationship goes. To measure strength, use something like Cramér's V or the phi coefficient.
  • The test is sensitive to sample size. With very large samples, even trivially small associations can come back as statistically significant, so always consider practical significance alongside statistical significance.
  • The test does not account for confounding variables that might be driving the apparent relationship. More advanced methods like logistic regression can help control for confounders.
  • As a nonparametric test, it does not assume the data follow any particular distribution (like the normal distribution), which makes it flexible but also limited in what it can tell you.
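To illustrate measuring strength of association, here is a minimal Cramér's V computation from the standard formula $V = \sqrt{\chi^2 / (n \cdot \min(r-1, c-1))}$. The table is hypothetical; recent SciPy versions also provide `scipy.stats.contingency.association` for the same purpose.

```python
import numpy as np
from scipy import stats

# Hypothetical 2x3 contingency table
observed = np.array([[30, 10, 20],
                     [20, 25, 15]])
chi2, p, df, expected = stats.chi2_contingency(observed, correction=False)

n = observed.sum()                   # total sample size
k = min(observed.shape) - 1          # min(rows, cols) - 1
cramers_v = np.sqrt(chi2 / (n * k))  # ranges from 0 (no assoc.) to 1
```

A significant p-value with a small V is exactly the large-sample situation described above: statistically detectable, but practically weak.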

Statistical Inference and Goodness of Fit

Chi-square tests fall under the broader umbrella of statistical inference, where you use sample data to draw conclusions about a population. The test of independence is closely related to the chi-square goodness-of-fit test, but they answer different questions. The goodness-of-fit test compares observed frequencies to expected frequencies for a single categorical variable, while the test of independence examines the relationship between two categorical variables.
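The contrast shows up directly in code: `scipy.stats.chisquare` handles the one-variable goodness-of-fit case (the counts below are made up), while `chi2_contingency` handles the two-variable table.

```python
from scipy import stats

# Goodness of fit: one categorical variable against hypothesized counts
observed = [44, 56, 50]   # hypothetical counts in three categories
expected = [50, 50, 50]   # e.g., equal proportions expected
chi2, p = stats.chisquare(observed, f_exp=expected)
# chi2 = (36 + 36 + 0) / 50 = 1.44, with df = 3 - 1 = 2
```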