
Sum of squares

from class:

Intro to Statistics

Definition

The sum of squares (SS) is a measure of the total variability within a data set. It quantifies how much individual data points deviate from the mean by summing the squared deviation of each observation from the mean.
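The definition above translates directly into a few lines of code. Here is a minimal sketch in plain Python (the function name is illustrative, not from the source):

```python
def sum_of_squares(data):
    """Sum of squared deviations of each observation from the sample mean."""
    mean = sum(data) / len(data)
    return sum((x - mean) ** 2 for x in data)

# Example: data = [2, 4, 6, 8], mean = 5, deviations = -3, -1, 1, 3
print(sum_of_squares([2, 4, 6, 8]))  # → 20.0
```

The squaring step makes every deviation contribute positively, so values far from the mean dominate the total.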

congrats on reading the definition of sum of squares. now let's actually learn it.


5 Must Know Facts For Your Next Test

  1. In ANOVA, the Total Sum of Squares (SST) is partitioned into the Between-Group Sum of Squares (SSB) and Within-Group Sum of Squares (SSW).
  2. The formula for calculating SST is $SST = \sum_{i=1}^{n} (X_i - \bar{X})^2$, where $X_i$ are individual observations and $\bar{X}$ is the overall mean.
  3. SSB measures the variation due to differences between group means and is calculated as $SSB = \sum_{j=1}^{k} n_j(\bar{X}_j - \bar{X})^2$, where $n_j$ are sample sizes and $\bar{X}_j$ are group means.
  4. SSW, or residual sum of squares, measures the variation within each group and is calculated as $SSW = \sum_{j=1}^{k} \sum_{i=1}^{n_j} (X_{ij} - \bar{X}_j)^2$.
  5. In the context of the F-ratio, SS values are divided by their respective degrees of freedom to obtain Mean Squares, and the F-statistic is the ratio of the between-group Mean Square to the within-group Mean Square.
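The partition in facts 1–4 and the F-ratio in fact 5 can be checked numerically. A minimal sketch in plain Python, assuming equal-length formulas as given above (function names are illustrative, not from the source):

```python
def ss(data, center):
    """Sum of squared deviations of data about a given center."""
    return sum((x - center) ** 2 for x in data)

def anova_partition(groups):
    """Return (SST, SSB, SSW, F) for a list of groups of observations."""
    all_obs = [x for g in groups for x in g]
    grand_mean = sum(all_obs) / len(all_obs)
    group_means = [sum(g) / len(g) for g in groups]

    sst = ss(all_obs, grand_mean)                             # total
    ssb = sum(len(g) * (m - grand_mean) ** 2
              for g, m in zip(groups, group_means))           # between groups
    ssw = sum(ss(g, m) for g, m in zip(groups, group_means))  # within groups

    k = len(groups)        # number of groups
    n = len(all_obs)       # total number of observations
    msb = ssb / (k - 1)    # mean square between
    msw = ssw / (n - k)    # mean square within
    return sst, ssb, ssw, msb / msw

sst, ssb, ssw, f = anova_partition([[1, 2, 3], [2, 3, 4], [6, 7, 8]])
print(sst, ssb, ssw, f)  # → 48.0 42.0 6.0 21.0
```

Note that SST = SSB + SSW (48 = 42 + 6), which is exactly the partition described in fact 1.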

Review Questions

  • What does the sum of squares measure in a data set?
  • How is the Total Sum of Squares partitioned in ANOVA?
  • What are the components needed to calculate Between-Group Sum of Squares?
© 2024 Fiveable Inc. All rights reserved.