🎲 Intro to Statistics Review

Least-Squares Line

Written by the Fiveable Content Team • Last updated September 2025

Definition

A least-squares line, also known as the line of best fit, is a straight line that minimizes the sum of the squared differences between observed values and the values predicted by the line. It is used in linear regression to model the relationship between two variables.
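In symbols, for data points $(x_1, y_1), \ldots, (x_n, y_n)$, the least-squares line is the choice of intercept $b_0$ and slope $b_1$ that minimizes the sum of squared residuals:

$$\min_{b_0,\, b_1} \; \sum_{i=1}^{n} \bigl(y_i - (b_0 + b_1 x_i)\bigr)^2$$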

5 Must Know Facts For Your Next Test

  1. The equation of a least-squares line is typically written as $\hat{y} = b_0 + b_1x$, where $\hat{y}$ represents predicted values, $b_0$ is the y-intercept, and $b_1$ is the slope.
  2. The least-squares method minimizes the sum of squared residuals, which are the vertical distances from each data point to the line.
  3. The slope ($b_1$) indicates how much $\hat{y}$ changes for a one-unit change in $x$.
  4. The intercept ($b_0$) represents the predicted value of $y$ when $x = 0$.
  5. The slope and intercept are calculated with the formulas $b_1 = \frac{n(\sum xy) - (\sum x)(\sum y)}{n(\sum x^2) - (\sum x)^2}$ and $b_0 = \bar{y} - b_1\bar{x}$, as shown in the sketch after this list.
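
As a concrete illustration, here is a minimal Python sketch (not part of the original material) that applies the two formulas above to a small made-up data set; the variable names and sample values are assumptions chosen only for demonstration.

```python
# Hypothetical example data (illustration only).
x = [1, 2, 3, 4, 5]
y = [2.1, 3.9, 6.2, 7.8, 10.1]
n = len(x)

# Summations used in the slope formula.
sum_x = sum(x)
sum_y = sum(y)
sum_xy = sum(xi * yi for xi, yi in zip(x, y))
sum_x2 = sum(xi ** 2 for xi in x)

# Slope: b1 = [n(Σxy) - (Σx)(Σy)] / [n(Σx²) - (Σx)²]
b1 = (n * sum_xy - sum_x * sum_y) / (n * sum_x2 - sum_x ** 2)

# Intercept: b0 = ȳ - b1·x̄
b0 = (sum_y / n) - b1 * (sum_x / n)

print(f"ŷ = {b0:.3f} + {b1:.3f}x")
```

For these made-up values the script should print approximately ŷ = 0.050 + 1.990x, which you can verify by plugging the sums into the formulas by hand.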

Review Questions