Written by the Fiveable Content Team • Last updated September 2025
Definition
Sum of Squared Errors (SSE) measures the total deviation of observed values from the values predicted by a regression model. It is calculated by summing the squared differences between observed and predicted values.
5 Must Know Facts For Your Next Test
SSE is used to assess the fit of a regression model; lower SSE indicates a better fit.
The formula for SSE is $$ \sum_{i=1}^{n} (y_i - \hat{y}_i)^2 $$, where $y_i$ are the observed values and $\hat{y}_i$ are the predicted values.
SSE is always non-negative because it sums squared terms.
It plays a crucial role in calculating other key statistics, such as Mean Squared Error (MSE) and R-squared.
SSE can be compared across different models to determine which one better fits the data.
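The formula above can be sketched in a few lines of NumPy; the observed and predicted values here are made-up illustrative data, not from any real model.

```python
import numpy as np

# Illustrative observed values and model predictions (hypothetical data)
y = np.array([3.0, 5.0, 7.0, 9.0])       # observed y_i
y_hat = np.array([2.5, 5.5, 6.5, 9.5])   # predicted y-hat_i

# SSE: sum of squared residuals (observed minus predicted)
sse = np.sum((y - y_hat) ** 2)
print(sse)  # 1.0
```

Each residual here is ±0.5, so SSE = 4 × 0.25 = 1.0; squaring keeps every term non-negative and penalizes large deviations more heavily than small ones.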
Related terms
Mean Squared Error (MSE): Mean Squared Error (MSE) is the average of the squared differences between observed and predicted values. It is calculated as $$ MSE = \frac{SSE}{n} $$, where $n$ is the number of observations.
$R^2$ (R-squared): $R^2$ measures the proportion of variance in the dependent variable that is predictable from the independent variables. It is computed from SSE as $$ R^2 = 1 - \frac{SSE}{SST} $$, where $SST$ is the total sum of squares. It ranges from 0 to 1, with higher values indicating better model fit.
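Both related statistics follow directly from SSE; a minimal sketch, reusing the same hypothetical data as above:

```python
import numpy as np

# Same illustrative data as before (hypothetical, not from a real model)
y = np.array([3.0, 5.0, 7.0, 9.0])
y_hat = np.array([2.5, 5.5, 6.5, 9.5])

sse = np.sum((y - y_hat) ** 2)        # sum of squared errors
mse = sse / len(y)                    # MSE = SSE / n
sst = np.sum((y - y.mean()) ** 2)     # total sum of squares
r_squared = 1 - sse / sst             # R^2 = 1 - SSE/SST

print(mse)        # 0.25
print(r_squared)  # 0.95
```

With SSE = 1.0 over n = 4 observations, MSE = 0.25; SST = 20, giving R² = 0.95, i.e. the model explains 95% of the variance in this toy example.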