Written by the Fiveable Content Team • Last updated September 2025
Definition
Intervals are ranges of values used to group data points in histograms, frequency polygons, and time series graphs. They help simplify complex datasets by categorizing data into manageable segments.
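As a concrete illustration, here is a minimal Python sketch (the dataset and interval boundaries are hypothetical, not from the source) showing how a set of mutually exclusive, collectively exhaustive intervals groups raw data points into the frequency counts a histogram would plot:

```python
# Minimal sketch: grouping hypothetical data into equal-width intervals.
data = [2, 5, 7, 12, 15, 18, 21, 23, 27, 29]

interval_width = 10  # each interval covers [lower, lower + 10)

# Count how many data points fall in each interval; the intervals are
# mutually exclusive (no value lands in two) and collectively
# exhaustive (every value lands in exactly one).
counts = {}
for x in data:
    lower = (x // interval_width) * interval_width
    counts[lower] = counts.get(lower, 0) + 1

for lower in sorted(counts):
    print(f"[{lower}, {lower + interval_width}): {counts[lower]}")
# [0, 10): 3
# [10, 20): 3
# [20, 30): 4
```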
5 Must Know Facts For Your Next Test
Intervals must be mutually exclusive and collectively exhaustive so that every data point falls into exactly one interval.
The choice of interval width can significantly affect the appearance and interpretation of a histogram or frequency polygon, as the first sketch after this list illustrates.
Equal-width intervals are most common, but unequal intervals are sometimes necessary to represent a data distribution more faithfully.
In time series graphs, intervals often represent regular time periods such as days, months, or years.
Determining the number of intervals often relies on rules of thumb such as Sturges' Rule (k = 1 + log₂ n) or the square-root choice (k = √n); both are sketched after this list.
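To see the interval-width effect from the second fact above, this sketch (hypothetical data) bins the same six values twice: a wide interval merges two clusters into a single bar, while a narrower width exposes the gap between them.

```python
# Hypothetical sketch: the same data, two interval widths, two "shapes".
from collections import Counter

data = [1, 2, 3, 7, 8, 9]  # two clusters separated by a gap around 4-6

def frequencies(values, width):
    """Frequency of each equal-width interval [lower, lower + width)."""
    return dict(sorted(Counter((v // width) * width for v in values).items()))

print(frequencies(data, 10))  # {0: 6} -- one bar; the two clusters are hidden
print(frequencies(data, 2))   # {0: 1, 2: 2, 6: 1, 8: 2} -- gap at [4, 6) visible
```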
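And for the last fact, a small sketch of the two bin-count rules, using the standard formulas (Sturges' Rule: k = 1 + log₂ n; square-root choice: k = √n, each rounded up to a whole number of intervals):

```python
# Sketch of two common rules for choosing the number of intervals.
import math

def sturges_bins(n):
    """Sturges' Rule: k = 1 + log2(n), rounded up."""
    return math.ceil(1 + math.log2(n))

def sqrt_choice_bins(n):
    """Square-root choice: k = sqrt(n), rounded up."""
    return math.ceil(math.sqrt(n))

n = 100  # hypothetical sample size
print(sturges_bins(n))      # 8   (1 + log2(100) ~= 7.64)
print(sqrt_choice_bins(n))  # 10
```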