The average rate of change of a function over an interval $[a, b]$ is the change in the function's output divided by the change in the input, written as $\frac{f(b) - f(a)}{b - a}$. It measures how much the function's output changes, on average, per unit increase in input over that interval.
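As a quick illustrative example (the function and interval here are chosen only for demonstration), take $f(x) = x^2$ on the interval $[1, 3]$:

$$
\frac{f(3) - f(1)}{3 - 1} = \frac{9 - 1}{2} = 4.
$$

So on average, the output of $f$ increases by 4 units for each 1-unit increase in $x$ across that interval, even though the instantaneous rate of change varies from point to point.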