Forecasting is the heart of time series analysis—it's why we study patterns, stationarity, and autocorrelation in the first place. You're being tested not just on how each method works, but on when to apply it. The key decision points involve understanding your data's characteristics: Does it have a trend? Seasonality? Is it stationary? Each forecasting method makes different assumptions, and matching the right tool to the right data structure is what separates a mediocre forecast from an accurate one.
These methods build on each other conceptually. Simple approaches like moving averages lay the groundwork for understanding how we weight past observations, while ARIMA models combine multiple techniques into a flexible framework. When you encounter exam questions, don't just memorize formulas—know what data characteristics each method handles best and what assumptions it requires. That's how you'll tackle both multiple choice and free-response questions with confidence.
These methods reduce noise in your data by averaging or weighting past observations. The core principle is that random fluctuations cancel out when you combine multiple observations, revealing the underlying signal.
Compare: Moving Average vs. Exponential Smoothing—both smooth out noise, but MA uses a fixed window while exponential smoothing uses all past data with decaying weights. If asked which responds faster to sudden changes, exponential smoothing wins when the smoothing parameter α is high.
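To see the difference concretely, here is a minimal sketch using pandas on a synthetic series with a sudden level jump; the window size of 10 and α = 0.6 are illustrative choices, not recommended settings.

```python
# A minimal sketch (assumes pandas and numpy; window size and alpha are illustrative).
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
# Noisy level series with a sudden jump at t = 60, to compare responsiveness.
y = pd.Series(np.r_[np.full(60, 50.0), np.full(40, 70.0)] + rng.normal(0, 3, 100))

ma = y.rolling(window=10).mean()             # fixed window: equal weights on the last 10 points
ses = y.ewm(alpha=0.6, adjust=False).mean()  # exponential: weights decay by a factor of (1 - alpha)

# With a high alpha, the exponentially weighted average adapts to the jump faster.
print(ma.iloc[-1].round(1), ses.iloc[-1].round(1))
```

Plotting both smoothed series against the raw data makes the trade-off visible: the moving average lags the jump by roughly half its window, while the exponential average closes most of the gap within a few observations.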
These methods model the relationship between current values and past values directly. The underlying assumption is that time series exhibit persistence—what happened recently influences what happens next.
Compare: ARIMA vs. SARIMA—both handle trends through differencing, but only SARIMA explicitly models repeating seasonal patterns. On an FRQ asking you to forecast monthly retail sales with clear December spikes, SARIMA is your answer.
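As a rough illustration of how the two are specified in practice, the sketch below fits both with statsmodels' SARIMAX class on a synthetic monthly series; the (1, 1, 1) and (1, 1, 1, 12) orders are assumptions for demonstration, not tuned values.

```python
# A minimal sketch assuming statsmodels; orders and the synthetic data are illustrative.
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(1)
idx = pd.date_range("2015-01-01", periods=96, freq="MS")
# Synthetic monthly sales: upward trend + 12-month seasonal cycle + noise.
trend = np.linspace(100, 160, 96)
season = 15 * np.sin(2 * np.pi * np.arange(96) / 12)
y = pd.Series(trend + season + rng.normal(0, 4, 96), index=idx)

# Non-seasonal ARIMA(1, 1, 1): differencing handles the trend only.
arima = SARIMAX(y, order=(1, 1, 1)).fit(disp=False)

# SARIMA(1, 1, 1)(1, 1, 1)_12: adds seasonal AR/MA terms and seasonal differencing.
sarima = SARIMAX(y, order=(1, 1, 1), seasonal_order=(1, 1, 1, 12)).fit(disp=False)

print(arima.forecast(12).round(1))
print(sarima.forecast(12).round(1))  # only this forecast reproduces the 12-month cycle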
These approaches break complex time series into interpretable components. The principle is that observed data equals the combination of systematic patterns (trend, seasonality) plus random noise.
Compare: Additive vs. Multiplicative Decomposition—both extract the same components, but the relationship differs. If December sales are always $10,000 above average, use additive. If December is always 20% above average, use multiplicative.
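The sketch below, assuming statsmodels' seasonal_decompose and a synthetic monthly series whose seasonal swing grows with the level, shows how the model argument switches between the two forms.

```python
# A minimal sketch assuming statsmodels; period=12 and the synthetic series are illustrative.
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

rng = np.random.default_rng(2)
idx = pd.date_range("2016-01-01", periods=72, freq="MS")
level = np.linspace(50, 150, 72)
season = 1 + 0.2 * np.sin(2 * np.pi * np.arange(72) / 12)  # roughly +/-20% swing
y = pd.Series(level * season + rng.normal(0, 2, 72), index=idx)

# Additive: observed = trend + seasonal + residual (constant-size swings).
add = seasonal_decompose(y, model="additive", period=12)
# Multiplicative: observed = trend * seasonal * residual (proportional swings).
mult = seasonal_decompose(y, model="multiplicative", period=12)

# Because this series' seasonal swing scales with its level, the multiplicative
# fit leaves smaller, more stable residuals than the additive one.
print(mult.seasonal.head(12).round(3))
```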
These methods explicitly model both trend and seasonality for direct forecasting. They extend smoothing concepts to handle the complexity of real-world business and economic data.
Compare: Holt-Winters vs. Prophet—both handle trend and seasonality, but Holt-Winters requires cleaner data and manual parameter selection, while Prophet automates much of the process and handles irregularities. For exam purposes, know Holt-Winters mechanics; for applied projects, Prophet often wins.
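For the Holt-Winters mechanics, here is a minimal sketch with statsmodels' ExponentialSmoothing; the additive trend, multiplicative seasonality, and 12-month seasonal period are illustrative assumptions rather than requirements.

```python
# A minimal sketch assuming statsmodels; trend/seasonal settings are illustrative.
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

rng = np.random.default_rng(3)
idx = pd.date_range("2017-01-01", periods=60, freq="MS")
# Trending series whose seasonal swing grows with the level.
y = pd.Series(
    np.linspace(200, 320, 60) * (1 + 0.15 * np.sin(2 * np.pi * np.arange(60) / 12))
    + rng.normal(0, 5, 60),
    index=idx,
)

# Holt-Winters: level + additive trend + multiplicative 12-month seasonality.
hw = ExponentialSmoothing(
    y, trend="add", seasonal="mul", seasonal_periods=12
).fit()  # smoothing parameters are estimated from the data

print(hw.forecast(12).round(1))  # 12-month-ahead forecast with trend and seasonality
```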
These methods frame forecasting as a relationship between variables. The core idea is that time series values depend on predictable factors that can be modeled explicitly.
Compare: Regression vs. ARIMA—regression explicitly models relationships with external variables, while ARIMA models internal dynamics of the series. Use regression when you have meaningful predictors; use ARIMA when the series history is your best information.
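The sketch below contrasts the two framings on synthetic data; the ad_spend predictor is a hypothetical external variable invented purely for illustration.

```python
# A minimal sketch assuming statsmodels; "ad_spend" is a hypothetical predictor.
import numpy as np
import statsmodels.api as sm
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(4)
n = 80
ad_spend = rng.uniform(10, 50, n)                    # known external driver
sales = 5.0 + 2.0 * ad_spend + rng.normal(0, 3, n)   # sales depend on spend

# Regression: model sales as a function of a predictor you can observe or plan.
X = sm.add_constant(ad_spend)
reg = sm.OLS(sales, X).fit()
future_spend = sm.add_constant(np.array([30.0, 45.0]))
print(reg.predict(future_spend).round(1))

# ARIMA-style model: ignore predictors and model the series' own dynamics instead.
arima = SARIMAX(sales, order=(1, 0, 0), trend="c").fit(disp=False)
print(arima.forecast(2).round(1))
```

If you knew future values of the external driver, you could also combine the two ideas by passing it through SARIMAX's exog argument, which is essentially what "Prophet (with regressors)" in the table below refers to.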
| Concept | Best Examples |
|---|---|
| Smoothing without trend/seasonality | Moving Average, Simple Exponential Smoothing |
| Smoothing with trend | Double Exponential Smoothing, Holt's Method |
| Smoothing with trend + seasonality | Holt-Winters, Triple Exponential Smoothing |
| Autoregressive modeling | AR Models, ARIMA |
| Seasonal pattern modeling | SARIMA, Holt-Winters, Prophet |
| Decomposition | Additive Decomposition, Multiplicative Decomposition |
| External variable forecasting | Regression Analysis, Prophet (with regressors) |
| Handling messy real-world data | Prophet |
Which two methods both use weighted combinations of past observations but differ in how they assign weights—and when would you prefer one over the other?
You're given a time series with an upward trend and seasonal spikes every 12 months. Which methods from this guide could handle both features, and what parameters would you need to specify?
Compare and contrast ARIMA and regression for time series forecasting. What assumption does regression make that ARIMA doesn't, and how might this cause problems?
If decomposition reveals that seasonal fluctuations grow proportionally larger as the series level increases, which decomposition type should you use—and which Holt-Winters variant matches this pattern?
An FRQ presents quarterly GDP data with a clear trend and asks you to specify an appropriate SARIMA model. What would the seasonal period be, and how would you determine whether seasonal differencing (D) is needed?