
Intro to Time Series

Key Forecasting Methods


Why This Matters

Forecasting is the heart of time series analysis—it's why we study patterns, stationarity, and autocorrelation in the first place. You're being tested not just on how each method works, but on when to apply it. The key decision points involve understanding your data's characteristics: Does it have a trend? Seasonality? Is it stationary? Each forecasting method makes different assumptions, and matching the right tool to the right data structure is what separates a mediocre forecast from an accurate one.

These methods build on each other conceptually. Simple approaches like moving averages lay the groundwork for understanding how we weight past observations, while ARIMA models combine multiple techniques into a flexible framework. When you encounter exam questions, don't just memorize formulas—know what data characteristics each method handles best and what assumptions it requires. That's how you'll tackle both multiple choice and free-response questions with confidence.


Smoothing-Based Methods

These methods reduce noise in your data by averaging or weighting past observations. The core principle is that random fluctuations cancel out when you combine multiple observations, revealing the underlying signal.

Moving Average (MA)

  • Averages a fixed window of past observations—the window size k determines how much smoothing occurs (larger windows = smoother but slower to react)
  • Simple MA uses equal weights; weighted MA assigns greater importance to recent observations, improving responsiveness
  • Best for stationary data without strong trends—the method lags behind trend changes because it treats all windowed observations similarly
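The trailing moving average described above can be sketched in a few lines of plain Python (a minimal illustration, not a production implementation—function names are my own):

```python
def smooth(series, k):
    """Trailing moving average: entry i averages series[i-k+1 .. i]."""
    return [sum(series[i - k + 1:i + 1]) / k for i in range(k - 1, len(series))]

def ma_forecast(series, k):
    """One-step-ahead forecast: the mean of the last k observations."""
    return sum(series[-k:]) / k
```

Note that `smooth` returns a series shorter than the input by k − 1 points—the first full window only closes at index k − 1, which is one visible form of the lag mentioned above.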

Exponential Smoothing

  • Applies exponentially decreasing weights to all past observations using smoothing parameter α—recent data matters most, but older data still contributes
  • Simple exponential smoothing works for data with no trend or seasonality; double adds trend; triple adds seasonality
  • More adaptive than moving average because it responds quickly to changes while maintaining memory of the entire series history
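Simple exponential smoothing reduces to a one-line recursive update on a single "level" value. A minimal sketch (initializing the level at the first observation, which is one common choice):

```python
def ses_forecast(series, alpha):
    """Simple exponential smoothing: level = alpha*y + (1-alpha)*level.
    The one-step-ahead forecast is the final level."""
    level = series[0]                     # initialize level at the first observation
    for y in series[1:]:
        level = alpha * y + (1 - alpha) * level
    return level
```

Unrolling the recursion shows why the weights are "exponential": the observation j steps back receives weight α(1 − α)^j, so every past value contributes, but with geometrically shrinking influence.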

Compare: Moving Average vs. Exponential Smoothing—both smooth out noise, but MA uses a fixed window while exponential smoothing uses all past data with decaying weights. If asked which responds faster to sudden changes, exponential smoothing wins when α is high.


Autoregressive Frameworks

These methods model the relationship between current values and past values directly. The underlying assumption is that time series exhibit persistence—what happened recently influences what happens next.

Autoregressive (AR) Models

  • Predicts future values as a linear combination of p past values—the equation is Y_t = c + φ_1·Y_{t-1} + φ_2·Y_{t-2} + ... + φ_p·Y_{t-p} + ε_t
  • Model order p is identified using the PACF (partial autocorrelation function)—significant spikes indicate how many lags to include
  • Requires stationarity—if your data has trends or changing variance, you must transform it first
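For the simplest case, AR(1), the coefficients can be estimated by ordinary least squares on (Y_{t-1}, Y_t) pairs. A minimal sketch (higher orders need multiple regression; function names are my own):

```python
def fit_ar1(series):
    """Fit Y_t = c + phi * Y_{t-1} by ordinary least squares."""
    x, y = series[:-1], series[1:]        # lagged values vs. current values
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    phi = (sum((a - mx) * (b - my) for a, b in zip(x, y))
           / sum((a - mx) ** 2 for a in x))
    c = my - phi * mx
    return c, phi

def ar1_forecast(series, c, phi):
    """One-step-ahead forecast from the fitted AR(1)."""
    return c + phi * series[-1]
```

On a series generated exactly by Y_t = 1 + 0.5·Y_{t-1}, the fit recovers c = 1 and φ = 0.5, and the forecast extrapolates that relation one step forward.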

Autoregressive Integrated Moving Average (ARIMA)

  • Combines AR and MA components with differencing—the (p, d, q) parameters specify autoregressive order, differencing order, and moving average order respectively
  • Differencing (d) transforms non-stationary data into stationary data by computing changes between consecutive observations
  • The workhorse model for non-seasonal trending data—flexible enough to capture many real-world patterns when properly specified
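Differencing itself is a tiny operation, and seeing it in code makes the d parameter concrete (a minimal sketch):

```python
def difference(series, d=1):
    """Apply d rounds of first differencing: y'_t = y_t - y_{t-1}.
    Each round shortens the series by one observation."""
    for _ in range(d):
        series = [b - a for a, b in zip(series, series[1:])]
    return series
```

A linear trend becomes a constant after one round (d = 1), and a quadratic trend becomes a constant after two (d = 2)—which is exactly why d is chosen to match the order of the trend.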

Seasonal ARIMA (SARIMA)

  • Extends ARIMA with seasonal parameters (P, D, Q)_s—captures patterns that repeat at fixed intervals (monthly, quarterly, yearly)
  • Seasonal differencing removes seasonal patterns just as regular differencing removes trends
  • Full notation is ARIMA(p, d, q)(P, D, Q)_s—exam questions often test whether you can identify which parameters address which data features
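Seasonal differencing works the same way as regular differencing, except the subtraction reaches back a full season of s steps rather than one (a minimal sketch):

```python
def seasonal_difference(series, s):
    """Seasonal differencing: y'_t = y_t - y_{t-s}.
    Any pattern that repeats every s steps cancels out exactly."""
    return [series[i] - series[i - s] for i in range(s, len(series))]
```

On a purely seasonal series the result is all zeros; on a series with a linear trend plus seasonality, the seasonal pattern cancels and only a constant (the trend's change over one season) remains.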

Compare: ARIMA vs. SARIMA—both handle trends through differencing, but only SARIMA explicitly models repeating seasonal patterns. On an FRQ asking you to forecast monthly retail sales with clear December spikes, SARIMA is your answer.


Trend and Seasonal Decomposition

These approaches break complex time series into interpretable components. The principle is that observed data equals the combination of systematic patterns (trend, seasonality) plus random noise.

Trend Analysis

  • Identifies long-term directional movement in the data—can be linear, exponential, or polynomial depending on the pattern
  • Fitted using regression with time as the independent variable: Y_t = β_0 + β_1·t + ε_t for linear trends
  • Critical first step in understanding your data—determines whether differencing or detrending is needed before applying other methods
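Fitting the linear trend above has a closed-form least-squares solution, and subtracting the fitted line gives the detrended series. A minimal sketch (function names are my own):

```python
def fit_linear_trend(series):
    """OLS fit of Y_t = b0 + b1*t with t = 0, 1, 2, ..."""
    n = len(series)
    t_mean = (n - 1) / 2
    y_mean = sum(series) / n
    b1 = (sum((t - t_mean) * (y - y_mean) for t, y in enumerate(series))
          / sum((t - t_mean) ** 2 for t in range(n)))
    b0 = y_mean - b1 * t_mean
    return b0, b1

def detrend(series):
    """Subtract the fitted line, leaving seasonality and noise."""
    b0, b1 = fit_linear_trend(series)
    return [y - (b0 + b1 * t) for t, y in enumerate(series)]
```

If the detrended residuals still show systematic structure (e.g., repeating spikes), that is the signal to model seasonality next.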

Decomposition Methods

  • Separates series into trend (T_t), seasonal (S_t), and residual (R_t) components—makes each pattern visible and analyzable
  • Additive decomposition assumes Y_t = T_t + S_t + R_t; multiplicative assumes Y_t = T_t × S_t × R_t
  • Choose additive when seasonal swings are constant; choose multiplicative when seasonal variation grows with the level of the series
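A simplified additive decomposition can be sketched directly from the definition. Note this is a teaching sketch under one assumption: classical decomposition usually estimates the trend with a centered moving average, but here a fitted straight line stands in for it to keep the code short.

```python
def decompose_additive(series, s):
    """Simplified additive decomposition Y = T + S + R.
    Trend: OLS line; seasonal: per-position means of the detrended series."""
    n = len(series)
    t_mean, y_mean = (n - 1) / 2, sum(series) / n
    b1 = (sum((t - t_mean) * (y - y_mean) for t, y in enumerate(series))
          / sum((t - t_mean) ** 2 for t in range(n)))
    b0 = y_mean - b1 * t_mean
    trend = [b0 + b1 * t for t in range(n)]
    detrended = [y - tr for y, tr in zip(series, trend)]
    # seasonal index: average detrended value at each position in the cycle
    means = [sum(detrended[pos::s]) / len(detrended[pos::s]) for pos in range(s)]
    seasonal = [means[t % s] for t in range(n)]
    residual = [y - tr - se for y, tr, se in zip(series, trend, seasonal)]
    return trend, seasonal, residual
```

By construction the three components add back up to the original series exactly—that identity is what "additive" means.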

Compare: Additive vs. Multiplicative Decomposition—both extract the same components, but the relationship differs. If December sales are always $10,000 above average, use additive. If December is always 20% above average, use multiplicative.


Trend-Seasonal Forecasting Methods

These methods explicitly model both trend and seasonality for direct forecasting. They extend smoothing concepts to handle the complexity of real-world business and economic data.

Holt-Winters Method

  • Triple exponential smoothing with three parameters: α for level, β for trend, and γ for seasonality
  • Additive version for constant seasonal swings; multiplicative version for proportional seasonal effects
  • Practical advantage: updates all three components as new data arrives, making it ideal for rolling forecasts in business applications
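The three coupled updates can be written out compactly. A minimal sketch of the additive variant—the initialization scheme here (first-season mean for level, average per-step change across the first two seasons for trend) is one common simple choice, not the only one:

```python
def holt_winters_additive(series, s, alpha, beta, gamma, horizon=1):
    """Additive Holt-Winters (triple exponential smoothing)."""
    level = sum(series[:s]) / s
    trend = (sum(series[s:2 * s]) - sum(series[:s])) / (s * s)
    seasonal = [y - level for y in series[:s]]
    for t in range(s, len(series)):
        y, last_level = series[t], level
        # each component is smoothed toward its latest estimate
        level = alpha * (y - seasonal[t % s]) + (1 - alpha) * (level + trend)
        trend = beta * (level - last_level) + (1 - beta) * trend
        seasonal[t % s] = gamma * (y - level) + (1 - gamma) * seasonal[t % s]
    return [level + (h + 1) * trend + seasonal[(len(series) + h) % s]
            for h in range(horizon)]
```

Each new observation nudges all three components at once—the "updates as new data arrives" property the bullet above describes.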

Prophet (Facebook's Forecasting Tool)

  • Additive regression model with components for trend, seasonality, and holiday effects: y(t) = g(t) + s(t) + h(t) + ε_t
  • Handles missing data and outliers gracefully—robust to the messiness of real-world datasets
  • Automatic changepoint detection identifies where trends shift, reducing manual intervention in model specification

Compare: Holt-Winters vs. Prophet—both handle trend and seasonality, but Holt-Winters requires cleaner data and manual parameter selection, while Prophet automates much of the process and handles irregularities. For exam purposes, know Holt-Winters mechanics; for applied projects, Prophet often wins.


Regression-Based Approaches

These methods frame forecasting as a relationship between variables. The core idea is that time series values depend on predictable factors that can be modeled explicitly.

Regression Analysis

  • Models Y as a function of predictor variables—for time series, time itself or lagged values can serve as predictors
  • Linear form: Y_t = β_0 + β_1·X_{1t} + β_2·X_{2t} + ε_t—can include trend, seasonal dummies, or external variables
  • Watch for autocorrelated residuals—standard regression assumes independent errors, which time series often violate (requires correction)
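The standard diagnostic for the autocorrelated-residuals problem above is the Durbin-Watson statistic, which compares successive residuals. A minimal sketch of the computation:

```python
def durbin_watson(residuals):
    """Durbin-Watson statistic on regression residuals.
    Values near 2 suggest independent errors; well below 2, positive
    autocorrelation; well above 2, negative autocorrelation."""
    num = sum((b - a) ** 2 for a, b in zip(residuals, residuals[1:]))
    den = sum(e ** 2 for e in residuals)
    return num / den
```

Residuals that drift slowly (each one close to the last) push the statistic toward 0, which is the typical symptom when a plain regression is fitted to time series data.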

Compare: Regression vs. ARIMA—regression explicitly models relationships with external variables, while ARIMA models internal dynamics of the series. Use regression when you have meaningful predictors; use ARIMA when the series history is your best information.


Quick Reference Table

| Concept | Best Examples |
| --- | --- |
| Smoothing without trend/seasonality | Moving Average, Simple Exponential Smoothing |
| Smoothing with trend | Double Exponential Smoothing, Holt's Method |
| Smoothing with trend + seasonality | Holt-Winters, Triple Exponential Smoothing |
| Autoregressive modeling | AR Models, ARIMA |
| Seasonal pattern modeling | SARIMA, Holt-Winters, Prophet |
| Decomposition | Additive Decomposition, Multiplicative Decomposition |
| External variable forecasting | Regression Analysis, Prophet (with regressors) |
| Handling messy real-world data | Prophet |

Self-Check Questions

  1. Which two methods both use weighted combinations of past observations but differ in how they assign weights—and when would you prefer one over the other?

  2. You're given a time series with an upward trend and seasonal spikes every 12 months. Which methods from this guide could handle both features, and what parameters would you need to specify?

  3. Compare and contrast ARIMA and regression for time series forecasting. What assumption does regression make that ARIMA doesn't, and how might this cause problems?

  4. If decomposition reveals that seasonal fluctuations grow proportionally larger as the series level increases, which decomposition type should you use—and which Holt-Winters variant matches this pattern?

  5. An FRQ presents quarterly GDP data with a clear trend and asks you to specify an appropriate SARIMA model. What would the seasonal period s be, and how would you determine whether seasonal differencing (D) is needed?