Fiveable

Intro to Time Series Unit 5 Review


5.1 Simple and weighted moving averages

Written by the Fiveable Content Team • Last updated August 2025

Moving Averages

Simple moving averages calculation

A simple moving average (SMA) smooths out short-term fluctuations by taking the average of a fixed number of past observations. That fixed number is called the window size or order, often written as $k$.

The formula for the SMA of order $k$ at time $t$:

$$SMA_t = \frac{1}{k} \sum_{i=0}^{k-1} y_{t-i}$$

where $y_{t-i}$ is the observed value at time $t-i$. Every observation in the window gets equal weight of $\frac{1}{k}$.

Example: 3-period SMA

Given observations: $y_1 = 10$, $y_2 = 12$, $y_3 = 8$, $y_4 = 14$

  1. At $t = 3$: $SMA_3 = \frac{1}{3}(10 + 12 + 8) = 10$
  2. At $t = 4$: $SMA_4 = \frac{1}{3}(12 + 8 + 14) \approx 11.33$

Notice that each new SMA value drops the oldest observation and adds the newest one. This "sliding window" is what makes it a moving average.
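The sliding-window calculation above can be sketched in Python (the function and variable names here are illustrative, not from the guide):

```python
def simple_moving_average(y, k):
    """k-period SMA of the series y; the first k-1 entries are None
    because a full window is not yet available there."""
    sma = [None] * (k - 1)
    for t in range(k - 1, len(y)):
        window = y[t - k + 1 : t + 1]  # the k most recent observations
        sma.append(sum(window) / k)
    return sma

y = [10, 12, 8, 14]
print(simple_moving_average(y, 3))  # SMA_3 = 10.0, SMA_4 ≈ 11.33
```

Each step reuses $k-1$ of the previous window's values, which is exactly the "drop the oldest, add the newest" behavior described above.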


Weighted moving averages computation

A weighted moving average (WMA) lets you assign different weights to past observations, typically giving more importance to recent values. The key constraint is that all weights must sum to 1.

The formula for the WMA of order $k$ at time $t$:

$$WMA_t = \sum_{i=0}^{k-1} w_i \, y_{t-i}$$

where $w_i$ is the weight for the observation at time $t-i$, and $\sum w_i = 1$.

Example: 3-period WMA with weights $w_0 = 0.5$, $w_1 = 0.3$, $w_2 = 0.2$

Given the same observations: $y_1 = 10$, $y_2 = 12$, $y_3 = 8$, $y_4 = 14$

  1. At $t = 3$: $WMA_3 = 0.5(8) + 0.3(12) + 0.2(10) = 9.6$
  2. At $t = 4$: $WMA_4 = 0.5(14) + 0.3(8) + 0.2(12) = 11.8$

Compare $WMA_4 = 11.8$ to $SMA_4 \approx 11.33$. The WMA reacts more strongly to the recent jump to 14 because it places half the total weight on the most recent observation. This is the core trade-off: a WMA tracks recent changes more closely, but it's also more sensitive to noise.
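The WMA computation differs from the SMA only in that each window value is multiplied by its own weight. A minimal sketch (names illustrative; the convention here, matching the example above, is that `weights[0]` applies to the most recent observation):

```python
def weighted_moving_average(y, weights):
    """WMA of y, where weights[0] multiplies the most recent value.
    Weights are required to sum to 1, per the definition above."""
    k = len(weights)
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must sum to 1"
    wma = [None] * (k - 1)
    for t in range(k - 1, len(y)):
        # w_i pairs with y_{t-i}: heavier weights on newer observations
        wma.append(sum(w * y[t - i] for i, w in enumerate(weights)))
    return wma

y = [10, 12, 8, 14]
print(weighted_moving_average(y, [0.5, 0.3, 0.2]))  # WMA_3 = 9.6, WMA_4 = 11.8
```

Setting all weights to $\frac{1}{k}$ recovers the SMA as a special case.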


Advantages vs limitations of moving averages

  • Advantages
    • Simple to understand and implement
    • Smooths out short-term fluctuations, making underlying trends easier to see
    • Helps identify trend direction and potential reversals
  • Limitations
    • Always a lagging indicator because it relies entirely on past data
    • Can miss sudden shifts like outliers or structural breaks
    • Sensitive to the choice of order $k$: too small and you get noisy output, too large and you over-smooth real changes
    • Does not account for seasonality, cyclical patterns, or other complex structure on its own

Order selection for moving averages

Choosing the right $k$ is about balancing smoothness against responsiveness. There's no single correct answer; it depends on your data and your goal.

Factors to consider:

  1. Seasonality: If your data has a seasonal cycle, set $k$ equal to the seasonal period (e.g., $k = 12$ for monthly data with a yearly cycle). This averages out the seasonal effect.
  2. Noise level: Noisier data benefits from a larger $k$ to smooth out random variation.
  3. Trend stability: If the trend is steady, a larger $k$ works well. If the trend shifts frequently, a smaller $k$ keeps you closer to the current behavior.
  4. Responsiveness: Smaller orders react faster to recent changes but smooth less effectively.

In practice, you should try several values of kk and compare results. Use out-of-sample testing (hold back some data, forecast it, and measure error) to pick the order that generalizes best rather than just fitting the historical data well.
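The out-of-sample procedure described above can be sketched as follows: hold back the last few observations, use the SMA of the preceding $k$ values as a one-step-ahead forecast for each, and score each candidate $k$ by mean absolute error. The data and function names are illustrative only:

```python
def sma_forecast_mae(y, k, holdout):
    """One-step-ahead SMA forecasting: predict y[t] as the mean of the
    k observations before t. Returns the mean absolute error over the
    last `holdout` points, which are treated as unseen data."""
    errors = []
    for t in range(len(y) - holdout, len(y)):
        window = y[t - k : t]          # only data available before time t
        forecast = sum(window) / k
        errors.append(abs(y[t] - forecast))
    return sum(errors) / len(errors)

# Toy series: compare candidate orders on the last 3 held-out points
y = [10, 12, 8, 14, 11, 13, 9, 15, 12, 14]
for k in (2, 3, 4):
    print(f"k={k}: MAE={sma_forecast_mae(y, k, holdout=3):.3f}")
```

Picking the $k$ with the lowest held-out error guards against choosing an order that merely fits the historical noise.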