Explore key methods for detecting and addressing seasonality in time series, including ACF/PACF analysis, SARIMA models, and dummy-variable approaches, with practical financial examples and tips for the CFA® Level II exam.
Most time-series data in finance and economics exhibit recurring patterns at specific intervals—what we call seasonality. You might be looking at quarterly corporate earnings and noticing that, every third quarter, there’s a spike in consumer spending for back-to-school shopping. Or you might see how retail sales skyrocket in December when people scramble for holiday gifts. These patterns aren’t random; they show up year after year (or quarter after quarter), so if we ignore them, our forecasts could be way off.
Let me share a quick personal anecdote. I remember years ago I was analyzing monthly sales at a ski resort. I was just a newbie, so it never occurred to me that January’s data might be drastically different from July’s simply because the slopes were closed in summer. Well, my naive forecast fell flat on its face. That’s how I learned—sometimes painfully—that seasonality is a big deal. My guess is, you probably want to avoid that sort of embarrassment.
In CFA® examinations and in real-life professional settings, time-series data often contain clear seasonal fluctuations. Our task is to detect such patterns, properly model them, and then generate forecasts or insights that are realistic. Let’s break down how to do that.
One of the key tools for detecting seasonality in a time series is the autocorrelation function (ACF). You’ll recall from earlier sections (see 6.1–6.4) that the ACF measures how correlated a time series is with its own past values. The partial autocorrelation function (PACF) refines this idea by controlling for intermediate lags.
If a series has strong seasonality of period “s” (e.g., 12 months, 4 quarters), you’ll likely see spikes or notable patterns in your ACF (and possibly in your PACF) at multiples of that seasonal lag. For instance, with monthly data and annual seasonality, observe whether the correlation at lag 12 is abnormally high. The same logic can apply for weekly or quarterly data.
Below is a brief mermaid flowchart summarizing the broad steps we take when checking for seasonality:
```mermaid
flowchart LR
    A["Collect Data"] --> B["Perform <br/>Exploratory Analysis"]
    B --> C["Plot ACF/PACF"]
    C --> D["Identify <br/>Seasonality"]
    D --> E["Model <br/>Seasonal Terms or <br/>Dummy Variables"]
```
It might look a bit trivial, but trust me, you want to ensure you carefully do each step. Jumping ahead without confirming you even have seasonality can lead to silly mistakes.
Let’s say you have monthly retail sales data from January 2010 to December 2022. You suspect a seasonal pattern around the December holiday shopping rush. You can plot the ACF up to lag 12 or 24. If the ACF has a prominent spike at multiples of 12, that’s your first big clue. The next step is: do you see these patterns in the raw time-series plot? Is there a big jump each December? If so, you likely have a strong seasonal component.
Once you confirm seasonality, the next challenge is: how do we fix or model that? We’ve got two popular routes:
• Build an explicit seasonal model, like SARIMA (Seasonal ARIMA).
• Incorporate dummy variables in a regression or similar framework.
Seasonal ARIMA—often denoted SARIMA(p, d, q)(P, D, Q)[s]—extends ARIMA by including seasonal autoregressive (AR) and moving average (MA) terms. The notation (P, D, Q) indicates the seasonal parts, while “s” is the seasonal period (like 12 for monthly data or 4 for quarterly).
Here’s a generic SARIMA equation using KaTeX notation:
$$ (1 - \phi_1 B - \dots - \phi_p B^p)(1 - \Phi_1 B^s - \dots - \Phi_P B^{sP})(1 - B)^d (1 - B^s)^D (y_t - \mu) = (1 + \theta_1 B + \dots + \theta_q B^q)(1 + \Theta_1 B^s + \dots + \Theta_Q B^{sQ})\,\epsilon_t $$
• \(p, q\): Nonseasonal AR and MA orders.
• \(P, Q\): Seasonal AR and MA orders.
• \(d\): Nonseasonal differencing order.
• \(D\): Seasonal differencing order, if needed.
• \(s\): Season length (12 for monthly, 4 for quarterly, etc.).
By carefully tuning these parameters to match your data, you model both the short-term, nonseasonal dynamics and the cyclical seasonal fluctuations. That said, picking the right combination of p, d, q, P, D, Q, and s requires analyzing ACF/PACF plots and might involve iterative approaches like the Box-Jenkins methodology.
What if you’re running a plain regression model but want to account for seasonality? No problem. Create dummy variables for each season or period you suspect is systematically different. For monthly data, you might have S = 12 months. Let \(\text{Month}_k\) be a dummy set to 1 if the observation is in month k, and 0 otherwise.
You can write something like this:

$$ \text{Sales}_t = \beta_0 + \beta_1 \text{Month}_{1,t} + \beta_2 \text{Month}_{2,t} + \dots + \beta_{11} \text{Month}_{11,t} + \epsilon_t $$
Here, December might be the excluded category to avoid the dummy-variable trap (i.e., the reference month). If the coefficient \(\beta_5\) is positive and significant, that suggests the 5th month systematically increases sales compared to December. This is a neat, flexible approach when an advanced time-series model is overkill or you want to treat each period as a distinct shift.
If you’re curious how this might look:
```python
import pandas as pd
import statsmodels.formula.api as smf

# Assume df is a DataFrame with a datetime 'date' column and a 'sales' column;
# extract the month (an integer 1-12) as a categorical variable
df['month_dummy'] = df['date'].dt.month.astype('category')

model = smf.ols('sales ~ C(month_dummy)', data=df).fit()
print(model.summary())
```
We use “C(month_dummy)” to indicate that it’s a categorical variable. The summary table will show you each month’s estimated effect.
Another subtlety is whether your seasonality is additive or multiplicative:
• Additive Seasonality: The seasonal effect adds a fixed amount each period, so the series looks like base level + seasonal pattern, with a roughly constant seasonal amplitude.
• Multiplicative Seasonality: The seasonal effect scales the data (e.g., the amplitude grows as the level of the series grows).
In practice, you can sometimes transform your data through a log transform to convert multiplicative seasonality into an additive form. That’s a common trick: if your data has growth trends or amplitude changes, consider using \(\ln(y_t)\) before applying seasonal differencing or insertion of dummy variables.
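A minimal sketch of that trick, using a deliberately multiplicative synthetic series: taking logs turns the growing seasonal swings into an additive pattern, and seasonal differencing at lag 12 then removes it almost entirely.

```python
import numpy as np
import pandas as pd

# Multiplicative seasonality: the seasonal swings grow with the level of the series
t = np.arange(96)
level = 100 * 1.01 ** t                         # exponential growth trend
seasonal_factor = 1 + 0.2 * np.sin(2 * np.pi * t / 12)
y = pd.Series(level * seasonal_factor)

# Log transform converts the multiplicative structure into an additive one;
# seasonal differencing at lag 12 then strips out the repeating pattern
log_y = np.log(y)
deseasonalized = log_y.diff(12).dropna()

# What remains is essentially a constant drift of 12 * ln(1.01) per year
print(f"std of seasonal diffs: {deseasonalized.std():.4f}")
```

Real data would of course leave noise behind after the transform, but the point stands: log-then-difference is often enough to tame amplitude growth before fitting an additive model.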
Sometimes a model that looks fine at first completely falls apart in the presence of a strong seasonal pattern. If you ignore seasonality:
• Residuals might show high autocorrelation, indicating that your model is systematically failing for certain months or quarters.
• Forecast confidence intervals could be misleadingly narrow or wide.
• Performance metrics (like mean squared error) might look good in some months but be terrible in others.
In exam scenarios, watch for item-set vignettes that present suspiciously cyclical data. They might want you to demonstrate how to difference the series at lag 12 or incorporate monthly dummy variables. Failing to do so often means you’re missing a crucial exam point.
• Seasonality: Recurring, predictable patterns that repeat over fixed intervals (e.g., monthly, quarterly).
• Autocorrelation Function (ACF): A measure of correlation between a series and itself, separated by different time lags.
• Dummy Variable: A binary (0 or 1) variable representing categories or seasonal factors in a regression.
• SARIMA: Seasonal ARIMA, a time-series model combining both nonseasonal and seasonal parameters.
• Hamilton, J.D. (1994). Time Series Analysis. Princeton University Press.
• Box, G.E.P., Jenkins, G.M., & Reinsel, G.C. (2008). Time Series Analysis: Forecasting and Control. John Wiley & Sons.
Important Notice: FinancialAnalystGuide.com provides supplemental CFA study materials, including mock exams, sample exam questions, and other practice resources to aid your exam preparation. These resources are not affiliated with or endorsed by the CFA Institute. CFA® and Chartered Financial Analyst® are registered trademarks owned exclusively by CFA Institute. Our content is independent, and we do not guarantee exam success. CFA Institute does not endorse, promote, or warrant the accuracy or quality of our products.