Time Series Analysis with Exponential Smoothing

Chris Grannan
Mar 27, 2021


In the last several posts, I have covered the basics of time series analysis and the creation of ARIMA and SARIMA models. For this post, I wanted to cover a different family of time series models: exponential smoothing methods. First, I will explain what exponential smoothing is and go over some of the different model types, and then we will walk through a quick example of how to construct a model to analyze a time series of beer production.

What is Exponential Smoothing?

Just like an ARIMA model, an exponential smoothing model creates a linear set of predictions based on weighted sums of previous values. Unlike with ARIMA models, a time series does not need to be stationary to be analyzed with exponential smoothing. In addition, exponential smoothing models generally have fewer parameters to tune, which means they are generally quicker to implement. Rather than using terms to account for autoregression, integration and moving averages, we simply assign weights to previous values that get exponentially smaller as the time gap increases. This means that more recent data points have a stronger impact on predictions. As a whole, these models are not necessarily better or worse than ARIMA models, but they are a useful alternative. In most cases, it is best to try both types of models and see which fits your data better without overfitting.
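To make the idea of exponentially decaying weights concrete, the weight placed on an observation k steps in the past is roughly ⍺(1 − ⍺)^k. Here is a quick toy calculation (the value ⍺ = 0.5 is just for illustration):

# Weight on the observation k steps back is roughly alpha * (1 - alpha)**k
alpha = 0.5
weights = [alpha * (1 - alpha) ** k for k in range(5)]
print(weights)  # [0.5, 0.25, 0.125, 0.0625, 0.03125]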

Types of Exponential Smoothing Models:

There are three levels of exponential smoothing models: simple exponential smoothing; double exponential smoothing, also called Holt’s method; and triple exponential smoothing, also called the Holt-Winters method.

Simple exponential smoothing does not support data that has trend or seasonality. This method can only be used on stationary data, but it is very easy to implement as it only has one parameter to tune, the smoothing factor. This parameter is represented as ⍺ and controls how quickly the weights of previous values decay. When ⍺ is closer to 0, the weights drop very slowly, meaning the predictions are impacted more by older data points; when ⍺ is closer to 1, the weights decrease very quickly and predictions are mostly impacted by more recent data points.
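To build some intuition for how ⍺ works, here is a bare-bones sketch of the simple exponential smoothing recursion. This is purely illustrative; later on we will use the Statsmodels implementation, which also handles initialization and parameter optimization for us.

# Toy version of simple exponential smoothing (illustrative only)
def simple_exp_smoothing(series, alpha):
    smoothed = [series[0]]  # initialize with the first observation
    for x in series[1:]:
        # new value = alpha * current observation + (1 - alpha) * previous smoothed value
        smoothed.append(alpha * x + (1 - alpha) * smoothed[-1])
    return smoothed  # the forecast for every future step is just smoothed[-1]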

Holt’s method for double exponential smoothing works with data that has trend, but not with data that has seasonality. This model uses two parameters: the smoothing factor for the level, represented as ⍺, and the smoothing factor for the trend of the series, represented as β. These parameters act in much the same way as the smoothing factor in simple exponential smoothing: a value closer to 0 gives greater weight to older data points, and a value closer to 1 gives greater weight to more recent values. When constructing a double exponential smoothing model, we also need to specify whether the trend is additive or multiplicative. An additive trend uses the summation of weights while a multiplicative trend uses the product instead. In addition, with double exponential smoothing we can also set a dampening parameter, represented as ɸ. The dampening parameter is useful when dealing with long-term forecasts. Typically a double exponential model will produce extreme predictions as more time passes, since the predictions continue to increase or decrease indefinitely. The dampening parameter puts a check on the output of the model and keeps the predictions from increasing or decreasing too rapidly by limiting the effect of the trend on future predictions. The dampening effect can also be additive or multiplicative depending on how much we want to limit the effect of trend on the predictions.
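Again just for intuition, here is a toy sketch of Holt’s update equations with an additive trend and no dampening (the real model we build later comes from Statsmodels):

# Toy version of Holt's double exponential smoothing (additive trend, illustrative only)
def holt_linear(series, alpha, beta):
    level = series[0]
    trend = series[1] - series[0]  # naive initialization of the trend
    for x in series[1:]:
        prev_level = level
        level = alpha * x + (1 - alpha) * (level + trend)          # update the level
        trend = beta * (level - prev_level) + (1 - beta) * trend   # update the trend
    # the h-step-ahead forecast is level + h * trend
    # (with dampening, h * trend is replaced by (phi + phi**2 + ... + phi**h) * trend)
    return level, trend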

Finally, we come to the last of the three types of exponential smoothing, the Holt-Winters model. This model takes the double exponential smoothing model and incorporates seasonality. We introduce a new parameter, the smoothing factor for seasonality, represented as 𝛾. This parameter works in the same way as the β term: a value closer to 0 gives greater weight to older observations, and a value closer to 1 gives more weight to more recent observations. Again, this factor can be additive or multiplicative with respect to how the weights are aggregated. For triple exponential smoothing we also need to assign the number of time steps in a seasonal period. This works the same way as when building a SARIMA model. For example, if we are looking at daily data with weekly seasonal periods, we use 7. If, instead, we have monthly data with yearly seasons, we would use 12.
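Putting the three smoothing factors together, here is a miniature sketch of the additive Holt-Winters update equations, where m is the number of time steps in a seasonal period. The initialization here is deliberately naive; Statsmodels handles this properly for us.

# Toy version of additive Holt-Winters smoothing (illustrative only, assumes len(series) >= 2*m)
def holt_winters_additive(series, alpha, beta, gamma, m):
    level = sum(series[:m]) / m
    trend = (sum(series[m:2 * m]) - sum(series[:m])) / m ** 2
    season = [x - level for x in series[:m]]  # one seasonal index per position in the period
    for t in range(m, len(series)):
        x = series[t]
        prev_level = level
        level = alpha * (x - season[t % m]) + (1 - alpha) * (level + trend)  # de-seasonalized level
        trend = beta * (level - prev_level) + (1 - beta) * trend             # trend update
        season[t % m] = gamma * (x - level) + (1 - gamma) * season[t % m]    # seasonal update
    # the h-step-ahead forecast is level + h * trend + season[(len(series) + h - 1) % m]
    return level, trend, season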

Constructing Exponential Smoothing Models in Python:

Now that we know what to look for when building an exponential smoothing model, let’s look at a quick example. For this example we will be looking at beer production rates by month, and for a closer look at the code involved, you can find this example on my GitHub page.

As always, to start we will load in the appropriate libraries.

# Import libraries
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
import seaborn as sns
sns.set_style('darkgrid')

Next, we will load in and prepare our data.

# Load data
df = pd.read_csv('monthly-beer-production-in-austr.csv')
# Change Month column to datetime and set as index
df.Month = pd.to_datetime(df.Month)
data = df.set_index('Month')

Here is what the time series looks like:
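If you are following along and want to reproduce the plot yourself, a couple of lines of Matplotlib will do it (the labels match the ones used for the later plots):

# Quick look at the raw series
fig = plt.figure(figsize=(14,6))
plt.plot(data, color='black')
plt.title('Monthly Beer Production')
plt.xlabel('Date')
plt.ylabel('Kegs Produced')
plt.show()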

Clearly the time series is not stationary. We see evidence of seasonality, and the trend increases over time. If we were working with an ARIMA model, we would need to address these issues, but they don’t pose a problem for exponential smoothing models. Given the presence of both trend and seasonality, we will need triple exponential smoothing to model this series properly, but we will run through the process for all three methods of exponential smoothing.
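If you want a more formal confirmation of the trend and seasonality before modeling, a quick seasonal decomposition is an easy sanity check (note that older Statsmodels releases call the period argument freq):

# Optional sanity check: split the series into trend, seasonal and residual components
from statsmodels.tsa.seasonal import seasonal_decompose
decomposition = seasonal_decompose(data, model='additive', period=12)
decomposition.plot()
plt.show()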

First up is simple exponential smoothing. We can find this model in the Statsmodels library. Once we import this function (see the code block below), we create a model and pass our data into it. We then call .fit() on our model, optionally passing in a value for the smoothing factor. If we don’t pass in an alpha level, the function will autofit the model (the recommended approach), and we can find out the parameters of our model by checking fit.params on the fitted result. Finally, we use the fitted result to forecast values. Here is what this all looks like in Python.

# Import function
from statsmodels.tsa.holtwinters import SimpleExpSmoothing
# Create and fit model, and get predictions for 60 months (5 years)
model = SimpleExpSmoothing(data)
fit = model.fit()
fcast = fit.forecast(60)
# Plot time series and forecasted predictions
fig = plt.figure(figsize=(14,6))
plt.plot(data, color='black', label='Observed')
fcast.plot(color='red', label='Forecast')
plt.title('Simple Exponential Smoothing Predictions')
plt.xlabel('Date')
plt.ylabel('Kegs Produced')
plt.legend()
plt.show()
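If you would rather set the smoothing factor yourself instead of letting Statsmodels optimize it, you can pass it directly to .fit(); you can also inspect whichever values the autofit landed on. The 0.2 below is just an illustrative value:

# Fit with an explicit alpha instead of autofitting
fit_manual = model.fit(smoothing_level=0.2, optimized=False)
# Inspect the parameters chosen by the autofit version
print(fit.params)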

We can tell that this model does not fit the data particularly well, since the forecast just levels off at a constant value. Let’s try double exponential smoothing and see if we can improve our predictions. For double exponential models we use the Holt function from Statsmodels. We build this model out just like we did for the simple exponential smoothing model. For this example we will build a linear (standard) trend model and a dampened trend model. Both models will use an additive trend; if we wanted a multiplicative (exponential) trend, we would pass exponential=True to the Holt() function. We will autofit the models for the example, but if you wanted to pass a particular ⍺ or β level, you would set smoothing_level to ⍺ and smoothing_slope to β in the .fit() method.

# Import function
from statsmodels.tsa.holtwinters import Holt
# Create and fit a linear model and a dampened model
model = Holt(data)
fit = model.fit()
model2 = Holt(data, damped=True)
fit2 = model2.fit()
# Get predictions
fcast = fit.forecast(60)
fcast2 = fit2.forecast(60)
# Plot time series and the predictions
fig = plt.figure(figsize=(14,6))
plt.plot(data, color='black')
fcast.plot(color='red', label='Linear Trend', legend=True)
fcast2.plot(color='green', label='Dampened Trend', legend=True)
plt.show()
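As noted above, you can also pin the smoothing factors yourself rather than autofitting. Be aware that newer Statsmodels releases renamed some arguments: smoothing_slope became smoothing_trend and damped became damped_trend. The values below are just illustrative:

# Fit Holt's method with hand-picked smoothing factors
# (use smoothing_slope instead of smoothing_trend on older Statsmodels versions)
fit_manual = Holt(data).fit(smoothing_level=0.8, smoothing_trend=0.2, optimized=False)
print(fit_manual.params)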

Our predictions are getting a bit better. We can see the effect of the trend in our predictions here. Our data had a strong upward trend for a while, but it has been slowly dwindling more recently. In the dampened predictions we see results similar to the simple exponential smoothing model. Now let’s try a triple exponential smoothing model. We will need the ExponentialSmoothing function for this model, which lives in the same module as the others. For this example we will build an additive model, but you can change this to a multiplicative model for both trend and seasonality by swapping ‘add’ for ‘mul’.

# Import function
from statsmodels.tsa.holtwinters import ExponentialSmoothing
# Create and fit an additive model
model = ExponentialSmoothing(data, trend='add', seasonal='add', seasonal_periods=12).fit()
# Get Predictions
fcast = model.forecast(60)
# Plot time series and predictions
fig = plt.figure(figsize=(14,8))
plt.plot(data, color='black')
fcast.plot(color='red', label='Additive Trend and Seasonality')
plt.title('Triple Exponential Smoothing Results')
plt.xlabel('Date')
plt.ylabel('Kegs Produced')
plt.legend()
plt.show()
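For comparison, swapping in multiplicative seasonality is a one-line change; which variant fits better depends on whether the seasonal swings grow with the level of the series. A quick way to compare the two fits is their AIC values (lower is generally better):

# Multiplicative seasonality for comparison (requires strictly positive data)
model_mul = ExponentialSmoothing(data, trend='add', seasonal='mul', seasonal_periods=12).fit()
fcast_mul = model_mul.forecast(60)
print(model.aic, model_mul.aic)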

That’s better! Now we can see the seasonal effect in our predictions, along with a small negative trend over the five-year forecast period. I hope this tutorial has been helpful in showing how to create exponential smoothing models for time series analysis.

Resources:

For a closer look at the demonstration, you can find it on my GitHub.

For more information on the statistics behind exponential smoothing, please check out this book.
