# A Practical Guide to Time Series Seasonality & Signal Decomposition
```python
# Plotting
import holoviews as hv
import hvplot
import hvplot.pandas # noqa
import panel as pn
import statsmodels # noqa
from scipy import stats
from statsmodels.tsa.arima_process import ArmaProcess
from statsmodels.tsa.seasonal import STL, seasonal_decompose
from statsmodels.tsa.stattools import acf
# Activate hvplot extension
hvplot.extension("bokeh")
# Analysis
import numpy as np
import pandas as pd
# Utility
import watermark
```
Unlocking repeatable patterns in time series data is crucial for accurate forecasting and for advanced industrial diagnostics. This post is a practical guide to signal decomposition, demystifying how to confidently identify and extract seasonality from your data.
The post demonstrates how Autocorrelation Function (ACF) plots can be used to not only characterise signals but also to estimate the periodicity of seasonal trends. This insight is then leveraged in a hands-on Python walkthrough using the statsmodels library. We compare the seasonal_decompose and the more robust STL functions to effectively isolate a signal’s Trend, Seasonal, and Residual components.
Crucially, the analysis moves beyond standard forecasting examples to highlight the utility of these techniques in industrial applications, such as monitoring machinery behaviour to understand its underlying condition.
## Introduction
The purpose of this guide is to explore signal decomposition specifically with time series data. Two common methods exist: the first establishes the variation at different frequencies using techniques such as the Fast Fourier Transform (FFT); the second is Seasonal Decomposition, which splits a signal into Trend, Seasonal and Residual components. The focus of this guide is utilising the latter technique to confidently extract the Seasonal component of a signal.

The common motivation for de-trending and de-seasonalising a signal is that it simplifies subsequent time series forecasting. The popularity of this subject, and its association with Machine Learning, obscures its utility in industrial applications, from hardware monitoring and diagnosis through to failure/condition prediction.

This guide shows how the two interrelated time series analysis concepts of Autocorrelation [1] and Seasonality [1] can be used to eliminate the known behaviour of industrial machines (such as current profiles in electric motors) and reveal their underlying state.

In the Python ecosystem, the statsmodels package contains three functions of interest that will be demonstrated in this guide: acf, seasonal_decompose and STL.
## Environment Setup
Create a conda environment using the mamba [2] tool (or a virtual environment manager of your choice) and install a Python 3.12 environment. Install the following initial packages:
- pip
- sktime
- seaborn
- pmdarima
- statsmodels
- numba
```bash
# Create a new environment
mamba create -n sktime python=3.12
```

```
Looking for: ['python=3.12']
conda-forge/linux-64 Using cache
conda-forge/noarch Using cache
Transaction
Prefix: /home/miah0x41/mambaforge/envs/sktime
Updating specs:
- python=3.12
Package Version Build Channel Size
─────────────────────────────────────────────────────────────────────────────
Install:
─────────────────────────────────────────────────────────────────────────────
+ ld_impl_linux-64 2.43 h712a8e2_4 conda-forge Cached
+ _libgcc_mutex 0.1 conda_forge conda-forge Cached
+ libgomp 15.1.0 h767d61c_2 conda-forge Cached
+ _openmp_mutex 4.5 2_gnu conda-forge Cached
+ libgcc 15.1.0 h767d61c_2 conda-forge Cached
+ ncurses 6.5 h2d0b736_3 conda-forge Cached
+ libzlib 1.3.1 hb9d3cd8_2 conda-forge Cached
+ liblzma 5.8.1 hb9d3cd8_1 conda-forge Cached
+ libgcc-ng 15.1.0 h69a702a_2 conda-forge Cached
+ libffi 3.4.6 h2dba641_1 conda-forge Cached
+ libexpat 2.7.0 h5888daf_0 conda-forge Cached
+ readline 8.2 h8c095d6_2 conda-forge Cached
+ libsqlite 3.49.2 hee588c1_0 conda-forge Cached
+ tk 8.6.13 noxft_h4845f30_101 conda-forge Cached
+ libxcrypt 4.4.36 hd590300_1 conda-forge Cached
+ bzip2 1.0.8 h4bc722e_7 conda-forge Cached
+ libuuid 2.38.1 h0b41bf4_0 conda-forge Cached
+ libnsl 2.0.1 hd590300_0 conda-forge Cached
+ tzdata 2025b h78e105d_0 conda-forge Cached
+ ca-certificates 2025.4.26 hbd8a1cb_0 conda-forge Cached
+ openssl 3.5.0 h7b32b05_1 conda-forge Cached
+ python 3.12.10 h9e4cc4f_0_cpython conda-forge Cached
+ wheel 0.45.1 pyhd8ed1ab_1 conda-forge Cached
+ setuptools 80.8.0 pyhff2d567_0 conda-forge Cached
+ pip 25.1.1 pyh8b19718_0 conda-forge Cached
Summary:
Install: 25 packages
Total download: 0 B
─────────────────────────────────────────────────────────────────────────────
Confirm changes: [Y/n]
Downloading and Extracting Packages:
Preparing transaction: done
Verifying transaction: done
Executing transaction: done
To activate this environment, use
$ mamba activate sktime
To deactivate an active environment, use
$ mamba deactivate
```
```bash
# Activate environment
mamba activate sktime
# Install sktime
mamba install -c conda-forge pip sktime seaborn pmdarima statsmodels numba
```

```
Looking for: ['pip', 'sktime', 'seaborn', 'pmdarima', 'statsmodels', 'numba']
conda-forge/linux-64 Using cache
conda-forge/noarch Using cache
Pinned packages:
- python 3.12.*
Transaction
Prefix: /home/miah0x41/mambaforge/envs/sktime
Updating specs:
- pip
- sktime
- seaborn
- pmdarima
- statsmodels
- numba
- ca-certificates
- openssl
Package Version Build Channel Size
────────────────────────────────────────────────────────────────────────────────
Install:
────────────────────────────────────────────────────────────────────────────────
+ libstdcxx 15.1.0 h8f9b012_2 conda-forge 4MB
+ libpng 1.6.47 h943b412_0 conda-forge 289kB
+ libgfortran5 15.1.0 hcea5267_2 conda-forge 2MB
+ libjpeg-turbo 3.1.0 hb9d3cd8_0 conda-forge 629kB
+ libwebp-base 1.5.0 h851e524_0 conda-forge 430kB
+ pthread-stubs 0.4 hb9d3cd8_1002 conda-forge 8kB
+ xorg-libxdmcp 1.1.5 hb9d3cd8_0 conda-forge 20kB
+ xorg-libxau 1.0.12 hb9d3cd8_0 conda-forge 15kB
+ libdeflate 1.24 h86f0d12_0 conda-forge 73kB
+ libbrotlicommon 1.1.0 hb9d3cd8_2 conda-forge Cached
+ zstd 1.5.7 hb8e6e7a_2 conda-forge 568kB
+ lerc 4.0.0 h0aef613_1 conda-forge 264kB
+ libstdcxx-ng 15.1.0 h4852527_2 conda-forge 35kB
+ libfreetype6 2.13.3 h48d6fc4_1 conda-forge 380kB
+ libgfortran 15.1.0 h69a702a_2 conda-forge 35kB
+ libxcb 1.17.0 h8a09558_0 conda-forge 396kB
+ libbrotlienc 1.1.0 hb9d3cd8_2 conda-forge Cached
+ libbrotlidec 1.1.0 hb9d3cd8_2 conda-forge Cached
+ libtiff 4.7.0 hf01ce69_5 conda-forge 430kB
+ qhull 2020.2 h434a139_5 conda-forge 553kB
+ libfreetype 2.13.3 ha770c72_1 conda-forge 8kB
+ libopenblas 0.3.29 pthreads_h94d23a6_0 conda-forge Cached
+ brotli-bin 1.1.0 hb9d3cd8_2 conda-forge 19kB
+ openjpeg 2.5.3 h5fbd93e_0 conda-forge 343kB
+ lcms2 2.17 h717163a_0 conda-forge 248kB
+ freetype 2.13.3 ha770c72_1 conda-forge 172kB
+ libblas 3.9.0 31_h59b9bed_openblas conda-forge Cached
+ brotli 1.1.0 hb9d3cd8_2 conda-forge 19kB
+ libcblas 3.9.0 31_he106b2a_openblas conda-forge Cached
+ liblapack 3.9.0 31_h7ac8fdf_openblas conda-forge Cached
+ python_abi 3.12 7_cp312 conda-forge 7kB
+ packaging 25.0 pyh29332c3_1 conda-forge 62kB
+ scikit-base 0.12.2 pyhecae5ae_0 conda-forge 110kB
+ joblib 1.4.2 pyhd8ed1ab_1 conda-forge Cached
+ threadpoolctl 3.6.0 pyhecae5ae_0 conda-forge 24kB
+ pytz 2025.2 pyhd8ed1ab_0 conda-forge 189kB
+ python-tzdata 2025.2 pyhd8ed1ab_0 conda-forge 144kB
+ cycler 0.12.1 pyhd8ed1ab_1 conda-forge 13kB
+ pyparsing 3.2.3 pyhd8ed1ab_1 conda-forge 96kB
+ munkres 1.1.4 pyh9f0ad1d_0 conda-forge Cached
+ hpack 4.1.0 pyhd8ed1ab_0 conda-forge Cached
+ hyperframe 6.1.0 pyhd8ed1ab_0 conda-forge Cached
+ pysocks 1.7.1 pyha55dd90_7 conda-forge Cached
+ six 1.17.0 pyhd8ed1ab_0 conda-forge Cached
+ pycparser 2.22 pyh29332c3_1 conda-forge Cached
+ h2 4.2.0 pyhd8ed1ab_0 conda-forge Cached
+ python-dateutil 2.9.0.post0 pyhff2d567_1 conda-forge Cached
+ unicodedata2 16.0.0 py312h66e93f0_0 conda-forge 404kB
+ brotli-python 1.1.0 py312h2ec8cdc_2 conda-forge Cached
+ pillow 11.2.1 py312h80c1187_0 conda-forge 43MB
+ kiwisolver 1.4.8 py312h84d6215_0 conda-forge 72kB
+ llvmlite 0.44.0 py312h374181b_1 conda-forge 30MB
+ cython 3.1.1 py312h2614dfc_1 conda-forge 4MB
+ numpy 2.2.6 py312h72c5963_0 conda-forge 8MB
+ cffi 1.17.1 py312h06ac9bb_0 conda-forge Cached
+ fonttools 4.58.0 py312h178313f_0 conda-forge 3MB
+ contourpy 1.3.2 py312h68727a3_0 conda-forge 277kB
+ scipy 1.15.2 py312ha707e6e_0 conda-forge 17MB
+ pandas 2.2.3 py312hf9745cd_3 conda-forge 15MB
+ numba 0.61.2 py312h2e6246c_0 conda-forge 6MB
+ zstandard 0.23.0 py312h66e93f0_2 conda-forge 732kB
+ matplotlib-base 3.10.3 py312hd3ec401_0 conda-forge 8MB
+ scikit-learn 1.6.1 py312h7a48858_0 conda-forge 11MB
+ sktime 0.36.0 py312h7900ff3_0 conda-forge 34MB
+ patsy 1.0.1 pyhd8ed1ab_1 conda-forge 187kB
+ urllib3 2.4.0 pyhd8ed1ab_0 conda-forge 101kB
+ seaborn-base 0.13.2 pyhd8ed1ab_3 conda-forge 228kB
+ statsmodels 0.14.4 py312hc0a28a1_0 conda-forge 12MB
+ pmdarima 2.0.4 py312h41a817b_2 conda-forge 663kB
+ seaborn 0.13.2 hd8ed1ab_3 conda-forge 7kB
Summary:
Install: 70 packages
Total download: 205MB
────────────────────────────────────────────────────────────────────────────────
Confirm changes: [Y/n]
libpng 288.7kB @ 552.8kB/s 0.5s
libgfortran5 1.6MB @ 3.0MB/s 0.5s
libjpeg-turbo 628.9kB @ 1.2MB/s 0.5s
lerc 264.2kB @ 450.8kB/s 0.1s
libstdcxx-ng 34.6kB @ 57.8kB/s 0.1s
libfreetype6 380.1kB @ 626.1kB/s 0.1s
libstdcxx 3.9MB @ 6.3MB/s 0.6s
libwebp-base 430.0kB @ 689.0kB/s 0.6s
openjpeg 343.0kB @ 538.3kB/s 0.1s
lcms2 248.0kB @ 346.6kB/s 0.1s
freetype 172.4kB @ 240.1kB/s 0.1s
threadpoolctl 23.9kB @ 33.2kB/s 0.1s
pytz 189.0kB @ 261.6kB/s 0.1s
contourpy 276.5kB @ 268.9kB/s 0.3s
cython 3.7MB @ 2.9MB/s 0.6s
fonttools 2.8MB @ 2.1MB/s 0.6s
pthread-stubs 8.3kB @ 6.0kB/s 0.1s
libdeflate 72.6kB @ 51.7kB/s 0.1s
libxcb 395.9kB @ 270.8kB/s 0.1s
libtiff 429.6kB @ 280.9kB/s 0.1s
python_abi 7.0kB @ 4.5kB/s 0.1s
python-tzdata 144.2kB @ 89.0kB/s 0.1s
unicodedata2 404.4kB @ 243.3kB/s 0.1s
numpy 8.5MB @ 4.9MB/s 1.0s
zstandard 732.2kB @ 406.8kB/s 0.1s
seaborn-base 227.8kB @ 121.4kB/s 0.1s
xorg-libxdmcp 19.9kB @ 10.1kB/s 0.1s
libgfortran 34.5kB @ 16.5kB/s 0.1s
brotli-bin 18.9kB @ 8.6kB/s 0.1s
scikit-base 110.2kB @ 45.8kB/s 0.2s
scipy 17.1MB @ 4.9MB/s 1.9s
llvmlite 30.0MB @ 8.6MB/s 2.8s
patsy 186.6kB @ 52.9kB/s 0.0s
seaborn 6.9kB @ 1.9kB/s 0.0s
qhull 552.9kB @ 151.8kB/s 0.1s
packaging 62.5kB @ 16.9kB/s 0.0s
kiwisolver 71.6kB @ 18.9kB/s 0.1s
numba 5.9MB @ 1.5MB/s 0.5s
xorg-libxau 14.8kB @ 3.7kB/s 0.1s
statsmodels 12.1MB @ 3.0MB/s 3.0s
brotli 19.3kB @ 4.7kB/s 0.1s
pmdarima 663.1kB @ 158.9kB/s 0.1s
cycler 13.4kB @ 3.2kB/s 0.1s
urllib3 100.8kB @ 23.3kB/s 0.1s
scikit-learn 10.6MB @ 2.4MB/s 0.6s
pyparsing 96.0kB @ 21.9kB/s 0.0s
libfreetype 7.7kB @ 1.7kB/s 0.0s
zstd 567.6kB @ 126.1kB/s 0.1s
matplotlib-base 8.2MB @ 1.6MB/s 0.6s
pillow 42.5MB @ 8.5MB/s 2.6s
pandas 15.4MB @ 2.8MB/s 1.5s
sktime 34.1MB @ 3.8MB/s 7.5s
Downloading and Extracting Packages:
Preparing transaction: done
Verifying transaction: done
Executing transaction: done
```
```bash
# Install Skchange
pip install skchange[numba]
```

```
Collecting skchange[numba]
Downloading skchange-0.13.0-py3-none-any.whl.metadata (5.6 kB)
Requirement already satisfied: numpy>=1.21 in /home/miah0x41/mambaforge/envs/sktime/lib/python3.12/site-packages (from skchange[numba]) (2.2.6)
Requirement already satisfied: pandas>=1.1 in /home/miah0x41/mambaforge/envs/sktime/lib/python3.12/site-packages (from skchange[numba]) (2.2.3)
Requirement already satisfied: sktime>=0.35 in /home/miah0x41/mambaforge/envs/sktime/lib/python3.12/site-packages (from skchange[numba]) (0.36.0)
Requirement already satisfied: numba<0.62,>=0.61 in /home/miah0x41/mambaforge/envs/sktime/lib/python3.12/site-packages (from skchange[numba]) (0.61.2)
Requirement already satisfied: llvmlite<0.45,>=0.44.0dev0 in /home/miah0x41/mambaforge/envs/sktime/lib/python3.12/site-packages (from numba<0.62,>=0.61->skchange[numba]) (0.44.0)
Requirement already satisfied: python-dateutil>=2.8.2 in /home/miah0x41/mambaforge/envs/sktime/lib/python3.12/site-packages (from pandas>=1.1->skchange[numba]) (2.9.0.post0)
Requirement already satisfied: pytz>=2020.1 in /home/miah0x41/mambaforge/envs/sktime/lib/python3.12/site-packages (from pandas>=1.1->skchange[numba]) (2025.2)
Requirement already satisfied: tzdata>=2022.7 in /home/miah0x41/mambaforge/envs/sktime/lib/python3.12/site-packages (from pandas>=1.1->skchange[numba]) (2025.2)
Requirement already satisfied: six>=1.5 in /home/miah0x41/mambaforge/envs/sktime/lib/python3.12/site-packages (from python-dateutil>=2.8.2->pandas>=1.1->skchange[numba]) (1.17.0)
Requirement already satisfied: joblib<1.5,>=1.2.0 in /home/miah0x41/mambaforge/envs/sktime/lib/python3.12/site-packages (from sktime>=0.35->skchange[numba]) (1.4.2)
Requirement already satisfied: packaging in /home/miah0x41/mambaforge/envs/sktime/lib/python3.12/site-packages (from sktime>=0.35->skchange[numba]) (25.0)
Requirement already satisfied: scikit-base<0.13.0,>=0.6.1 in /home/miah0x41/mambaforge/envs/sktime/lib/python3.12/site-packages (from sktime>=0.35->skchange[numba]) (0.12.2)
Requirement already satisfied: scikit-learn<1.7.0,>=0.24 in /home/miah0x41/mambaforge/envs/sktime/lib/python3.12/site-packages (from sktime>=0.35->skchange[numba]) (1.6.1)
Requirement already satisfied: scipy<2.0.0,>=1.2 in /home/miah0x41/mambaforge/envs/sktime/lib/python3.12/site-packages (from sktime>=0.35->skchange[numba]) (1.15.2)
Requirement already satisfied: threadpoolctl>=3.1.0 in /home/miah0x41/mambaforge/envs/sktime/lib/python3.12/site-packages (from scikit-learn<1.7.0,>=0.24->sktime>=0.35->skchange[numba]) (3.6.0)
Downloading skchange-0.13.0-py3-none-any.whl (18.4 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 18.4/18.4 MB 8.6 MB/s eta 0:00:00
Installing collected packages: skchange
Successfully installed skchange-0.13.0
```

Additional recommended packages are as follows: if using a Jupyter environment, then the ipykernel package is recommended. Register the new environment (in this case called sktime) with the following:
```bash
# Register kernel
python -m ipykernel install --user --name sktime --display-name "sktime (py3.12)"
```

```
Installed kernelspec sktime in /home/miah0x41/.local/share/jupyter/kernels/sktime
```

## System Details
Import packages:
```python
# System version
print(watermark.watermark())
```

```
Last updated: 2025-09-03T09:47:02.624437+01:00
Python implementation: CPython
Python version : 3.12.10
IPython version : 9.2.0
Compiler : GCC 13.3.0
OS : Linux
Release : 6.6.87.2-microsoft-standard-WSL2
Machine : x86_64
Processor : x86_64
CPU cores : 20
Architecture: 64bit
```

```python
# Package versions
print(watermark.watermark(iversions=True, globals_=globals()))
```

```
watermark : 2.5.0
scipy : 1.15.2
panel : 1.7.0
hvplot : 0.11.3
numpy : 2.2.6
statsmodels: 0.14.4
pandas : 2.2.3
holoviews : 1.20.2
```
## Example Time Series

### Useful Functions
The following functions were used to generate and analyse time series data with various characteristics.
Generate six types of time series:
- White Noise
- Trend
- Seasonal
- Auto Regressive order 1 (AR)
- Moving Average order 2 (MA)
- Dummy Stock Price
```python
def create_example_time_series(n: int = 200) -> pd.DataFrame:
    """Creates a Pandas DataFrame containing example time series data.

    Args:
        n: The length of each time series.

    Returns:
        A DataFrame with columns 'White_Noise', 'Trend', 'Seasonal', 'AR1', 'MA2', and 'Stock_Price'.
    """
```

Extract the Autocorrelation of a series and a time series preview plot:
```python
def analyze_acf_hvplot(
    series: pd.Series,
    title: str,
    lags: int = 20,
) -> tuple[hvplot.plotting.hvPlot, np.ndarray]:
    """Calculate and plot ACF with interpretation using hvplot.

    Args:
        series: The time series to analyze.
        title: The title for the plots.
        lags: The number of lags to compute the ACF for.

    Returns:
        A tuple containing the hvplot of the time series (first 100 steps) and the ACF values.
    """
```

A simplified Autocorrelation Function plot with confidence intervals around the autocorrelation:
```python
def plot_acf_simple_v2(
    x: pd.Series, lags: int = 20, alpha: float = 0.05, title: str | None = None
) -> hv.Overlay:
    """Simplified version of ACF plotting function with proper confidence intervals.

    Args:
        x: The time series to analyze.
        lags: The number of lags to compute the ACF for.
        alpha: The significance level for the confidence interval.
        title: The title for the plot.

    Returns:
        A HoloViews plot of the Autocorrelation Function.
    """
```

Combine the simple time series and ACF plots:
```python
def combine_plots(
    p1: hv.element.chart.Curve | hv.core.overlay.Overlay,
    p2: hv.element.chart.Curve | hv.core.overlay.Overlay,
) -> pn.layout.Column:
    """Combines two plots into a Panel layout.

    Args:
        p1: The first plot (HoloViews Curve, Overlay, or Panel HoloViews pane).
        p2: The second plot (HoloViews Curve, Overlay, or Panel HoloViews pane).

    Returns:
        A Panel Column layout containing the two plots.
    """
```

Create a dataset with the different types of time series data:
```python
# Generate example dataset
df = create_example_time_series(200)
# Preview
df.head()
```

|   | White_Noise | Trend | Seasonal | AR1 | MA2 | Stock_Price |
|---|---|---|---|---|---|---|
| 0 | 0.496714 | 0.178894 | -0.797214 | 0.756989 | 0.938284 | 102.941132 |
| 1 | -0.138264 | 0.330644 | 1.200312 | -0.392273 | -0.046903 | 104.967415 |
| 2 | 0.647689 | 0.642028 | 2.600698 | 0.595015 | 0.119584 | 105.197820 |
| 3 | 1.523030 | 0.677655 | 3.023490 | 1.772148 | -0.569028 | 103.949358 |
| 4 | -0.234153 | -0.487830 | 2.373043 | 1.653939 | -0.636798 | 105.516602 |
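The body of create_example_time_series is collapsed above. As a rough sketch of how such a dataset might be generated, the snippet below uses ArmaProcess (imported at the top of the post) for the AR and MA series; the coefficients, seed and noise scales are illustrative assumptions rather than the post's actual values:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima_process import ArmaProcess


def create_example_time_series_sketch(n: int = 200, seed: int = 42) -> pd.DataFrame:
    """Illustrative reconstruction; all parameters are assumptions."""
    rng = np.random.default_rng(seed)
    t = np.arange(n)
    return pd.DataFrame(
        {
            "White_Noise": rng.normal(0, 1, n),
            "Trend": 0.05 * t + rng.normal(0, 0.5, n),  # Linear trend plus noise
            "Seasonal": 3 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 0.5, n),  # Period 12
            "AR1": ArmaProcess(ar=[1, -0.7], ma=[1]).generate_sample(nsample=n),  # AR(1)
            "MA2": ArmaProcess(ar=[1], ma=[1, 0.5, 0.4]).generate_sample(nsample=n),  # MA(2)
            "Stock_Price": 100 + np.cumsum(rng.normal(0.05, 1, n)),  # Random walk with drift
        }
    )
```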
Generate the individual plots:
```python
# Store the ACF values and associated plots
p_ts = {}
p_acf = {}
v_acf = {}
# For each type of time series
for c in df.columns:
    # Get ACF data and time series plot
    (p_ts[c], v_acf[c]) = analyze_acf_hvplot(df[c], c)
    # Store results
    p_acf[c] = plot_acf_simple_v2(
        df[c], lags=20, title=f"Autocorrelation Function: {c}"
    )
```

### White Noise & Stationarity
The term White Noise is intended to denote a signal that lacks a discernible pattern. The figure below shows a preview of the first 100 time steps of the data and the results of the Autocorrelation Function against time step Lags. The Autocorrelation Function measures the linear relationship between a time series and its own past values at different time intervals. It quantifies how much the current value of a series is correlated with its previous values.

Lag refers to the time displacement or delay between observations in the time series. It represents how many time periods back we’re looking when comparing values. So Lag 0 means a correlation with itself and hence is always 1. A Lag of 1 is therefore a correlation between the current time step and the previous time step.

The region represented by the blue shade shows the 95% Confidence Interval, as an aid to the reader in judging whether the correlations are real or within the noise. We require the confidence interval around the ACFs not to cross the zero ACF axis; otherwise, this implies the ACF could be zero, and we could not confidently state that it differs from no discernible pattern.
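For reference, the ACF values and the confidence intervals used in these plots can be computed directly with statsmodels; a minimal sketch:

```python
from statsmodels.tsa.stattools import acf

# ACF values for lags 0..20, plus a 95% confidence interval per lag
acf_values, conf_int = acf(df["White_Noise"], nlags=20, alpha=0.05)

# Half-width of the interval around each ACF estimate
half_width = conf_int[:, 1] - acf_values
```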
*White Noise Autocorrelation Function (ACF) Evaluation*
The ACF plot clearly shows that a signal intended to represent White Noise has no discernible correlation with itself, and the calculated ACFs are within the confidence interval of not being sufficiently significant. This is a good baseline for recognising when no (easy) pattern is discernible from the data.

The confidence levels have been plotted as a function of the ACF, rather than the approach used by statsmodels.graphics.tsaplots.plot_acf where they are grounded at 0. This is why the plot may look unfamiliar, and the interpretation must be adjusted accordingly.

There is a key assumption with these plots: that the time series is representative of a period of interest that is undergoing no change. In practice, imagine monitoring a car engine. If we analyse data where the time period includes both an acceleration and steady, constant speed operation, then clearly the time series is experiencing a change in the underlying behaviour, and these techniques will produce invalid results. This should be “common sense” or, to be more precise, the principle taught to school children about making “fair comparisons” in the field of Science.

With this in mind, the concept of Stationarity, whilst a useful term in the field of Functional Analysis, is a terrible term for any practical application in the field of Engineering, because it makes a simple concept sound more complicated than it needs to be. Much of the material on this topic encourages the user to conduct tests to determine if the time series is stationary via common methods (a usage sketch follows the list):
- Augmented Dickey-Fuller (ADF) Test [3]
- Kwiatkowski-Phillips-Schmidt-Shin (KPSS) Test [4]
- Phillips-Perron Test [5]
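For completeness, the first two tests are available in statsmodels (the Phillips-Perron test is provided by the third-party arch package); a minimal usage sketch:

```python
from statsmodels.tsa.stattools import adfuller, kpss

# ADF: the null hypothesis is non-stationarity (a unit root)
adf_stat, adf_p, *_ = adfuller(df["White_Noise"])

# KPSS: the null hypothesis is stationarity, i.e. the opposite framing
kpss_stat, kpss_p, *_ = kpss(df["White_Noise"], regression="c")

print(f"ADF p-value: {adf_p:.3f}, KPSS p-value: {kpss_p:.3f}")
```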
A stationary time series is only a requirement for the Autocorrelation values themselves to be valid, whereas our intention is to use the ACF to explicitly highlight non-stationary behaviour such as trending and seasonality. There is a misconception that techniques such as differencing [6] need to be applied before the ACF is calculated.

A useful supplementary technique to ensure the time period meets this requirement is the use of Segmentation to segregate a time period of interest, isolating the data from unwanted changes in the system state. This means we isolate accelerations from steady state operations, even though an acceleration is a dynamic event.
### Trend
In the figure below, the sample time series plot (top) shows that the data is trending upwards as a function of time, with some noise. This is represented in the ACF plot as a slight exponential reduction with lag. Note that the confidence interval is relatively wide, reinforcing the message that, for a non-stationary time series, the ACF values are more difficult to interpret and apply.

In this instance, independent of the validity of the ACF values themselves, we are using the trend in ACF values to characterise that the time series has a trend component.
### Seasonal
The sample time series plot (top) shows that the data has a clear Seasonal variation combined with random noise. The peak-to-peak values appear to be increasing as well. The ACF plot, despite the confidence interval, shows the same seasonality as the raw data.

There are three key observations from the ACF plot itself. The first, as highlighted, is that there is clearly a seasonal variation. Secondly, the magnitude of the ACF values decreases with the number of lags, hinting at a trend component as well. Finally, and one of the most useful aspects of this plot, we are able to estimate the period of the seasonality, i.e. when the signal appears to repeat itself. It is for this characteristic that we will use ACF plots to ultimately decompose the signal into its trend, seasonal and residual components.
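One simple way to turn that visual estimate into a number is to take the first positive local maximum of the ACF after lag 0. This is a rough heuristic sketch, not the method used in the post:

```python
from statsmodels.tsa.stattools import acf

vals = acf(df["Seasonal"], nlags=40)

# Lags that are positive local maxima; the first one approximates the seasonal period
peaks = [
    lag
    for lag in range(2, len(vals) - 1)
    if vals[lag] > vals[lag - 1] and vals[lag] > vals[lag + 1] and vals[lag] > 0
]
estimated_period = peaks[0] if peaks else None
```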
### Auto Regressive (AR)
The “Auto” means there is a relationship with itself: the series uses its own past values (i.e. regressive). We would expect a linear relationship with its past values, and those values impact the current value in a systematic way. Practical examples often cited are stock prices, where one influencing factor is the previous day’s price (though this can be heavily disputed), and outside temperatures (again dependent on the previous day, but also on other climatic factors). Engineering examples may include certain types of wear on friction materials used for braking, where the loss of material as a function of time could be modelled based on previous levels of wear.

It is not possible to determine the order of an Auto Regressive time series based on an ACF plot. The order is, in effect, which past time period influences the current one, and by design this example is order 1. From the plot above we can see that the trend in ACF is gradual until a lag of 4, where the ACF is no different from zero at 95% confidence. We require a different technique called the Partial Autocorrelation Function (PACF) [7], which is out of scope for this guide.
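Although out of scope here, the PACF is a one-line call in statsmodels; for an AR(1) series, only the lag-1 partial autocorrelation should stand out:

```python
from statsmodels.tsa.stattools import pacf

# Partial autocorrelations for lags 0..20
pacf_values = pacf(df["AR1"], nlags=20)
```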
### Moving Average (MA)
This dummy time series is a Moving Average process of order 2: each value is a linear combination of the current random error term and the two previous ones. Moving averages are also commonly used for smoothing as a precursor to forecasting.
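As an aside, such a process can be generated with ArmaProcess (imported at the top of the post); the coefficients below are illustrative assumptions:

```python
from statsmodels.tsa.arima_process import ArmaProcess

# MA(2): value_t = e_t + 0.5 * e_{t-1} + 0.4 * e_{t-2}, with e ~ white noise
ma2_example = ArmaProcess(ar=[1], ma=[1, 0.5, 0.4]).generate_sample(nsample=200)
```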
Unlike the previous example, the ACF plot illustrates a cutoff of 2, i.e. lags beyond this value are not distinguishable from zero, which we know from above means that the signal beyond this point is not distinguishable from White Noise.
### Stock Price
This dummy series is meant to illustrate an example stock price ticker:
The techniques we’ve developed so far suggest that, based on the ACF, there is a strong trend component, with no obvious Auto Regressive or Moving Average behaviour. Furthermore, the ACF values are distinct, being sufficiently different from zero at the 95% confidence level, so this is not White Noise either. The gradual downward trend of the ACF with Lag indicates a strong trend component, which the sample of observed data in the top plot supports.
## Summary of Autocorrelation Function
In this section, we briefly summarise and compare the change in ACF as a function of Lag.
```python
# Combine ACFs
df_acf = pd.DataFrame(v_acf)
# Preview
df_acf.head()
```

|   | White_Noise | Trend | Seasonal | AR1 | MA2 | Stock_Price |
|---|---|---|---|---|---|---|
| 0 | 1.000000 | 1.000000 | 1.000000 | 1.000000 | 1.000000 | 1.000000 |
| 1 | -0.051568 | 0.954620 | 0.822161 | 0.661165 | 0.510537 | 0.983445 |
| 2 | -0.038837 | 0.942961 | 0.469831 | 0.450068 | 0.238493 | 0.968771 |
| 3 | 0.002244 | 0.928043 | 0.002408 | 0.273582 | 0.046783 | 0.953805 |
| 4 | -0.120266 | 0.915175 | -0.472547 | 0.161717 | 0.083354 | 0.938170 |
It’s easy to miss that the majority of the series we’ve examined have positive ACF values with the exception of the Seasonal time series. It also makes it easy to determine that the behaviour of the Stock Price most closely resembles the Trend series.
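One rough way to quantify that resemblance is to correlate the ACF profiles themselves; a small sketch:

```python
# Correlate each series' ACF profile against the Stock_Price profile
similarity = df_acf.corr()["Stock_Price"].drop("Stock_Price")

# The highest correlation indicates the closest-matching behaviour (Trend, per the text above)
print(similarity.idxmax())
```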
## Seasonal Decomposition
We’ve established that an Autocorrelation Function plot can confirm the presence of Seasonality and that we can estimate its period.

In this section we introduce statsmodels’s seasonal_decompose with a view to extracting three components:
- Trend
- Seasonality
- Residuals
### White Noise
To establish a baseline, we attempt to extract these components from the White Noise series using the default additive model and a guess for the period:
```python
# Decompose signal
results = seasonal_decompose(df["White_Noise"], model="additive", period=10)

# Create dataframe
results_df = pd.DataFrame(
    {
        "Observed": results.observed,
        "Trend": results.trend,
        "Seasonal": results.seasonal,
        "Residuals": results.resid,
    }
)
```

The residuals are calculated by taking the observed signal and removing both the seasonality and trend; in effect, they are the component that cannot be accounted for. Therefore, the overall magnitude of the residuals should be small for a strongly trending and/or seasonal signal. We can derive a metric that crudely reflects how much of the original signal remains as residuals, as a means of determining how well the seasonality and trend have been extracted or exist within the original signal.
A value of 1 means that the entirety of the observed signal is now the residuals, i.e. no trend or seasonality; a value of 0 means no residuals, i.e. the signal can be fully accounted for by the trend and/or seasonality components.
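Expressed as a formula, the metric (implemented in the code below) is:

$$
\text{residual ratio} = \frac{\max(\text{resid}) - \min(\text{resid})}{\max(\text{observed}) - \min(\text{observed})}
$$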
```python
# Residual Ratio
residual_ratio = (results.resid.max() - results.resid.min()) / (
    results.observed.max() - results.observed.min()
)
```

The plot shows that the majority of the observed data remains unaccounted for, based on the high ratio (0.962), but also on the similarity in range and scope of the two plots:
```python
# Columns of interest
cols = ["Observed", "Residuals"]

# Compare Observed and Residuals
plot = results_df.loc[:, cols].hvplot(
    xlabel="Time Step",
    ylabel="Value",
    padding=0.1,
    grid=True,
    title="Comparison of Observed and Residuals",
)
```

### Seasonal
We would expect the function to be more effective with this signal than with the White Noise. From the ACF plot we’d expect a periodicity of 12, although we should conduct a sensitivity check to confirm.

The decomposition method appears to be very effective in identifying and extracting the seasonal component. Note that the ratio of the peak-to-peak range between the residuals and the original numerically appears to be significant at 0.319.

Whilst the magnitude of the residuals appears to be significant, a direct comparison shows it is relatively small. Whether they are significant or not depends on your use case. In most practical examples, such as engine performance, it is sufficient to have indicative values to mitigate measurement related issues.
### Seasonal Using STL
The seasonal_decompose function is relatively simple; a more advanced alternative is statsmodels.tsa.seasonal.STL. In this section we decompose the signal with it and compare against the previous approach.
```python
# Decompose signal
results_stl = STL(df["Seasonal"], period=12).fit()
```

This decomposition method appears to be more effective in identifying and extracting the seasonal component. Note that the ratio of the peak-to-peak range between the residuals and the original is numerically lower than that of the seasonal_decompose method: 0.235 vs 0.319.
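The comparison below relies on a results_stl_df dataframe whose construction is not shown in the post; it is assumed to mirror results_df, built from the STL results:

```python
# Assumed construction of results_stl_df, mirroring results_df above
results_stl_df = pd.DataFrame(
    {
        "Observed": results_stl.observed,
        "Trend": results_stl.trend,
        "Seasonal": results_stl.seasonal,
        "Residuals": results_stl.resid,
    }
)
```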
This is also reflected in the comparison plot below:
We can compare the seasonal component of both methods:
```python
# Combine dataframes
combined = pd.concat([results_df["Seasonal"], results_stl_df["Seasonal"]], axis=1)

# Rename columns
combined.columns = ["seasonal_decompose", "STL"]

# Compare the two seasonal components
plot = combined.hvplot(
    xlabel="Time Step",
    ylabel="Value",
    padding=0.1,
    grid=True,
    title="Comparison of Two Decomposition Methods",
    legend="top",
    height=400,
)
```

The STL method has varying peak-to-peak values per cycle and asymmetrical peaks. The residuals show a small difference:
### Periodicity Sensitivity
For both assessments we determined the period to be 12, based on a visual evaluation of the ACF plots. The following method conducts a sensitivity assessment to determine if this was correct:
```python
# Store residuals
residuals = []

# For periods from 2 to 24
for period in range(23):
    # Decompose signal
    results = STL(df["Seasonal"], period=period + 2).fit()
    # Calculate residual ratio
    r = (results.resid.max() - results.resid.min()) / (
        results.observed.max() - results.observed.min()
    )
    # Calculate trend ratio
    t = (results.trend.max() - results.trend.min()) / (
        results.observed.max() - results.observed.min()
    )
    # Store output
    residuals.append(
        {
            "period": period + 2,
            "residual_ratio": r,
            "trend_ratio": t,
        }
    )

# Create dataframe
metrics = pd.DataFrame(residuals)
```

If we examine the residual_ratio first, we see three potential candidates for a suitable periodicity: 2, 12 and 23. However, this can be misleading: with a periodicity of 2, the trend_ratio is high, leading to a low set of residuals. If we look for the minimum value of both ratios, we see that it is at a periodicity of 12.
```python
# Compare ratios across periods
plot = metrics.hvplot(
    x="period",
    xlabel="Period",
    ylabel="Ratios",
    padding=0.1,
    grid=True,
    title="Comparison of Trend and Residuals Ratio relative to Observed",
    legend="top",
    height=400,
)
```

The plot above suggests that the results of the ACF plot for this time series, and our interpretation of its periodicity, are correct - at least using these metrics.
## Conclusion

Using six example time series, this post introduced the utility of the Autocorrelation Function plot in characterising signals. It demonstrated that the ACF plot can also be used to identify seasonal behaviour and estimate its periodicity.

This was then utilised in two decomposition methods, seasonal_decompose and STL, to extract the seasonal component. The residuals were then used to provide a potential metric for determining the effectiveness of the seasonality extraction.
## Version History

- 2025-06-03: Original
- 2025-09-03: Responsive plots and branding update.
## Attribution
Images used in this post have been generated using multiple Machine Learning (or Artificial Intelligence) models and subsequently modified by the author.
Wherever possible these have been identified with the following symbol:

## References

## Citation
```bibtex
@online{miah2025,
  author = {Miah, Ashraf},
  title = {A {Practical} {Guide} to {Time} {Series} {Seasonality} \&
    {Signal} {Decomposition}},
  date = {2025-06-03},
  url = {https://blog.curiodata.pro/posts/16-time-series-seasonality/},
  langid = {en}
}
```