
additive decomposition
time series decomposition into the sum of its components
augmented Dickey-Fuller (ADF) test
statistical test for stationarity of a time series
autocorrelation
measure of correlation between a time series and a shifted copy of itself
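As a minimal sketch of this definition (with illustrative values, not data from the text), the lag-1 autocorrelation can be estimated by correlating the series with a copy of itself shifted by one step:

```python
import numpy as np

# Illustrative series with a repeating up-down pattern.
x = np.array([3.0, 4.0, 5.0, 4.0, 3.0, 4.0, 5.0, 4.0])

# Lag-1 autocorrelation: correlate the series with itself shifted by one step.
lag = 1
r = np.corrcoef(x[:-lag], x[lag:])[0, 1]
```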
autoregressive (AR) model
model or component of a model that captures how the time series depends on its own previous values
cyclic component
large variations in the data that recur over longer time periods than seasonal fluctuations, having no fixed frequency
detrending
one of two complementary operations that separate the trend component from a time series
differencing
found by taking differences of consecutive terms, that is, x_{n+1} − x_n
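A minimal sketch of differencing, using `numpy.diff` on illustrative values; note how the first differences of a series with a growing trend are themselves a simpler, steadily increasing sequence:

```python
import numpy as np

x = np.array([2.0, 5.0, 9.0, 14.0, 20.0])

# First differences x_{n+1} - x_n; differencing is one way to
# remove a trend before modeling.
dx = np.diff(x)
```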
error
extent to which predictions differ from actual observations
exponential moving average (EMA)
type of weighted moving average in which the most recent values are given larger weights and the weights decay exponentially the further back in time the values are in the series
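As an illustration of exponentially decaying weights (with assumed sample values), pandas' `ewm` computes such an average; the `span` parameter controls how quickly older values are discounted:

```python
import pandas as pd

s = pd.Series([10.0, 11.0, 12.0, 13.0, 14.0])

# Exponential moving average: the most recent value gets the largest
# weight, and weights decay exponentially going back in time.
ema = s.ewm(span=3, adjust=True).mean()
```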
forecasting
making predictions about future, unknown values of a time series
integrative (I) component
component of the ARIMA model that represents the differencing operation
lag
number of time steps that a time series is shifted for the purpose of computing autocorrelation
level
the mean of all the time series data
mean absolute error (MAE)
measure of error; MAE = (1/n) Σ_{i=1}^{n} |ε_i|
mean absolute percentage error (MAPE)
measure of relative error; MAPE = (1/n) Σ_{i=1}^{n} |ε_i / x_i|
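A minimal sketch computing MAE and MAPE for a hypothetical forecast, where the errors are ε_i = x_i − x̂_i:

```python
import numpy as np

x = np.array([100.0, 200.0, 50.0])       # observed values (illustrative)
xhat = np.array([110.0, 190.0, 55.0])    # forecast values (illustrative)
eps = x - xhat                           # errors

mae = np.mean(np.abs(eps))               # (1/n) sum |eps_i|
mape = np.mean(np.abs(eps / x))          # (1/n) sum |eps_i / x_i|
```

Note that MAPE is scale-free (a relative error) while MAE is in the units of the data.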
measures of error (or measures of fit)
the metrics used to assess how well a model's predictions align with observed data
multiplicative decomposition
time series decomposition into the product of its components
naïve (or flat) forecasting method
using only the last observed value of a time series to predict the next term
noise (or random error)
random, unpredictable variations that occur in the time series that cannot be attributed to the trend, cycles, or seasonal components
normalized (weights)
weights or coefficients that add up to a total of 1
order (or degree)
the number of terms or lags used in a model to describe the time series; for a sequence that is modeled by a polynomial formula, the value of the exponent on the term of highest order of the polynomial formula that models the sequence
peak
when computing ACF, a correlation value that is significantly larger than the previous values
period
smallest time interval over which a seasonal variation pattern repeats
prediction interval
range of values the variable could take with some level of probability in a forecast
residuals
the difference between the given time series and the model (x_n − x̂_n), quantifying the error of the model, or noise (random variation) of the time series
root mean squared error (RMSE)
measure of error; RMSE = √( (1/n) Σ_{i=1}^{n} ε_i² )
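A minimal sketch of RMSE on illustrative errors; because the errors are squared before averaging, RMSE penalizes large errors more heavily than MAE does:

```python
import numpy as np

eps = np.array([-10.0, 10.0, -5.0])      # illustrative forecast errors

# sqrt of the mean squared error: sqrt((1/n) sum eps_i^2)
rmse = np.sqrt(np.mean(eps ** 2))
```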
scale-dependent
describing a measure of error that is in direct proportion to the scale of the data
seasonal component (seasonality)
variation in the data that occurs at fixed time periods
Seasonal-Trend decomposition using LOESS (STL)
a powerful tool that decomposes a time series into the trend-cycle, seasonal, and noise components
sequence
ordered list of numbers, measurements, or other data
simple moving average (SMA)
average (arithmetic mean) of a fixed number of consecutive data points in a time series
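As a minimal sketch (with assumed sample values), pandas' `rolling` computes a simple moving average over a fixed window of consecutive points; the first window − 1 entries are undefined because the window is not yet full:

```python
import pandas as pd

s = pd.Series([1.0, 2.0, 3.0, 4.0, 5.0])

# Simple moving average with a window of 3 consecutive terms.
sma = s.rolling(window=3).mean()
```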
stationary
characterizing a time series in which the variance is relatively constant over time, an overall upward or downward trend cannot be found, and no seasonal patterns exist
symmetric mean absolute percentage error (sMAPE)
measure of relative error; sMAPE = (1/n) Σ_{i=1}^{n} 2|ε_i| / (|x_i| + |x̂_i|)
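A minimal sketch of sMAPE on illustrative values; unlike MAPE, each error is normalized by the average magnitude of the observed and forecast values, so it is symmetric in x_i and x̂_i:

```python
import numpy as np

x = np.array([100.0, 200.0, 50.0])       # observed values (illustrative)
xhat = np.array([110.0, 190.0, 55.0])    # forecast values (illustrative)

# (1/n) sum of 2|eps_i| / (|x_i| + |xhat_i|)
smape = np.mean(2 * np.abs(x - xhat) / (np.abs(x) + np.abs(xhat)))
```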
terms
in mathematics, the individual values of a sequence
time series
data that has a time component, or an ordered sequence of data points
time series analysis
the examination of data points collected at specific time intervals, enabling the identification of trends, patterns, and seasonal variations crucial for making informed predictions and decisions
time series model
function, algorithm, or method for finding, approximating, or predicting the values of a given time series
trend
long-term direction of time series data in the absence of any other variation
trend curve (or trendline)
line or curve that models the trend of a time series
trend-cycle component
component of a time series that combines the trend and long-term cycles
volatility
displaying significant fluctuations from the mean, typically due to factors that are difficult to analyze or predict
weighted moving average (WMA)
moving average in which terms are given weights according to some formula or rule
white noise
time series data that has constant mean value close to zero, constant variance, and no correlation from one part of the series to another
window
term used to describe a fixed number of consecutive terms of a time series
Citation/Attribution

This book may not be used in the training of large language models or otherwise be ingested into large language models or generative AI offerings without OpenStax's permission.

Want to cite, share, or modify this book? This book uses the Creative Commons Attribution-NonCommercial-ShareAlike License and you must attribute OpenStax.

Attribution information
  • If you are redistributing all or part of this book in a print format, then you must include on every physical page the following attribution:
    Access for free at https://openstax.org/books/principles-data-science/pages/1-introduction
  • If you are redistributing all or part of this book in a digital format, then you must include on every digital page view the following attribution:
    Access for free at https://openstax.org/books/principles-data-science/pages/1-introduction
Citation information

© Dec 19, 2024 OpenStax. Textbook content produced by OpenStax is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike License. The OpenStax name, OpenStax logo, OpenStax book covers, OpenStax CNX name, and OpenStax CNX logo are not subject to the Creative Commons license and may not be reproduced without the prior and express written consent of Rice University.