Introduction:
Time series analysis is an important but frequently overlooked part of machine learning. Time series are useful in forecasting, stock market analysis, and even forex trading, but the explicit time component makes them harder to work with than ordinary tabular data.
A time series is a sequence of data points ordered in time. In traditional machine learning, a dataset is simply a collection of independent observations: a model trained on it makes predictions for previously unseen cases without regard to when they occurred. A time-series dataset is different. It introduces an explicit order of dependence between observations, and a forecast of the future must take all previous observations into account.
Time series are usually assumed to be recorded at regular intervals. When observations arrive at evenly spaced points in time, the series is called a regular time series; when the spacing between observations varies, it is called an irregular time series.
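As a minimal sketch of the distinction (assuming pandas and NumPy are available; the dates and values below are illustrative assumptions), a regular series carries a fixed frequency on its index, while an irregular one does not:

import numpy as np
import pandas as pd

# Regular time series: one observation per day on an evenly spaced index
regular = pd.Series(
    np.random.randn(5),
    index=pd.date_range("2023-01-01", periods=5, freq="D"),
)

# Irregular time series: observations at unevenly spaced timestamps
irregular = pd.Series(
    np.random.randn(3),
    index=pd.to_datetime(["2023-01-01", "2023-01-04", "2023-01-12"]),
)

print(regular.index.freq)    # a fixed daily frequency is attached
print(irregular.index.freq)  # None: no fixed frequency exists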
Types of Time Series:
There are two broad types of time series, namely deterministic and non-deterministic. Let us look at each in turn:
1. Deterministic Time Series: A deterministic time series can be expressed exactly by an analytic expression and contains no random or probabilistic element. Knowing the expression (or the value and its derivatives at a given time) fully specifies the series' past and future.
2. Non-Deterministic Time Series: A non-deterministic time series cannot be described by an analytic expression because it has a random component. A series is characterised as non-deterministic if either of two conditions is met: the process generating the data is itself random, or there are missing pieces in the dataset. The sketch after this list contrasts the two types.
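As an illustrative sketch (the sine wave and random walk below are assumptions chosen for clarity, not examples from the article), the first series is deterministic because an analytic expression generates every value, while the second is non-deterministic because each value depends on random noise:

import numpy as np

t = np.arange(100)

# Deterministic: every past and future value follows exactly from sin(2*pi*t/24)
deterministic = np.sin(2 * np.pi * t / 24)

# Non-deterministic: a random walk, where each step adds unpredictable noise
rng = np.random.default_rng(0)
non_deterministic = np.cumsum(rng.normal(size=t.size))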
Time Series Components:
(Image source: https://quantdare.com/decomposition-to-improve-time-series-prediction/)
There are 4 components of a time series, which are shown in the picture above:
1. Trend: The long-term general direction of the data, moving either upward or downward.
2. Cycles: Repeated rises and falls in the data that play out over periods usually longer than a year and have no fixed length.
3. Seasonal Effect: Shorter, regular fluctuations that repeat within a fixed period of less than one year.
4. Irregular Effect: Random, short-term fluctuations that remain after the trend, cyclical, and seasonal components are removed. A decomposition sketch follows this list.
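As a minimal sketch of how these components can be separated in practice (assuming statsmodels is installed; the synthetic monthly series below is an assumption for illustration), an additive decomposition can be computed with seasonal_decompose:

import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

# Synthetic monthly series: upward trend + yearly seasonality + random noise
idx = pd.date_range("2015-01-01", periods=72, freq="MS")
t = np.arange(72)
rng = np.random.default_rng(0)
series = pd.Series(0.5 * t + 10 * np.sin(2 * np.pi * t / 12) + rng.normal(scale=2, size=72), index=idx)

# Additive decomposition into trend, seasonal, and residual (irregular) parts
result = seasonal_decompose(series, model="additive", period=12)
print(result.trend.dropna().head())
print(result.seasonal.head())
print(result.resid.dropna().head())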
Time Series Forecasting Models:
Before looking at the forecasting models themselves, let us first see what AR and MA models are.
An AR (autoregressive) model makes predictions using a linear combination of the target's own past values.
An MA (moving-average) model, another method for modelling univariate time series, stipulates that the output variable depends linearly on the current and past values of a stochastic (white-noise error) term.
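As a hedged sketch of these two building blocks (the lag orders and synthetic data are assumptions for illustration), an AR model can be fit with statsmodels' AutoReg, and a pure MA model can be fit as an ARIMA with order (0, 0, q):

import numpy as np
import pandas as pd
from statsmodels.tsa.ar_model import AutoReg
from statsmodels.tsa.arima.model import ARIMA

# Synthetic stationary series used purely for illustration
rng = np.random.default_rng(0)
y = pd.Series(rng.normal(size=200)).rolling(3).mean().dropna().reset_index(drop=True)

# AR(2): next value as a linear combination of the two previous observations
ar_fit = AutoReg(y, lags=2).fit()
print(ar_fit.params)

# MA(2): next value as a linear combination of the current and two past error terms
ma_fit = ARIMA(y, order=(0, 0, 2)).fit()
print(ma_fit.params)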
Now that we understand what AR and MA models are, let us continue with the forecasting models built on them.
1. ARIMA (Autoregressive Integrated Moving Average): This technique combines autoregression, differencing (the "integrated" part), and a moving-average model. The next step is modelled as a linear function of the differenced observations and residual errors at prior time steps. This approach is best suited for univariate time series with a trend but no seasonal component.
2. SARIMA (Seasonal Autoregressive Integrated Moving Average): The SARIMA model expresses the next step as a linear function of the differenced observations, errors, seasonally differenced observations, and seasonal errors at prior time steps. It is used to fit a univariate time series with trend and/or seasonal components.
3. Vector Autoregression (VAR): This method models the next step in each of several parallel time series using an AR approach. It is the generalisation of AR to multivariate time series, where each variable is regressed on the past values of all variables.
4. Simple Exponential Smoothing (SES): This method models the next step as an exponentially weighted linear function of the observations at prior time steps. It is most effective for univariate time series without trend or seasonal components. A combined fitting sketch for all four models follows this list.
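As a minimal sketch of fitting the four models above (assuming statsmodels; the synthetic data and the model orders below are illustrative assumptions, not tuned values):

import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.tsa.statespace.sarimax import SARIMAX
from statsmodels.tsa.api import VAR
from statsmodels.tsa.holtwinters import SimpleExpSmoothing

rng = np.random.default_rng(0)
idx = pd.date_range("2018-01-01", periods=60, freq="MS")
t = np.arange(60)

# Univariate monthly series with a trend and yearly seasonality
y = pd.Series(t + 5 * np.sin(2 * np.pi * t / 12) + rng.normal(size=60), index=idx)

# 1. ARIMA: the trend is handled by differencing (d=1); no seasonal terms
arima_forecast = ARIMA(y, order=(1, 1, 1)).fit().forecast(steps=3)

# 2. SARIMA: adds a seasonal order (P, D, Q, s) with s=12 for monthly data
sarima_forecast = SARIMAX(y, order=(1, 1, 1), seasonal_order=(1, 1, 1, 12)).fit(disp=False).forecast(steps=3)

# 3. VAR: two parallel series (a multivariate time series) modelled jointly
df = pd.DataFrame({"a": rng.normal(size=60).cumsum(), "b": rng.normal(size=60).cumsum()}, index=idx)
var_fit = VAR(df).fit(maxlags=2)
var_forecast = var_fit.forecast(df.values[-var_fit.k_ar:], steps=3)

# 4. SES: exponentially weighted average of past values; no trend or seasonality assumed
flat = pd.Series(rng.normal(size=60), index=idx)
ses_forecast = SimpleExpSmoothing(flat).fit().forecast(3)

print(arima_forecast, sarima_forecast, var_forecast, ses_forecast, sep="\n")

Note how the choice mirrors the text: the seasonal_order argument is what lets SARIMA capture the yearly pattern that plain ARIMA ignores, while VAR works on the multivariate frame and SES on a series with no trend or seasonality.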
Applications:
1. A time series model can be used to forecast the stock’s closing price at the end of the day.
2. Time series models can be used to forecast the volume of product sales.
3. Time series models can forecast a hospital’s birth or death rate.