A comparison of learning algorithms for seasonal time series forecasting using NARX model

Hermansah, Dedi Rosadi, Abdurakhman, Herni Utami

Abstract


In this study, we propose a nonlinear autoregressive network with exogenous inputs (NARX) model with two deterministic seasonal dummy approaches, namely binary dummy variables and sine-cosine pairs. Significant lags are selected using a stepwise AIC method, with the deterministic seasonal dummies included, while the number of neurons in the hidden layer is determined by trial and error over one to five neurons. The NARX model is trained using five types of algorithms with a hyperbolic tangent activation function. Each algorithm is compared across all approaches in terms of convergence speed and forecasting accuracy. In addition, the time series data are modeled both without and with first differencing. The results of the case study show that the best approach for the NARX model uses binary dummy variables and first-differenced data. Moreover, the GRPROP algorithm shows the lowest computation time, the fewest training steps, and the best forecasting accuracy in terms of MAPE. Overall, GRPROP is the best training algorithm in this case; however, it is not stable across variations of its parameters. The RPROP algorithm, under parameter variations, shows better convergence speed and stability than backpropagation and GRPROP, and the backpropagation algorithm occasionally outperforms GRPROP under its parameter variations.
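The two deterministic seasonal dummy approaches and the first-differencing step described in the abstract can be sketched as below. This is an illustrative reconstruction, not the authors' code; the function names, the monthly period of 12, and the single Fourier harmonic are assumptions.

```python
import numpy as np

def seasonal_binary_dummies(n, period=12):
    """Binary seasonal dummies: period - 1 indicator columns
    (one season is dropped to avoid perfect collinearity)."""
    t = np.arange(n)
    D = np.zeros((n, period - 1))
    for j in range(period - 1):
        D[t % period == j, j] = 1.0
    return D

def seasonal_sine_cosine(n, period=12, harmonics=1):
    """Sine-cosine pairs: 2 * harmonics deterministic Fourier columns."""
    t = np.arange(n)
    cols = []
    for j in range(1, harmonics + 1):
        cols.append(np.sin(2.0 * np.pi * j * t / period))
        cols.append(np.cos(2.0 * np.pi * j * t / period))
    return np.column_stack(cols)

# First differencing, as in the paper's second data variant:
y = np.random.default_rng(0).normal(size=48).cumsum()
dy = np.diff(y)  # series of length n - 1

X_dummy = seasonal_binary_dummies(len(dy))    # shape (47, 11)
X_fourier = seasonal_sine_cosine(len(dy))     # shape (47, 2)
```

Either feature matrix would then be supplied alongside the selected lags as exogenous inputs to the NARX network.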


Published: 2021-08-19

How to Cite this Article:

Hermansah, Dedi Rosadi, Abdurakhman, Herni Utami, A comparison of learning algorithms for seasonal time series forecasting using NARX model, J. Math. Comput. Sci., 11 (2021), 6638-6656.

Copyright © 2021 Hermansah, Dedi Rosadi, Abdurakhman, Herni Utami. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
