Abstract
Machine learning (ML) based time series forecasting models often require and assume a certain degree of stationarity in the data when producing forecasts. In many real-world situations, however, the data distributions are not stationary; they change over time, reducing the accuracy of the forecasting models, a phenomenon known in the ML literature as concept drift. Handling concept drift is essential for many ML methods in use today, yet prior work has proposed concept drift handling methods only in the classification domain. To fill this gap, we explore concept drift handling methods specifically for Global Forecasting Models (GFMs), which have recently gained popularity in the forecasting domain. We propose two new concept drift handling methods, namely Error Contribution Weighting (ECW) and Gradient Descent Weighting (GDW), based on a continuous adaptive weighting concept. Both methods use two forecasting models, one trained with only the most recent series and one trained with all series, and take the weighted average of the forecasts produced by the two models as the final forecast. Using LightGBM as the underlying base learner, in our evaluation on three simulated datasets the proposed models achieve significantly higher accuracy than a set of statistical benchmarks and LightGBM baselines across four evaluation metrics.
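The continuous adaptive weighting idea described above can be illustrated with a minimal sketch. This is not the chapter's exact ECW algorithm; the function name `ecw_combine` and its interface are hypothetical, and it only assumes that each of the two models (one trained on the most recent series, one trained on all series) exposes a forecast and a recent error measure, with each model weighted by the other model's share of the total error so that the lower-error model receives the higher weight:

```python
import numpy as np

def ecw_combine(f_recent, f_all, e_recent, e_all, eps=1e-8):
    """Combine two forecasts with weights derived from each model's
    recent error contribution (hypothetical ECW-style sketch).

    f_recent, f_all : forecasts from the recent-series and all-series models
    e_recent, e_all : recent forecast errors of the two models (e.g. MAE)
    """
    # Error contribution: each model's share of the total recent error.
    total = e_recent + e_all + eps
    c_recent, c_all = e_recent / total, e_all / total
    # Weight each model by the OTHER model's error contribution,
    # so the model with the lower error gets the higher weight.
    w_recent, w_all = c_all, c_recent
    return w_recent * np.asarray(f_recent) + w_all * np.asarray(f_all)
```

For example, if the recent-series model has a recent error of 1.0 and the all-series model an error of 3.0, the recent-series model receives weight 0.75 and dominates the combined forecast.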
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this chapter
Cite this chapter
Liu, Z., Godahewa, R., Bandara, K., Bergmeir, C. (2023). Handling Concept Drift in Global Time Series Forecasting. In: Hamoudia, M., Makridakis, S., Spiliotis, E. (eds) Forecasting with Artificial Intelligence. Palgrave Advances in the Economics of Innovation and Technology. Palgrave Macmillan, Cham. https://doi.org/10.1007/978-3-031-35879-1_7
Print ISBN: 978-3-031-35878-4
Online ISBN: 978-3-031-35879-1
eBook Packages: Economics and Finance (R0)