Handling Concept Drift in Global Time Series Forecasting

Chapter in: Forecasting with Artificial Intelligence

Abstract

Machine learning (ML) based time series forecasting models often require and assume a certain degree of stationarity in the data when producing forecasts. However, in many real-world situations the data distributions are not stationary: they change over time, reducing the accuracy of forecasting models, a phenomenon known in the ML literature as concept drift. Handling concept drift is essential for many ML methods in use today; however, prior work only proposes methods to handle concept drift in the classification domain. To fill this gap, we explore concept drift handling methods specifically for Global Forecasting Models (GFMs), which have recently gained popularity in the forecasting domain. We propose two new concept drift handling methods, Error Contribution Weighting (ECW) and Gradient Descent Weighting (GDW), based on a continuous adaptive weighting concept. Both methods use two forecasting models, one trained on the most recent series and one on all series, and take the weighted average of the forecasts of the two models as the final forecast. Using LightGBM as the underlying base learner, in our evaluation on three simulated datasets, the proposed methods achieve significantly higher accuracy than a set of statistical benchmarks and LightGBM baselines across four evaluation metrics.
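As a rough illustration of the weighting idea described above (not the authors' exact ECW or GDW procedure), the sketch below trains two LightGBM regressors, one on all series and one on only the most recent observations, and blends their forecasts with a weight. The function name, the input arrays, and the fixed weight `w_recent` are hypothetical; in ECW and GDW the weight would instead be adapted continuously, from error contributions or via gradient descent.

```python
# Rough sketch of the continuous adaptive weighting idea from the abstract,
# NOT the authors' exact ECW/GDW implementation. Assumes tabular lag features
# have already been built from the series (X_*, y_* are hypothetical inputs).
import lightgbm as lgb

def blended_forecast(X_all, y_all, X_recent, y_recent, X_new, w_recent=0.5):
    """Blend a model trained on all series with one trained on recent data.

    w_recent is the weight given to the recent-data model. In ECW/GDW this
    weight is adapted continuously (from error contributions or by gradient
    descent); here it is a fixed parameter purely for illustration.
    """
    model_all = lgb.LGBMRegressor(n_estimators=200)     # trained on all series
    model_recent = lgb.LGBMRegressor(n_estimators=200)  # trained on recent data only
    model_all.fit(X_all, y_all)
    model_recent.fit(X_recent, y_recent)

    forecast_all = model_all.predict(X_new)
    forecast_recent = model_recent.predict(X_new)
    return w_recent * forecast_recent + (1.0 - w_recent) * forecast_all
```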


Author information

Corresponding author: Christoph Bergmeir


Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this chapter


Cite this chapter

Liu, Z., Godahewa, R., Bandara, K., Bergmeir, C. (2023). Handling Concept Drift in Global Time Series Forecasting. In: Hamoudia, M., Makridakis, S., Spiliotis, E. (eds) Forecasting with Artificial Intelligence. Palgrave Advances in the Economics of Innovation and Technology. Palgrave Macmillan, Cham. https://doi.org/10.1007/978-3-031-35879-1_7


  • DOI: https://doi.org/10.1007/978-3-031-35879-1_7

  • Publisher Name: Palgrave Macmillan, Cham

  • Print ISBN: 978-3-031-35878-4

  • Online ISBN: 978-3-031-35879-1

  • eBook Packages: Economics and Finance (R0)
