Structure-Preserving Gaussian Process Dynamics

  • Conference paper
Machine Learning and Knowledge Discovery in Databases (ECML PKDD 2022)

Abstract

Most physical processes possess structural properties such as constant energies, volumes, and other invariants over time. When learning models of such dynamical systems, it is critical to respect these invariants to ensure accurate predictions and physically meaningful behavior. Strikingly, state-of-the-art methods in Gaussian process (GP) dynamics model learning do not address this issue. Classical numerical integrators, on the other hand, are specifically designed to preserve these crucial properties through time. We propose to combine the advantages of GPs as function approximators with structure-preserving numerical integrators for dynamical systems, such as Runge-Kutta methods. These integrators assume access to the ground-truth dynamics and require evaluations at intermediate and future time steps that are unknown in a learning-based scenario. This makes direct inference of the GP dynamics with an embedded numerical scheme intractable. As our key technical contribution, we enable inference through the implicitly defined Runge-Kutta transition probability. In a nutshell, we introduce an implicit layer for GP regression, which is embedded into a variational inference scheme for model learning.

Code will be published upon request.
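The central idea of the abstract, embedding a structure-preserving Runge-Kutta integrator whose update is only defined implicitly, can be made concrete with a small example. The following is a minimal NumPy sketch, not the authors' implementation: it uses the implicit midpoint rule (a symplectic Runge-Kutta method) and resolves the implicit update by fixed-point iteration, and the hand-coded vector field f stands in for the learned GP dynamics (e.g., a sampled posterior function). In the paper, inference through this implicit transition is instead handled inside a variational scheme.

```python
import numpy as np

def implicit_midpoint_step(f, x, h, n_iter=100, tol=1e-12):
    """One step of the implicit midpoint rule
        x_next = x + h * f((x + x_next) / 2),
    a symplectic Runge-Kutta method. The update is implicit in x_next,
    so we resolve it by fixed-point iteration from an explicit-Euler guess.
    """
    x_next = x + h * f(x)
    for _ in range(n_iter):
        x_new = x + h * f(0.5 * (x + x_next))
        if np.linalg.norm(x_new - x_next) < tol:
            return x_new
        x_next = x_new
    return x_next

# Harmonic oscillator with Hamiltonian H(q, p) = (q^2 + p^2) / 2.
# Here f is the known vector field; a learned GP dynamics model
# (e.g., a posterior sample) would take its place.
f = lambda z: np.array([z[1], -z[0]])

z = np.array([1.0, 0.0])
for _ in range(1000):
    z = implicit_midpoint_step(f, z, h=0.1)

# The implicit midpoint rule preserves quadratic invariants, so the
# energy remains at 0.5 up to solver tolerance even after long rollouts.
print(z, 0.5 * (z[0] ** 2 + z[1] ** 2))
```

An explicit method such as forward Euler drifts in energy over the same horizon; it is precisely the implicitness of structure-preserving schemes that makes direct GP inference intractable and motivates the implicit-layer construction described above.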



Acknowledgements

The authors thank Barbara Rakitsch, Alexander von Rohr and Mona Buisson-Fenet for helpful discussions.

Author information


Correspondence to Katharina Ensinger.


Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary material 1 (pdf 1078 KB)


Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Ensinger, K., Solowjow, F., Ziesche, S., Tiemann, M., Trimpe, S. (2023). Structure-Preserving Gaussian Process Dynamics. In: Amini, MR., Canu, S., Fischer, A., Guns, T., Kralj Novak, P., Tsoumakas, G. (eds) Machine Learning and Knowledge Discovery in Databases. ECML PKDD 2022. Lecture Notes in Computer Science, vol 13717. Springer, Cham. https://doi.org/10.1007/978-3-031-26419-1_9

Download citation

  • DOI: https://doi.org/10.1007/978-3-031-26419-1_9


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-26418-4

  • Online ISBN: 978-3-031-26419-1

  • eBook Packages: Computer Science, Computer Science (R0)
