Abstract
Most physical processes possess structural properties such as constant energies, volumes, and other invariants over time. When learning models of such dynamical systems, it is critical to respect these invariants in order to ensure accurate predictions and physically meaningful behavior. Strikingly, state-of-the-art methods for Gaussian process (GP) dynamics model learning do not address this issue. Classical numerical integrators, on the other hand, are specifically designed to preserve these crucial properties through time. We propose to combine the advantages of GPs as function approximators with structure-preserving numerical integrators for dynamical systems, such as Runge-Kutta methods. These integrators assume access to the ground-truth dynamics and require evaluations at intermediate and future time steps that are unknown in a learning-based setting. This makes direct inference of the GP dynamics, with the numerical scheme embedded, intractable. As our key technical contribution, we enable inference through the implicitly defined Runge-Kutta transition probability. In a nutshell, we introduce an implicit layer for GP regression, which is embedded into a variational inference scheme for model learning.
Code will be made available upon request.
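To make the structure-preserving idea concrete, here is a minimal, hypothetical sketch (not the authors' implementation) of a single implicit Runge-Kutta step: the implicit midpoint rule, a symplectic method, solved by fixed-point iteration. The vector field `f` stands in for a learned GP posterior mean; here it is a known harmonic oscillator so that energy conservation can be checked directly.

```python
import numpy as np

def f(x):
    # Harmonic-oscillator vector field; a stand-in for a learned GP mean.
    q, p = x
    return np.array([p, -q])

def implicit_midpoint_step(x, h, tol=1e-12, max_iter=100):
    # Solve x_next = x + h * f((x + x_next) / 2) by fixed-point iteration.
    # The update is implicit: x_next appears on both sides of the equation.
    x_next = x + h * f(x)  # explicit Euler step as initial guess
    for _ in range(max_iter):
        x_new = x + h * f(0.5 * (x + x_next))
        if np.linalg.norm(x_new - x_next) < tol:
            return x_new
        x_next = x_new
    return x_next

def energy(x):
    # Conserved Hamiltonian of the harmonic oscillator.
    return 0.5 * (x[0] ** 2 + x[1] ** 2)

x = np.array([1.0, 0.0])
h = 0.1
traj = [x]
for _ in range(1000):
    x = implicit_midpoint_step(x, h)
    traj.append(x)

drift = abs(energy(traj[-1]) - energy(traj[0]))
print(drift)  # stays near the solver tolerance
```

The implicit midpoint rule conserves quadratic invariants such as this energy exactly (up to the fixed-point tolerance), whereas an explicit Euler step of the same size would drift systematically; in the paper's setting, the GP posterior over `f` makes this implicit transition a distribution, which is what motivates inference through the implicitly defined transition probability.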
Acknowledgements
The authors thank Barbara Rakitsch, Alexander von Rohr and Mona Buisson-Fenet for helpful discussions.
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Ensinger, K., Solowjow, F., Ziesche, S., Tiemann, M., Trimpe, S. (2023). Structure-Preserving Gaussian Process Dynamics. In: Amini, M.R., Canu, S., Fischer, A., Guns, T., Kralj Novak, P., Tsoumakas, G. (eds) Machine Learning and Knowledge Discovery in Databases. ECML PKDD 2022. Lecture Notes in Computer Science, vol 13717. Springer, Cham. https://doi.org/10.1007/978-3-031-26419-1_9
Print ISBN: 978-3-031-26418-4
Online ISBN: 978-3-031-26419-1