Operational Absolutely Optimal Dynamic Control of the Stochastic Differential Plant’s State by Its Output

  • CONTROL IN STOCHASTIC SYSTEMS AND UNDER UNCERTAINTY
Journal of Computer and Systems Sciences International

Abstract

The problem of synthesizing an average-optimal control law for a dynamic plant subject to random disturbances is considered for the case when its state variables are measured only partially or with random errors. Using the method of a posteriori sufficient coordinates (SCs), the complexity of constructing the well-known interval-optimal Mortensen controller is described, and a much simpler algorithm for finding its operational-optimal analog is obtained. The new controller does not require solving the corresponding Bellman equation backward in time, since it is optimal in the sense of a time-varying criterion. This makes it possible to disregard information about the future behavior of the plant and reduces the procedure of finding the dependence of the control on the sufficient coordinates to forward-in-time integration of the Fokker–Planck–Kolmogorov equation and to solving a parametric nonlinear programming problem. The application of the obtained algorithm is demonstrated by the example of a linear-quadratic-Gaussian problem, for which a new operational version of the well-known separation theorem is formulated. It represents the stochastic controller as a combination of a linear Kalman–Bucy filter and a linear operational-optimal positional controller. The latter differs from the traditional interval-optimal controller in its explicitly known gain and does not require solving the corresponding matrix Riccati equation backward in time.
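
For orientation, the classical separation-theorem structure that the paper's result modifies can be sketched in standard LQG notation; the symbols below are generic assumptions of this sketch and are not taken from the paper. For the linear plant and observation model

$$dx = (Ax + Bu)\,dt + dw, \qquad dz = Cx\,dt + dv,$$

with white noises of intensities $Q$ and $V$, the Kalman–Bucy filter propagates the state estimate forward in time:

$$d\hat{x} = (A\hat{x} + Bu)\,dt + PC^{\top}V^{-1}\,(dz - C\hat{x}\,dt), \qquad \dot{P} = AP + PA^{\top} + Q - PC^{\top}V^{-1}CP.$$

The traditional interval-optimal controller of the separation theorem, $u = -R^{-1}B^{\top}S(t)\,\hat{x}$, takes its gain from the control Riccati equation

$$\dot{S} = -\bigl(A^{\top}S + SA - SBR^{-1}B^{\top}S + Q_{c}\bigr), \qquad S(T) = F,$$

which must be integrated backward from the terminal time $T$. It is precisely this reverse-time computation that the operational-optimal positional controller described in the abstract avoids; its gain, derived in the paper, is determined by the current-time criterion alone and is not reproduced here.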

REFERENCES

  1. R. L. Stratonovich, “Toward the theory of optimal control. Sufficient coordinates,” Autom. Remote Control 23 (7), 910–917 (1962).

  2. R. E. Mortensen, “Stochastic optimal control with noisy observations,” Int. J. Control 4 (5), 455–466 (1966).

  3. M. H. A. Davis and P. P. Varaiya, “Dynamic programming conditions for partially observable stochastic systems,” SIAM J. Control 11 (2), 226–262 (1973).

  4. Yu. I. Paraev, Introduction to Statistical Dynamics of Control and Filtering Processes (Sov. radio, Moscow, 1976) [in Russian].

  5. V. E. Benes and I. Karatzas, “On the relation of Zakai’s and Mortensen’s equations,” SIAM J. Control Optim. 21 (3), 472–489 (1983).

  6. A. Bensoussan, Stochastic Control of Partially Observable Systems (Cambridge Univ. Press, Cambridge, 1992).

  7. E. A. Rudenko, “Operational-optimal finite-dimensional dynamic controller of the stochastic differential plant’s state according to its output: I. General nonlinear case,” J. Comput. Syst. Sci. Int. 61 (5), 724–740 (2022).

  8. W. M. Wonham, “On the separation theorem of stochastic control,” SIAM J. Control 6 (2), 312–326 (1968).

  9. V. S. Verba, V. I. Merkulov, and E. A. Rudenko, “Linear-cubic locally optimal control of linear systems and its application for aircraft guidance,” J. Comput. Syst. Sci. Int. 59 (5), 768–780 (2020).

  10. V. S. Pugachev and I. N. Sinitsyn, Stochastic Differential Systems: Analysis and Filtering (Nauka, Moscow, 1985) [in Russian].

  11. I. N. Sinitsyn, Kalman and Pugachev Filters (Logos, Moscow, 2007) [in Russian].

  12. A. V. Panteleev, E. A. Rudenko, and A. S. Bortakovskii, Nonlinear Control Systems: Description, Analysis, and Synthesis (Vuzovskaya kniga, Moscow, 2008) [in Russian].

  13. E. A. Rudenko, “Optimal structure of continuous nonlinear reduced-order Pugachev filter,” J. Comput. Syst. Sci. Int. 52 (6), 866–902 (2013).

  14. A. N. Shiryaev, Probability (Nauka, Moscow, 1980) [in Russian].

  15. K. Brammer and G. Siffling, Kalman-Bucy-Filter, Deterministische Beobachtung und Stochastische Filterung (R. Oldenbourg, München, 1975; Nauka, Moscow, 1982).

Author information

Corresponding author

Correspondence to E. A. Rudenko.

Ethics declarations

The author declares that he has no conflicts of interest.

Additional information

Publisher’s Note.

Pleiades Publishing remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Rudenko, E.A. Operational Absolutely Optimal Dynamic Control of the Stochastic Differential Plant’s State by Its Output. J. Comput. Syst. Sci. Int. 62, 233–247 (2023). https://doi.org/10.1134/S1064230723020168

