Multiobjective Conjugate Gradient Methods on Riemannian Manifolds

Published in: Journal of Optimization Theory and Applications

Abstract

In this paper, we present conjugate gradient methods for multiobjective optimization on Riemannian manifolds. The concepts of Pareto optimality and the Wolfe conditions, as well as Zoutendijk's theorem, are extended to this setting. We show that, under standard assumptions, a sequence generated by these algorithms converges to a Pareto critical point whenever the step sizes satisfy the multiobjective Wolfe conditions. In particular, we propose Riemannian multiobjective analogues of the Fletcher–Reeves, Dai–Yuan, Polak–Ribière–Polyak, and Hestenes–Stiefel parameters, analyze the convergence behavior of the first two methods, and test their performance against the steepest descent method.
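To illustrate the conjugate gradient template the abstract refers to, here is a minimal single-objective sketch on the unit sphere using a Fletcher–Reeves parameter. This is our own illustrative code, not the authors' method: the paper's multiobjective algorithms replace the gradient with the solution of a subproblem over all objectives and require step sizes satisfying the multiobjective Wolfe conditions, whereas this sketch uses simple Armijo backtracking and projection-based vector transport. All function names below are ours.

```python
import numpy as np

def project_tangent(x, v):
    """Orthogonally project v onto the tangent space of the unit sphere at x."""
    return v - np.dot(x, v) * x

def retract(x, v):
    """Retraction on the sphere: move along v in the ambient space, renormalize."""
    y = x + v
    return y / np.linalg.norm(y)

def riemannian_cg_fr(f, grad_f, x0, iters=100):
    """Fletcher–Reeves conjugate gradient on the unit sphere.

    Step sizes come from Armijo backtracking (the paper instead requires
    Wolfe conditions); the previous search direction is transported by
    projecting it onto the new tangent space.
    """
    x = x0 / np.linalg.norm(x0)
    g = project_tangent(x, grad_f(x))
    d = -g
    for _ in range(iters):
        if np.dot(g, d) > -1e-12:      # restart if d is not a descent direction
            d = -g
        t = 1.0                        # Armijo backtracking line search
        while f(retract(x, t * d)) > f(x) + 1e-4 * t * np.dot(g, d):
            t *= 0.5
            if t < 1e-12:
                break
        x_new = retract(x, t * d)
        g_new = project_tangent(x_new, grad_f(x_new))
        # Fletcher–Reeves parameter: ratio of squared Riemannian gradient norms.
        beta = np.dot(g_new, g_new) / max(np.dot(g, g), 1e-16)
        d = -g_new + beta * project_tangent(x_new, d)
        x, g = x_new, g_new
    return x

# Example: minimize f(x) = x^T A x on the sphere; the minimizer is the
# eigenvector associated with the smallest eigenvalue of A.
A = np.diag([3.0, 2.0, 1.0])
x_star = riemannian_cg_fr(lambda x: x @ A @ x,
                          lambda x: 2.0 * A @ x,
                          np.array([1.0, 1.0, 1.0]))
```

The projection-based transport and retraction used here are standard choices on the sphere; in the multiobjective setting the quantity `g` would be replaced by the steepest descent direction obtained from a min-max subproblem over the objectives.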


References

  1. Absil, P.A., Mahony, R., Sepulchre, R.: Optimization Algorithms on Matrix Manifolds. Princeton University Press, Princeton (2008)

  2. Bello Cruz, J.Y., Lucambio Pérez, L.R., Melo, J.G.: Convergence of the projected gradient method for quasiconvex multiobjective optimization. Nonlinear Anal. Theory Methods Appl. 74(16), 5268–5273 (2011)

  3. Bento, G.C., Ferreira, O.P., Oliveira, P.R.: Unconstrained steepest descent method for multicriteria optimization on Riemannian manifolds. J. Optim. Theory Appl. 154(1), 88–107 (2012)

  4. Bento, G.C., Neto, J.C.: A subgradient method for multiobjective optimization on Riemannian manifolds. J. Optim. Theory Appl. 159(1), 125–137 (2013)

  5. Bento, G.C., Neto, J.C., Meireles, L.V.: Proximal point method for locally Lipschitz functions in multiobjective optimization of Hadamard manifolds. J. Optim. Theory Appl. 179(1), 37–52 (2018)

  6. Bento, G.C., Neto, J.C., Santos, P.: An inexact steepest descent method for multicriteria optimization on Riemannian manifolds. J. Optim. Theory Appl. 159, 108–124 (2013)

  7. Bonnel, H., Iusem, A.N., Svaiter, B.F.: Proximal methods in vector optimization. SIAM J. Optim. 15, 953–970 (2005)

  8. Boumal, N.: An Introduction to Optimization on Smooth Manifolds. Cambridge University Press, Cambridge (2023)

  9. Cai, T., Song, L., Li, G., Liao, M.: Multi-task learning with Riemannian optimization. In: ICIC 2021: Intelligent Computing Theories and Application. Lecture Notes in Computer Science 12837, 499–509 (2021)

  10. Carrizo, G.A., Lotito, P.A., Maciel, M.C.: Trust-region globalization strategy for the nonconvex unconstrained multiobjective optimization problem. Math. Program. 159, 339–369 (2016)

  11. Carrizosa, E., Frenk, J.B.G.: Dominating sets for convex functions with some applications. J. Optim. Theory Appl. 96(2), 281–295 (1998)

  12. Dai, Y.H., Yuan, Y.: A nonlinear conjugate gradient method with a strong global convergence property. SIAM J. Optim. 10(1), 177–182 (1999)

  13. Das, I., Dennis, J.E.: A closer look at drawbacks of minimizing weighted sums of objectives for Pareto set generation in multicriteria optimization problems. Struct. Optim. 14, 63–69 (1997)

  14. Das, I., Dennis, J.E.: Normal-boundary intersection: a new method for generating the Pareto surface in nonlinear multicriteria optimization problems. SIAM J. Optim. 8(3), 631–657 (1998)

  15. Deb, K.: Multiobjective Optimization using Evolutionary Algorithms. Wiley, New York (2001)

  16. Eschenauer, H., Koski, J., Osyczka, A.: Multicriteria Design Optimization. Springer, Berlin (1990)

  17. Eslami, N., Najafi, B., Vaezpour, S.M.: A trust-region method for solving multicriteria optimization problems on Riemannian manifolds. J. Optim. Theory Appl. 196(1), 212–239 (2023)

  18. Evans, G.W.: An overview of techniques for solving multiobjective mathematical programs. Manage. Sci. 30(11), 1268–1282 (1984)

  19. Fletcher, R., Reeves, C.M.: Function minimization by conjugate gradients. Comput. J. 7(2), 149–154 (1964)

  20. Fliege, J.: OLAF—a general modeling system to evaluate and optimize the location of an air polluting facility. OR Spektrum 23, 117–136 (2001)

  21. Fliege, J., Graña Drummond, L.M., Svaiter, B.F.: Newton's method for multiobjective optimization. SIAM J. Optim. 20(2), 602–626 (2009)

  22. Fliege, J., Svaiter, B.F.: Steepest descent methods for multicriteria optimization. Math. Methods Oper. Res. 51(3), 479–494 (2000)

  23. Fliege, J., Vicente, L.N.: Multicriteria approach to bilevel optimization. J. Optim. Theory Appl. 131(2), 209–225 (2006)

  24. Fukuda, E.H., Graña Drummond, L.M.: On the convergence of the projected gradient method for vector optimization. Optimization 60(8–9), 1009–1021 (2011)

  25. Fukuda, E.H., Graña Drummond, L.M.: Inexact projected gradient method for vector optimization. Comput. Optim. Appl. 54(3), 473–493 (2013)

  26. Gass, S., Saaty, T.: The computational algorithm for the parametric objective function. Naval Res. Logist. Quart. 2(1–2), 39–45 (1955)

  27. Geoffrion, A.M.: Proper efficiency and the theory of vector maximization. J. Math. Anal. Appl. 22(3), 618–630 (1968)

  28. Graña Drummond, L.M., Iusem, A.: A projected gradient method for vector optimization problems. Comput. Optim. Appl. 28, 5–29 (2004)

  29. Graña Drummond, L.M., Raupp, F.M.P., Svaiter, B.F.: A quadratically convergent Newton method for vector optimization. Optimization 63(5), 661–677 (2014)

  30. Hestenes, M.R., Stiefel, E.: Methods of conjugate gradients for solving linear systems. J. Res. Natl. Bur. Stand. 49, 409–435 (1952)

  31. Lu, F., Chen, C.R.: Newton-like methods for solving vector optimization problems. Appl. Anal. 93(8), 1567–1586 (2014)

  32. Lucambio Pérez, L.R., Prudente, L.F.: Nonlinear conjugate gradient methods for vector optimization. SIAM J. Optim. 28(3), 2690–2720 (2018)

  33. Polak, E., Ribière, G.: Note sur la convergence de méthodes de directions conjuguées. Revue française d'informatique et de recherche opérationnelle. Série rouge 3(R1), 35–43 (1969)

  34. Ring, W., Wirth, B.: Optimization methods on Riemannian manifolds and their application to shape space. SIAM J. Optim. 22(2), 596–627 (2012)

  35. Sato, H.: Riemannian Optimization and its Applications. Springer, New York (2021)

  36. Sato, H., Iwai, T.: A new, globally convergent Riemannian conjugate gradient method. Optimization 64(4), 1011–1031 (2015)

  37. Zitzler, E., Deb, K., Thiele, L.: Comparison of multiobjective evolutionary algorithms: empirical results. Evol. Comput. 8(2), 173–195 (2000)

Acknowledgements

The authors are deeply grateful to the editor and anonymous referees, whose patience and numerous detailed comments greatly enhanced the quality of this paper.

Author information

Corresponding author

Correspondence to Masoud Hajarian.

Additional information

Communicated by Sándor Zoltán Németh.



About this article

Cite this article

Najafi, S., Hajarian, M. Multiobjective Conjugate Gradient Methods on Riemannian Manifolds. J Optim Theory Appl 197, 1229–1248 (2023). https://doi.org/10.1007/s10957-023-02224-1

