Abstract
The problem of learning a graphical model (the graphical model selection problem) consists in recovering the conditional dependence structure (concentration graph) from data given as a sample of observations of a random vector. Various algorithms for solving this problem are known. One class of algorithms is based on a convex optimization problem with an additional lasso regularization term; such algorithms are called graphical lasso algorithms. Their properties and practical efficiency have been investigated in the literature. In the present paper we study the sensitivity of the uncertainty (level of error) of graphical lasso algorithms to changes in the distribution of the random vector, an issue that is not yet well studied. First, we show that the uncertainty of the classical version of the graphical lasso algorithm is very sensitive to a change of distribution. Next, we suggest simple modifications of this algorithm that are much more robust over a large class of distributions. Finally, we discuss future development of the proposed approach.
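To make the setting concrete, the sketch below illustrates graphical model selection with a graphical lasso estimator. This is not the authors' implementation: it uses scikit-learn's `GraphicalLasso` as a stand-in for the classical algorithm the abstract refers to, with a hypothetical ground-truth chain graph (tridiagonal precision matrix) and an illustrative choice of the regularization parameter `alpha`. The concentration graph is read off from the support (non-zero pattern) of the estimated precision matrix.

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)
p = 5

# Ground-truth sparse precision (concentration) matrix: a chain graph,
# i.e. only neighbouring variables are conditionally dependent.
theta = np.eye(p) + 0.4 * (np.eye(p, k=1) + np.eye(p, k=-1))
cov = np.linalg.inv(theta)

# Sample observations of the random vector (Gaussian case).
X = rng.multivariate_normal(np.zeros(p), cov, size=2000)

# Graphical lasso: maximize the Gaussian log-likelihood of the precision
# matrix with an l1 (lasso) penalty on its entries; alpha is illustrative.
model = GraphicalLasso(alpha=0.05).fit(X)

# Recover the concentration graph from the support of the estimate.
est_edges = np.abs(model.precision_) > 1e-2
true_edges = np.abs(theta) > 0

# Fraction of correctly identified entries of the adjacency pattern;
# this kind of edge-recovery error is what "uncertainty" measures here.
accuracy = (est_edges == true_edges).mean()
```

Replacing the Gaussian sample with a heavy-tailed one (e.g. a multivariate Student-t with the same scale matrix) and comparing the recovery accuracy is the kind of sensitivity experiment the paper studies.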
References
Geman, S., Geman, D.: Stochastic relaxation, Gibbs distributions, and the Bayesian restoration of images. IEEE Trans. Pattern Anal. Mach. Intell. 6, 721–741 (1984)
Manning, C.D., Schütze, H.: Foundations of Statistical Natural Language Processing. MIT Press, Cambridge (1999)
Durbin, R., Eddy, S.R., Krogh, A., Mitchison, G.: Biological Sequence Analysis: Probabilistic Models of Proteins and Nucleic Acids. Cambridge University Press, Cambridge (1998)
Liu, J., Peissig, P., Zhang, C., Burnside, E., McCarty, C., Page, D.: Graphical-model based multiple testing under dependence, with applications to genome-wide association studies. In: Proceedings of the Conference on Uncertainty in Artificial Intelligence, vol. 2012, p. 511. NIH Public Access (2012)
Zhou, L., Wang, L., Liu, L., Ogunbona, P., Dinggang, S.: Learning discriminative Bayesian networks from high-dimensional continuous neuroimaging data. IEEE Trans. Pattern Anal. Mach. Intell. 38(11), 2269–2283 (2016)
Drton, M., Maathuis, M.H.: Structure learning in graphical modeling. Ann. Rev. Stat. Appl. 4, 365–393 (2017)
Cordoba, I., Bielza, C., Larranaga, P.: A review of Gaussian Markov models for conditional independence. J. Stat. Plann. Infer. 206, 127–144 (2020)
Friedman, J., Hastie, T., Tibshirani, R.: Sparse inverse covariance estimation with the graphical lasso. Biostatistics 9(3), 432–441 (2008)
Seal, S., Li, Q., Basner, E.B., Saba, L.M., Kechris, K.: RCFGL: rapid condition adaptive fused graphical lasso and application to modeling brain region co-expression networks. PLoS Comput. Biol. 19(1), e1010758 (2023)
Gottard, A., Pacillo, S.: Robust concentration graph model selection. Comput. Stat. Data Anal. 54(12), 3070–3079 (2010)
Anderson, T.W.: An Introduction to Multivariate Statistical Analysis, 3rd edn. Wiley Interscience, New York (2003)
Kalyagin, V.A., Koldanov, A.P., Koldanov, P.A., Pardalos, P.M.: Statistical Analysis of Graph Structures in Random Variable Networks. SO, Springer, Cham (2020). https://doi.org/10.1007/978-3-030-60293-2
Kalyagin, V., Koldanov, A., Koldanov, P.: Robust identification in random variable networks. J. Stat. Plann. Infer. 181, 30–40 (2017)
Cisneros-Velarde, P., Petersen, A., Oh, S.-Y.: Distributionally robust formulation and model selection for the graphical lasso. In: Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics (AISTATS), PMLR, vol. 108, pp. 756–765 (2020)
Chicco, D., Jurman, G.: The advantages of the Matthews correlation coefficient (MCC) over F1 score and accuracy in binary classification evaluation. BMC Genomics 21, 6 (2020)
Peng, J., Wang, P., Zhou, N., Zhu, J.: Partial correlation estimation by joint sparse regression models. J. Am. Stat. Assoc. 104(486), 735–746 (2009)
Kalyagin, V., Kostylev, I.: Graph density and uncertainty of graphical model selection algorithms, Commun. Comput. Inf. Sci. 1913, 188–201, Springer Cham (2023). https://doi.org/10.1007/978-3-031-48751-4_14
Kostenetskiy, P., Chulkevich, R., Kozyrev, V.: HPC resources of the higher school of economics. J. Phys. Conf. Ser. 1740, 012050 (2021)
Acknowledgements
Sections 2 and 3, with the general problem setting, were prepared within the framework of the Basic Research Program at the National Research University Higher School of Economics (HSE University); the results of Sects. 4, 5 and 6 on graphical lasso algorithms were obtained with support from RSF grant 22-11-00073. Numerical experiments were conducted using HSE HPC resources [18].
Copyright information
© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG
Kalyagin, V., Kostylev, I. (2024). Robustness of Graphical Lasso Optimization Algorithm for Learning a Graphical Model. In: Eremeev, A., Khachay, M., Kochetov, Y., Mazalov, V., Pardalos, P. (eds) Mathematical Optimization Theory and Operations Research. MOTOR 2024. Lecture Notes in Computer Science, vol 14766. Springer, Cham. https://doi.org/10.1007/978-3-031-62792-7_23
Print ISBN: 978-3-031-62791-0
Online ISBN: 978-3-031-62792-7