Abstract
Estimation of high-dimensional covariance structure is an important topic in statistics. Motivated by the work of Lin et al. [9], in this paper the quadratic loss function is proposed to measure the discrepancy between a real covariance matrix and a candidate covariance matrix, where the latter has a regular structure. Commonly encountered candidate structures, including MA(1), compound symmetry, AR(1), and banded Toeplitz matrices, are considered. Regularization is achieved by selecting the optimal structure from a class of candidate covariance structures, that is, by minimizing the discrepancy (the quadratic loss function) between the given matrix and the candidate covariance class. Analytical or numerical solutions to the resulting optimization problems are obtained, and simulation studies show that the proposed approach provides a reliable way to regularize covariance structures. The method is also applied to real data problems to illustrate its use.
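To give a concrete feel for the selection step, the sketch below (ours, not code from the chapter) reads the quadratic loss as the Frobenius-type discrepancy \({\text {Tr}}[({\pmb {\Sigma }}^{-1}{\pmb {\Delta }}-\mathbf{I} )^\top ({\pmb {\Sigma }}^{-1}{\pmb {\Delta }}-\mathbf{I} )]\) and picks, by a crude grid search over three of the candidate classes (banded Toeplitz omitted for brevity), the structure and parameters minimizing the loss; all function names and the grids are illustrative assumptions:

```python
import numpy as np

def quadratic_loss(Sigma, Delta):
    """Discrepancy Tr[(Sigma^{-1} Delta - I)^T (Sigma^{-1} Delta - I)]."""
    M = np.linalg.solve(Sigma, Delta) - np.eye(Sigma.shape[0])
    return np.sum(M * M)

def ma1(m, c, s):
    """MA(1) candidate: s * (I + c * T1), T1 = ones on the first off-diagonals."""
    return s * (np.eye(m) + c * (np.eye(m, k=1) + np.eye(m, k=-1)))

def cs(m, c, s):
    """Compound-symmetry candidate: diagonal s, off-diagonal c * s."""
    return s * (np.eye(m) + c * (np.ones((m, m)) - np.eye(m)))

def ar1(m, c, s):
    """AR(1) candidate: s * c^{|i-j|}."""
    i = np.arange(m)
    return s * c ** np.abs(i[:, None] - i[None, :])

def best_structure(Sigma,
                   grid_c=np.linspace(-0.4, 0.96, 69),
                   grid_s=np.linspace(0.2, 3.0, 71)):
    """Return (loss, structure, c, s) minimizing the quadratic loss over a
    coarse parameter grid, skipping non-positive-definite candidates."""
    m = Sigma.shape[0]
    best = None
    for name, make in [("MA(1)", ma1), ("CS", cs), ("AR(1)", ar1)]:
        for c in grid_c:
            for s in grid_s:
                Delta = make(m, c, s)
                if np.linalg.eigvalsh(Delta)[0] <= 1e-10:
                    continue  # candidate must stay positive definite
                loss = quadratic_loss(Sigma, Delta)
                if best is None or loss < best[0]:
                    best = (loss, name, c, s)
    return best
```

For instance, if \({\pmb {\Sigma }}\) is itself AR(1), the search recovers the AR(1) structure with essentially zero loss. In the chapter the minimizers are obtained analytically or by standard numerical optimization; the grid search here is only a sanity check of the idea.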
References
Cui, X., Li, C., Zhao, J., Zeng, L., Zhang, D., Pan, J.: Covariance structure regularization via Frobenius-norm discrepancy. Linear Algebra Appl. 510, 124–145 (2016)
Ellis, G.H., Watson, L.T.: A parallel algorithm for simple roots of polynomials. Comput. Math. Appl. 10, 107–121 (1984)
Filipiak, K., Markiewicz, A., Mieldzioc, A., Sawikowska, A.: On projection of a positive definite matrix on a cone of nonnegative definite Toeplitz matrices. Electron. J. Linear Algebra 33, 74–82 (2018)
Haff, L.R.: Empirical Bayes estimation of the multivariate normal covariance matrix. Ann. Stat. 8(3), 586–597 (1980)
Huang, C., Farewell, D., Pan, J.: A calibration method for non-positive definite covariance matrix in multivariate data analysis. J. Multivar. Anal. 157, 45–52 (2017)
James, W., Stein, C.: Estimation with quadratic loss. In: Neyman, J. (ed.) Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability, vol. 1, pp. 361–379. University of California Press, Berkeley (1961)
Kenward, M.: A method for comparing profiles of repeated measurements. Appl. Stat. 36, 296–308 (1987)
Lin, F., Jovanović, M.R.: Least-squares approximation of structured covariances. IEEE Trans. Autom. Control 54(7), 1643–1648 (2009)
Lin, L., Higham, N.J., Pan, J.: Covariance structure regularization via entropy loss function. Comput. Stat. Data Anal. 72(4), 315–327 (2014)
Muirhead, R.J.: Aspects of Multivariate Statistical Theory. Wiley, New York (1982)
Ning, L., Jiang, X., Georgiou, T.: Geometric methods for structured covariance estimation. In: American Control Conference, pp. 1877–1882. IEEE (2012)
Olkin, I., Selliah, J.B.: Estimating covariance matrix in a multivariate normal distribution. In: Gupta, S.S., Moore, D.S. (eds.) Statistical Decision Theory and Related Topics, vol. II, pp. 313–326. Academic Press, New York (1977)
Pan, J., Mackenzie, G.: On modelling mean-covariance structures in longitudinal studies. Biometrika 90(1), 239–244 (2003)
Pourahmadi, M.: Joint mean-covariance models with applications to longitudinal data: unconstrained parameterisation. Biometrika 86(3), 677–690 (1999)
Potthoff, R.F., Roy, S.N.: A generalized multivariate analysis of variance model useful especially for growth curve problems. Biometrika 51, 313–326 (1964)
Ye, H., Pan, J.: Modelling of covariance structures in generalised estimating equations for longitudinal data. Biometrika 93(4), 927–941 (2006)
Acknowledgements
We would like to thank the Editor and one anonymous reviewer for their helpful comments and suggestions, which led to substantial improvements of the paper. This work was partially supported by the Natural Science Foundation of China (11761028), the Reserve Talents Foundation of Yunnan Province (No. 2015HB061), and the Reserve Talents Foundation of Honghe University (2014HB0204).
Appendix
We need the following lemma to determine the sign of the determinant of a matrix.
Lemma 4.1
If \(\mathbf {A}\) and \(\mathbf {B}\) are positive semidefinite matrices of the same order, then
and
A1. The proof of \(\mathrm{det}(\nabla ^2 L(c,\sigma ))>0\) for the MA(1) case.
We first note that the first-order partial derivatives of \(L(c,\sigma )\) are
Then the Hessian matrix is
where \(x={\text {Tr}}({\pmb {\Sigma }}^{-2}{} \mathbf{T} _1)\), \(y={\text {Tr}}({\pmb {\Sigma }}^{-2}{} \mathbf{T} _1^2)\), \(z={\text {Tr}}({\pmb {\Sigma }}^{-1}{} \mathbf{T} _1)\). Thus
According to Lemma 4.1, we have \(\mathrm{det}(\nabla ^2 L(c,\sigma ))>0\).
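The positivity claim can also be checked numerically. The check below is not part of the proof; it assumes the MA(1) candidate \(\sigma (\mathbf{I} _m + c\mathbf{T} _1)\) and reads the quadratic loss as the Frobenius-type discrepancy, which is consistent with the traces \(x\), \(y\), \(z\) above:

```python
import numpy as np

def loss_ma1(c, s, Sigma):
    """Quadratic loss for the MA(1) candidate s * (I + c * T1)."""
    m = Sigma.shape[0]
    T1 = np.eye(m, k=1) + np.eye(m, k=-1)
    M = np.linalg.solve(Sigma, s * (np.eye(m) + c * T1)) - np.eye(m)
    return np.sum(M * M)

def hessian_det(f, c, s, h=1e-5):
    """Determinant of the central-difference Hessian of f at (c, s)."""
    fcc = (f(c + h, s) - 2 * f(c, s) + f(c - h, s)) / h ** 2
    fss = (f(c, s + h) - 2 * f(c, s) + f(c, s - h)) / h ** 2
    fcs = (f(c + h, s + h) - f(c + h, s - h)
           - f(c - h, s + h) + f(c - h, s - h)) / (4 * h ** 2)
    return fcc * fss - fcs ** 2
```

For example, with \({\pmb {\Sigma }}=\mathbf{I} _4\) the loss reduces to \(4(\sigma -1)^2+6c^2\sigma ^2\), and at \((c,\sigma )=(0.2,1.0)\) the Hessian determinant equals \(78.72>0\).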
A2. The proof of \(\mathrm{det}(\nabla ^2 L(c,\sigma ))>0\) for the CS case.
Since
where \(u={\text {Tr}}({\pmb {\Sigma }}^{-1}(\mathbf{J} _m-\mathbf{I} _m))\), \(v={\text {Tr}}({\pmb {\Sigma }}^{-2}(\mathbf{J} _m-\mathbf{I} _m)^2)\), \(w={\text {Tr}}({\pmb {\Sigma }}^{-2}(\mathbf{J} _m-\mathbf{I} _m))\). Then, by Lemma 4.1, we have
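The CS case admits the same kind of numerical sanity check, assuming the candidate \(\sigma (\mathbf{I} _m + c(\mathbf{J} _m-\mathbf{I} _m))\); again the code is an illustration of ours, not the chapter's:

```python
import numpy as np

def loss_cs(c, s, Sigma):
    """Quadratic loss for the CS candidate s * (I + c * (J - I))."""
    m = Sigma.shape[0]
    B = np.ones((m, m)) - np.eye(m)  # J_m - I_m
    M = np.linalg.solve(Sigma, s * (np.eye(m) + c * B)) - np.eye(m)
    return np.sum(M * M)

def hessian_det(f, c, s, h=1e-5):
    """Determinant of the central-difference Hessian of f at (c, s)."""
    fcc = (f(c + h, s) - 2 * f(c, s) + f(c - h, s)) / h ** 2
    fss = (f(c, s + h) - 2 * f(c, s) + f(c, s - h)) / h ** 2
    fcs = (f(c + h, s + h) - f(c + h, s - h)
           - f(c - h, s + h) + f(c - h, s - h)) / (4 * h ** 2)
    return fcc * fss - fcs ** 2
```

With \({\pmb {\Sigma }}=\mathbf{I} _4\) the loss reduces to \(4(\sigma -1)^2+12c^2\sigma ^2\), giving determinant \(122.88>0\) at \((c,\sigma )=(0.2,1.0)\).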
Copyright information
© 2021 Springer Nature Switzerland AG
Cite this chapter
Zhang, D., Cui, X., Li, C., Zhao, J., Zeng, L., Pan, J. (2021). Regularized Estimation of Covariance Structure Through Quadratic Loss Function. In: Filipiak, K., Markiewicz, A., von Rosen, D. (eds) Multivariate, Multilinear and Mixed Linear Models. Contributions to Statistics. Springer, Cham. https://doi.org/10.1007/978-3-030-75494-5_4
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-75493-8
Online ISBN: 978-3-030-75494-5
eBook Packages: Mathematics and Statistics (R0)