Relaxed 2-D Principal Component Analysis by Lp Norm for Face Recognition

  • Conference paper in Intelligent Computing Theories and Application (ICIC 2019)
  • Part of the book series: Lecture Notes in Computer Science (LNCS, volume 11643)

Abstract

A relaxed two-dimensional principal component analysis (R2DPCA) approach is proposed for face recognition. Unlike 2DPCA, 2DPCA-L1, and G2DPCA, R2DPCA utilizes the label information (if known) of training samples to calculate a relaxation vector and assigns a weight to each subset of the training data. A new relaxed scatter matrix is defined, and the computed projection axes increase the accuracy of face recognition. The optimal Lp-norms are selected from a reasonable range. Numerical experiments on practical face databases indicate that R2DPCA has high generalization ability and can achieve a higher recognition rate than state-of-the-art methods.
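The idea behind the method can be sketched in code. The snippet below is a minimal illustration, not the authors' reference implementation: it builds a class-weighted ("relaxed") image scatter matrix from labeled training images and extracts projection axes by eigendecomposition, which corresponds to the L2 special case (p = 2); the particular choice of relaxation weights (inverse class size) is an assumption made here purely for illustration.

```python
import numpy as np

def relaxed_scatter(images, labels):
    """Class-weighted image scatter matrix (illustrative sketch).

    images : (n, h, w) array of training images
    labels : (n,) integer class labels; each class receives a
             hypothetical relaxation weight (inverse class size here).
    """
    n, h, w = images.shape
    X = images - images.mean(axis=0)          # center the samples
    classes, counts = np.unique(labels, return_counts=True)
    weight = {c: 1.0 / k for c, k in zip(classes, counts)}
    S = np.zeros((w, w))
    for xi, yi in zip(X, labels):
        S += weight[yi] * xi.T @ xi           # weighted image scatter
    return S / n

def projection_axes(S, d):
    """Top-d eigenvectors of the scatter matrix (the p = 2 case)."""
    vals, vecs = np.linalg.eigh(S)            # ascending eigenvalues
    return vecs[:, np.argsort(vals)[::-1][:d]]

# Toy usage with synthetic data
rng = np.random.default_rng(0)
imgs = rng.normal(size=(12, 8, 6))            # 12 images of size 8 x 6
lab = np.repeat(np.arange(3), 4)              # 3 classes, 4 images each
W = projection_axes(relaxed_scatter(imgs, lab), d=2)
features = imgs @ W                           # (12, 8, 2) projected features
```

For general p, the paper replaces this closed-form eigendecomposition with an iterative Lp-norm maximization over the projection axes; the sketch above only shows how label-derived weights enter the scatter matrix.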



Acknowledgments

This paper is supported in part by the National Natural Science Foundation of China under grant 11771188.

Author information

Corresponding author

Correspondence to Zhigang Jia.


Copyright information

© 2019 Springer Nature Switzerland AG

About this paper


Cite this paper

Chen, X., Jia, Z., Cai, Y., Zhao, M. (2019). Relaxed 2-D Principal Component Analysis by Lp Norm for Face Recognition. In: Huang, D.-S., Bevilacqua, V., Premaratne, P. (eds.) Intelligent Computing Theories and Application. ICIC 2019. Lecture Notes in Computer Science, vol. 11643. Springer, Cham. https://doi.org/10.1007/978-3-030-26763-6_19


  • DOI: https://doi.org/10.1007/978-3-030-26763-6_19

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-26762-9

  • Online ISBN: 978-3-030-26763-6

  • eBook Packages: Computer Science; Computer Science (R0)
