Super-Gaussian Mixture Source Model for ICA

Conference paper: Independent Component Analysis and Blind Signal Separation (ICA 2006)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 3889)

Abstract

We propose an extension of the mixture of factor (or independent component) analyzers model to include strongly super-Gaussian mixture source densities. This permits a more economical representation of densities with (possibly multiple) peaked modes or heavy tails than using several Gaussians to capture these features. We derive an EM algorithm for the maximum likelihood estimate of the model, and show that it converges globally to a local optimum of the actual non-Gaussian mixture model without requiring any approximations. This considerably extends the class of source densities that can be used in exact estimation, and shows that, in a sense, super-Gaussian densities are as natural as Gaussian densities. We also derive an adaptive Generalized Gaussian algorithm that learns the shape parameters of Generalized Gaussian mixture components. Experiments verify the validity of the algorithm.
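The exact EM result is closely related to the fact that many super-Gaussian densities can be written as Gaussian scale mixtures, in which a hidden scale variable plays the role of the missing data, so no variational approximation is needed. As a quick numerical illustration of this representation (a standard fact, not the paper's algorithm), the sketch below checks that a unit Laplacian, a canonical super-Gaussian density, arises by mixing zero-mean Gaussians whose variance is exponentially distributed with mean 2.

```python
# Monte Carlo check: Laplace(0, 1) as a Gaussian scale mixture.
# Illustrative sketch only -- not the EM algorithm from the paper.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 1_000_000

v = rng.exponential(scale=2.0, size=n)   # latent variances, E[v] = 2
x = np.sqrt(v) * rng.standard_normal(n)  # x | v ~ N(0, v)

# Laplace(0, 1) has variance 2 and excess kurtosis 3.
print("variance:", x.var())                   # ~2.0
print("excess kurtosis:", stats.kurtosis(x))  # ~3.0

# Kolmogorov-Smirnov distance to the exact Laplace CDF should be tiny.
print("KS:", stats.kstest(x, stats.laplace().cdf).statistic)
```

For the shape-learning component, the paper derives its own adaptive update for the Generalized Gaussian exponent; that update is not reproduced here. As a point of reference, a standard moment-matching baseline recovers the shape beta of p(s) proportional to exp(-(|s|/alpha)^beta) from the scale-free ratio E[s^2]/(E|s|)^2 = Gamma(1/beta) Gamma(3/beta) / Gamma(2/beta)^2, which decreases monotonically in beta (it equals 2 for the Laplacian and pi/2 for the Gaussian) and can therefore be inverted by bisection:

```python
# Moment-matching estimate of the Generalized Gaussian shape parameter.
# A standard baseline shown for orientation -- not the paper's adaptive update.
import numpy as np
from scipy.optimize import brentq
from scipy.special import gammaln

def gg_moment_ratio(beta):
    """E[s^2] / (E|s|)^2 for a Generalized Gaussian with shape beta."""
    # Gamma(1/b) Gamma(3/b) / Gamma(2/b)^2, computed in log space for stability.
    return np.exp(gammaln(1 / beta) + gammaln(3 / beta) - 2 * gammaln(2 / beta))

def estimate_shape(s, lo=0.2, hi=10.0):
    """Invert the monotone moment ratio by bisection over [lo, hi]."""
    r = np.mean(s**2) / np.mean(np.abs(s)) ** 2
    return brentq(lambda b: gg_moment_ratio(b) - r, lo, hi)

rng = np.random.default_rng(1)
s = rng.laplace(size=200_000)  # true shape beta = 1
print(estimate_shape(s))       # ~1.0
```

In a mixture setting, an update of this kind would be applied per component inside the M-step, with samples weighted by the component responsibilities.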



Copyright information

© 2006 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Palmer, J.A., Kreutz-Delgado, K., Makeig, S. (2006). Super-Gaussian Mixture Source Model for ICA. In: Rosca, J., Erdogmus, D., Príncipe, J.C., Haykin, S. (eds) Independent Component Analysis and Blind Signal Separation. ICA 2006. Lecture Notes in Computer Science, vol 3889. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11679363_106

  • DOI: https://doi.org/10.1007/11679363_106

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-32630-4

  • Online ISBN: 978-3-540-32631-1

