Kernel and graph: Two approaches for nonlinear competitive learning clustering

  • Research Article
Frontiers of Electrical and Electronic Engineering

Abstract

Competitive learning has attracted a significant amount of attention in the past decades in the field of data clustering. In this paper, we present two approaches developed by our group that address the nonlinear separability problem suffered by classical competitive learning clustering algorithms: kernel competitive learning (KCL) and graph-based multi-prototype competitive learning (GMPCL). In KCL, data points are first mapped from the input data space into a high-dimensional kernel space, where a nonlinearly separable pattern becomes a linearly separable one. Classical competitive learning is then performed in this kernel space to generate a cluster structure. To realize on-line learning in the kernel space without knowing the explicit kernel mapping, we propose a prototype descriptor, each row of which represents a prototype by the inner products between the prototype and the data points together with the squared length of the prototype. In GMPCL, a graph-based method is employed to produce an initial, coarse clustering, and a multi-prototype competitive learning is then introduced to refine this coarse clustering and discover clusters of arbitrary shape. To generate cluster boundaries of arbitrary shape, each cluster is represented by multiple prototypes, whose Voronoi subregions together approximately characterize one arbitrarily shaped cluster. Moreover, we introduce some extensions of these two approaches, with experiments demonstrating their effectiveness.
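To make the prototype-descriptor idea behind KCL concrete, the following is a minimal illustrative sketch rather than the authors' implementation. It assumes an RBF kernel, winner-take-all competition, a fixed learning rate, and random initialization at mapped data points; the function and parameter names (rbf_kernel, kernel_competitive_learning, lr, n_epochs) are hypothetical. Each prototype w_k is stored only through its inner products with the mapped data points and its squared length, so the on-line update w_k ← (1 − η)w_k + ηφ(x_j) can be carried out without the explicit mapping φ.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Pairwise RBF kernel matrix: K[i, j] = exp(-gamma * ||x_i - x_j||^2).
    sq = np.sum(X ** 2, axis=1)
    d2 = np.maximum(sq[:, None] + sq[None, :] - 2.0 * X @ X.T, 0.0)
    return np.exp(-gamma * d2)

def kernel_competitive_learning(K, n_clusters, n_epochs=20, lr=0.1, seed=0):
    """Sketch of on-line winner-take-all competitive learning in kernel space.

    Each prototype w_k is kept only through its descriptor row:
    W[k, i] = <w_k, phi(x_i)> for all data points, plus sq_len[k] = ||w_k||^2,
    so the explicit kernel mapping phi is never needed.
    """
    rng = np.random.default_rng(seed)
    n = K.shape[0]
    # Initialise each prototype at a randomly chosen mapped point phi(x_r):
    # then <w_k, phi(x_i)> = K[r, i] and ||w_k||^2 = K[r, r].
    init = rng.choice(n, size=n_clusters, replace=False)
    W = K[init, :].copy()
    sq_len = K[init, init].copy()

    for _ in range(n_epochs):
        for j in rng.permutation(n):
            # Kernel-space distances ||phi(x_j) - w_k||^2 to every prototype.
            dist = K[j, j] - 2.0 * W[:, j] + sq_len
            win = int(np.argmin(dist))  # winner-take-all
            # Update the winner, w <- (1 - lr) * w + lr * phi(x_j),
            # expressed purely through the descriptor entries.
            sq_len[win] = ((1.0 - lr) ** 2 * sq_len[win]
                           + 2.0 * lr * (1.0 - lr) * W[win, j]
                           + lr ** 2 * K[j, j])
            W[win, :] = (1.0 - lr) * W[win, :] + lr * K[j, :]

    # Assign each point to its nearest prototype in kernel space.
    dists = K.diagonal()[:, None] - 2.0 * W.T + sq_len[None, :]
    return np.argmin(dists, axis=1)
```

For example, labels = kernel_competitive_learning(rbf_kernel(X, gamma=0.5), n_clusters=3) would cluster a point set X. In plain winner-take-all updates some prototypes may never win and remain idle, which is why conscience-style mechanisms are often added in practice.

In the same illustrative spirit, the multi-prototype representation used in GMPCL's refinement stage can be sketched as follows (the graph-based initialization and the competitive refinement of the prototypes are omitted, and the names are hypothetical): each cluster owns several prototypes, a point takes the cluster label of its nearest prototype, and the union of those prototypes' Voronoi cells traces an arbitrarily shaped cluster boundary.

```python
def multi_prototype_assign(X, prototypes, proto_to_cluster):
    """Label each point with the cluster of its nearest prototype.

    prototypes       : (P, d) array of prototype coordinates
    proto_to_cluster : length-P integer array mapping prototypes to clusters
    """
    d2 = ((X[:, None, :] - prototypes[None, :, :]) ** 2).sum(axis=2)
    return np.asarray(proto_to_cluster)[np.argmin(d2, axis=1)]
```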

Author information

Corresponding author

Correspondence to Jianhuang Lai.

Additional information

Jianhuang LAI received his M.Sc. degree in applied mathematics in 1989 and his Ph.D. degree in mathematics in 1999 from Sun Yat-sen University, Guangzhou, China. He joined Sun Yat-sen University in 1989 as an Assistant Professor, where he is currently a Professor with the Department of Automation, School of Information Science and Technology, and Vice Dean of the School of Information Science and Technology. His current research interests include digital image processing, pattern recognition, multimedia communication, and wavelets and their applications. He has published over 80 scientific papers in international journals and conference proceedings on image processing and pattern recognition, e.g., IEEE TNN, IEEE TIP, IEEE TSMC (Part B), Pattern Recognition, ICCV, CVPR, and ICDM. Prof. Lai is chair of the Image and Graphics Association of Guangdong Province and serves as a standing member of the Image and Graphics Association of China.

Changdong WANG received the B.S. degree in applied mathematics in 2008 and the M.Sc. degree in computer science in 2010 from Sun Yat-sen University, Guangzhou, China. He began pursuing the Ph.D. degree at Sun Yat-sen University in September 2010. His current research interests include machine learning and data mining, with a focus on data clustering and its applications. He has published over 10 scientific papers in international journals and conferences such as Neurocomputing, Knowledge and Information Systems, IEEE TSMC-C, IEEE TKDE, and IEEE ICDM. His ICDM 2010 paper received an Honorable Mention for the Best Research Paper Award, and he received Student Travel Awards from ICDM 2010 and ICDM 2011.

About this article

Cite this article

Lai, J., Wang, C. Kernel and graph: Two approaches for nonlinear competitive learning clustering. Front. Electr. Electron. Eng. 7, 134–146 (2012). https://doi.org/10.1007/s11460-012-0159-1
