Abstract
As a class of nonlinear subspace clustering methods, kernel subspace clustering has shown promising performance in many applications. This paper focuses on the kernel selection problem in the kernel subspace clustering model. Currently, the kernel function is typically chosen via single-kernel or multiple-kernel methods. The former relies on a predefined kernel function, which is difficult to choose beforehand in clustering tasks with limited prior information. Multiple-kernel methods usually assume that the optimal kernel lies near a set of predefined base kernels, which limits the expressive ability of the optimal kernel; they also tend to have higher solution complexity than single-kernel methods. To address these limitations, this paper uses contrastive learning to learn the optimal kernel adaptively and proposes the Contrastive Kernel Subspace Clustering (CKSC) method. Unlike multiple-kernel approaches, CKSC is not constrained by the multiple-kernel assumption. Specifically, CKSC integrates a contrastive regularization into the kernel subspace clustering model, encouraging neighboring samples in the original space to stay nearby in the reproducing kernel Hilbert space (RKHS). In this way, the resulting kernel mapping preserves the cluster structure of the data, which benefits downstream clustering tasks. Clustering experiments on seven benchmark data sets validate the effectiveness of the proposed CKSC method.
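To make the two ingredients of the abstract concrete, the following is a minimal NumPy sketch of kernel self-expressive subspace clustering with a contrastive term, not the authors' implementation: a fixed RBF kernel stands in for the kernel that CKSC learns adaptively, and an InfoNCE-style loss over each sample's nearest neighbors stands in for CKSC's contrastive regularization. All function names, the closed-form ridge solution, and hyperparameters (`gamma`, `lam`, `tau`, `k`) are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # K[i, j] = exp(-gamma * ||x_i - x_j||^2)
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def kernel_self_expression(K, lam=0.1):
    # Self-expressive model in the RKHS: min_C ||Phi(X) - Phi(X) C||_F^2 + lam ||C||_F^2.
    # Expanding with the kernel trick gives the closed form C = (K + lam I)^{-1} K.
    n = K.shape[0]
    return np.linalg.solve(K + lam * np.eye(n), K)

def contrastive_knn_loss(K, k=2, tau=0.5):
    # InfoNCE-style loss on kernel similarities: for each sample, its k nearest
    # neighbors (by kernel value) act as positives, all other samples as negatives.
    # A learnable kernel minimizing this keeps original-space neighbors close in the RKHS.
    n = K.shape[0]
    logits = K / tau
    np.fill_diagonal(logits, -np.inf)  # exclude self-pairs
    log_probs = logits - np.log(np.sum(np.exp(logits), axis=1, keepdims=True))
    loss = 0.0
    for i in range(n):
        pos = np.argsort(-K[i])[:k + 1]
        pos = pos[pos != i][:k]  # k nearest neighbors of sample i
        loss -= log_probs[i, pos].mean()
    return loss / n

# Toy data: two well-separated clusters of five points each.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.1, (5, 2)), rng.normal(3, 0.1, (5, 2))])
K = rbf_kernel(X, gamma=1.0)
C = kernel_self_expression(K, lam=0.1)
A = np.abs(C) + np.abs(C.T)  # symmetric affinity, input to spectral clustering
```

On such separated data, the self-expression matrix `C` is close to block diagonal, so the affinity `A` connects samples mostly within their own cluster; a final spectral clustering step on `A` (e.g. scikit-learn's `SpectralClustering` with a precomputed affinity) would then recover the partition.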
This work was partially supported by the National Key Research and Development Program of China (No. 2018AAA0100204), a key program of fundamental research from Shenzhen Science and Technology Innovation Commission (No. JCYJ20200109113403826), the Major Key Project of PCL (No. 2022ZD0115301), and an Open Research Project of Zhejiang Lab (No. 2022RC0AB04).
Copyright information
© 2024 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
About this paper
Cite this paper
Zhang, Q., Kang, Z., Xu, Z., Fu, H. (2024). Contrastive Kernel Subspace Clustering. In: Luo, B., Cheng, L., Wu, ZG., Li, H., Li, C. (eds) Neural Information Processing. ICONIP 2023. Lecture Notes in Computer Science, vol 14451. Springer, Singapore. https://doi.org/10.1007/978-981-99-8073-4_31
Print ISBN: 978-981-99-8072-7
Online ISBN: 978-981-99-8073-4