Abstract
In this paper, we study the noisy-labeling problem in the fine-grained entity typing (FET) task. Most existing methods divide the training data into "clean" and "noisy" sets and apply different strategies to each during training. However, the "clean" samples used in these methods are not actually clean; some of them also contain noisy labels. To overcome this issue, we propose a three-stage curriculum learning framework with hierarchical label smoothing for training the FET model, which uses relatively clean data to train the model and prevents it from overfitting to noisy labels. Experiments conducted on three widely used FET datasets show that our method achieves new state-of-the-art performance. Our code is publicly available at https://github.com/xubodhu/NFETC-CLHLS.
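To make the hierarchical label smoothing idea concrete, the sketch below shows one plausible formulation (an illustrative assumption, not necessarily the paper's exact method): given a type hierarchy expressed as slash-delimited paths such as `/person/artist`, the gold type keeps most of the probability mass and the smoothing mass `alpha` is redistributed to its ancestor types rather than uniformly over all labels.

```python
# Illustrative sketch of hierarchical label smoothing for FET.
# Assumption: types are slash-delimited paths (e.g. "/person/artist"),
# and smoothing mass flows to ancestors of the gold type.

def ancestors(type_path):
    """Return proper ancestors of a slash-delimited type,
    e.g. "/person/artist" -> ["/person"]."""
    parts = type_path.strip("/").split("/")
    return ["/" + "/".join(parts[:i]) for i in range(1, len(parts))]

def hierarchical_smooth(gold, type_vocab, alpha=0.1):
    """Build a smoothed target distribution over `type_vocab`.

    The gold type keeps 1 - alpha of the mass; the remaining alpha is
    split evenly among its ancestors in the vocabulary. A root type
    with no ancestors keeps all the mass.
    """
    target = {t: 0.0 for t in type_vocab}
    anc = [a for a in ancestors(gold) if a in target]
    if anc:
        target[gold] = 1.0 - alpha
        for a in anc:
            target[a] += alpha / len(anc)
    else:
        target[gold] = 1.0
    return target

vocab = ["/person", "/person/artist", "/location"]
dist = hierarchical_smooth("/person/artist", vocab, alpha=0.1)
# dist["/person/artist"] == 0.9, dist["/person"] == 0.1
```

Compared with uniform label smoothing, this keeps the soft mass on types that are still semantically correct (an ancestor of the gold type is never wrong), which is the intuition behind smoothing along the hierarchy; the function names and `alpha` split here are hypothetical choices for illustration.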
Acknowledgement
This paper was supported by the National Natural Science Foundation of China (61906035), Shanghai Sailing Program (19YF1402300), Shanghai Municipal Commission of Economy and Information (202002009) and the Fundamental Research Funds for the Central Universities (2232021A-08).
Copyright information
© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Xu, B., Zhang, Z., Sha, C., Du, M., Song, H., Wang, H. (2022). A Three-Stage Curriculum Learning Framework with Hierarchical Label Smoothing for Fine-Grained Entity Typing. In: Bhattacharya, A., et al. Database Systems for Advanced Applications. DASFAA 2022. Lecture Notes in Computer Science, vol 13247. Springer, Cham. https://doi.org/10.1007/978-3-031-00129-1_23
Print ISBN: 978-3-031-00128-4
Online ISBN: 978-3-031-00129-1