A Three-Stage Curriculum Learning Framework with Hierarchical Label Smoothing for Fine-Grained Entity Typing

  • Conference paper
  • First Online:
Database Systems for Advanced Applications (DASFAA 2022)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 13247)

Abstract

In this paper, we study the noisy labeling problem in the fine-grained entity typing (FET) task. Most existing methods divide the training data into "clean" and "noisy" sets and handle them with different strategies during training. However, the "clean" samples used in these methods are not actually clean; some of them still carry noisy labels. To overcome this issue, we propose a three-stage curriculum learning framework with hierarchical label smoothing to train the FET model, which uses relatively clean data to train the model and prevents it from overfitting to noisy labels. Experiments conducted on three widely used FET datasets show that our method achieves new state-of-the-art performance. Our code is publicly available at https://github.com/xubodhu/NFETC-CLHLS.
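
The full method is behind the access wall, so only the abstract appears above. As a rough, hedged illustration of what hierarchical label smoothing can look like for path-style FET type inventories (e.g. FIGER- or OntoNotes-style labels such as /person/artist), the Python sketch below moves the smoothing mass onto a gold type's ancestors instead of spreading it uniformly over all labels. Every name in it (TYPES, ancestors, hierarchical_smoothed_target, epsilon) is an illustrative assumption, not the authors' formulation or code.

    import numpy as np

    # Hypothetical type inventory with path-style FET types (illustration only).
    TYPES = ["/person", "/person/artist", "/person/athlete",
             "/organization", "/organization/company"]
    TYPE2ID = {t: i for i, t in enumerate(TYPES)}

    def ancestors(t):
        # Strict ancestors of a path-style type, e.g. "/person/artist" -> ["/person"].
        parts = t.strip("/").split("/")
        return ["/" + "/".join(parts[:i]) for i in range(1, len(parts))]

    def hierarchical_smoothed_target(gold_type, epsilon=0.1):
        # Keep most probability mass on the gold type and move an epsilon
        # fraction onto its ancestors, rather than over all other types as
        # vanilla label smoothing would.
        target = np.zeros(len(TYPES), dtype=np.float32)
        anc = [a for a in ancestors(gold_type) if a in TYPE2ID]
        if anc:
            target[TYPE2ID[gold_type]] = 1.0 - epsilon
            for a in anc:
                target[TYPE2ID[a]] += epsilon / len(anc)
        else:
            # Top-level types have no ancestors to smooth towards.
            target[TYPE2ID[gold_type]] = 1.0
        return target

    # Example: 0.9 on /person/artist and 0.1 on its parent /person.
    print(hierarchical_smoothed_target("/person/artist"))

The design point is that the redistributed mass stays inside the gold type's ancestry, so a partially wrong distant label is still encouraged toward plausible coarser types; how the paper combines this with its three-stage curriculum is not reproducible from this page.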

Notes

  1. https://github.com/billy-inn/NFETC.

Acknowledgement

This paper was supported by the National Natural Science Foundation of China (61906035), Shanghai Sailing Program (19YF1402300), Shanghai Municipal Commission of Economy and Information (202002009) and the Fundamental Research Funds for the Central Universities (2232021A-08).

Author information

Corresponding author

Correspondence to Chaofeng Sha.

Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Xu, B., Zhang, Z., Sha, C., Du, M., Song, H., Wang, H. (2022). A Three-Stage Curriculum Learning Framework with Hierarchical Label Smoothing for Fine-Grained Entity Typing. In: Bhattacharya, A., et al. Database Systems for Advanced Applications. DASFAA 2022. Lecture Notes in Computer Science, vol 13247. Springer, Cham. https://doi.org/10.1007/978-3-031-00129-1_23

  • DOI: https://doi.org/10.1007/978-3-031-00129-1_23

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-00128-4

  • Online ISBN: 978-3-031-00129-1

  • eBook Packages: Computer Science, Computer Science (R0)
