Chinese Medical Named Entity Recognition Based on Label Knowledge Enhancement

  • Conference paper
  • Service Science (ICSS 2023)

Part of the book series: Communications in Computer and Information Science (CCIS, volume 1844)

Abstract

To address the textual complexity of the Chinese medical named entity recognition task, a model called BERT-Label-Span, based on label knowledge enhancement, is proposed; it improves the accuracy of medical information systems on Chinese medical named entity recognition. The model decomposes question-text joint encoding into two independent encoding modules built on the BERT pre-trained model. A semantic fusion module based on the attention mechanism then explicitly uses label knowledge to enhance the representation of the medical text. Finally, the spans of named entities are predicted according to a heuristic matching principle. Experiments on the CCKS2019 Chinese medical text dataset show that the model reaches an F1 value of 85.554 on the named entity recognition task, exceeding existing mainstream methods and demonstrating the effectiveness of the approach.

Supported by the National Key R&D Program of China (No. 2022YFC2503305) and the National Science Youth Foundation of Shandong Province of China (No. ZR2020QF018).
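
The page contains no code, but the architecture outlined in the abstract lends itself to a short illustration. Below is a minimal PyTorch sketch of the three stages described there: independent BERT encoding of the medical text and of a label description, attention-based fusion that injects label knowledge into the text representation, and span start/end prediction. The class name LabelSpanNER, the choice of bert-base-chinese, the cross-attention fusion details, and the example label description are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only (not the authors' code) of a label-knowledge-
# enhanced span NER model as described in the abstract.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizerFast


class LabelSpanNER(nn.Module):
    """Hypothetical BERT-Label-Span-style model; all names are assumptions."""

    def __init__(self, bert_name: str = "bert-base-chinese"):
        super().__init__()
        self.text_encoder = BertModel.from_pretrained(bert_name)   # encodes the medical text
        self.label_encoder = BertModel.from_pretrained(bert_name)  # encodes the label description
        hidden = self.text_encoder.config.hidden_size
        # Semantic fusion: text tokens attend to label-description tokens.
        self.fusion = nn.MultiheadAttention(hidden, num_heads=8, batch_first=True)
        self.start_head = nn.Linear(hidden, 1)  # logit: token starts an entity span
        self.end_head = nn.Linear(hidden, 1)    # logit: token ends an entity span

    def forward(self, text_inputs, label_inputs):
        text_h = self.text_encoder(**text_inputs).last_hidden_state     # (B, L_text, H)
        label_h = self.label_encoder(**label_inputs).last_hidden_state  # (B, L_label, H)
        # Explicitly enhance the text representation with label knowledge.
        fused, _ = self.fusion(query=text_h, key=label_h, value=label_h)
        start_logits = self.start_head(fused).squeeze(-1)  # (B, L_text)
        end_logits = self.end_head(fused).squeeze(-1)       # (B, L_text)
        return start_logits, end_logits


# Usage sketch: one entity type at a time, here "疾病和诊断" (disease and diagnosis).
tokenizer = BertTokenizerFast.from_pretrained("bert-base-chinese")
text = tokenizer(["患者因急性阑尾炎入院。"], return_tensors="pt")
label = tokenizer(["疾病和诊断：医学上定义的疾病名称或诊断结论。"], return_tensors="pt")
model = LabelSpanNER()
start_logits, end_logits = model(text, label)
```

At inference time, candidate spans could be decoded by pairing each predicted start position with the nearest following predicted end position; this is only a simple stand-in for the heuristic matching principle referred to in the abstract.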

Author information

Corresponding author

Correspondence to Lei Wang.

Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this paper

Cite this paper

Li, S. et al. (2023). Chinese Medical Named Entity Recognition Based on Label Knowledge Enhancement. In: Wang, Z., Wang, S., Xu, H. (eds) Service Science. ICSS 2023. Communications in Computer and Information Science, vol 1844. Springer, Singapore. https://doi.org/10.1007/978-981-99-4402-6_21

  • DOI: https://doi.org/10.1007/978-981-99-4402-6_21

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-99-4401-9

  • Online ISBN: 978-981-99-4402-6

  • eBook Packages: Computer Science, Computer Science (R0)
