
  1. No Access

    Chapter and Conference Paper

    Knowledgeable Salient Span Mask for Enhancing Language Models as Knowledge Base

    Pre-trained language models (PLMs) like BERT have made significant progress in various downstream NLP tasks. However, by asking models to do cloze-style tests, recent work finds that PLMs are short in acquirin...

    Cunxiang Wang, Fuli Luo, Yanyang Li in Natural Language Processing and Chinese Co… (2023)

  2. No Access

    Chapter and Conference Paper

    Exploring Generalization Ability of Pretrained Language Models on Arithmetic and Logical Reasoning

    To quantitatively and intuitively explore the generalization ability of pre-trained language models (PLMs), we have designed several tasks of arithmetic and logical reasoning. We both analyse how well PLMs gen...

    Cunxiang Wang, Boyuan Zheng, Yuchen Niu in Natural Language Processing and Chinese Co… (2021)

  3. No Access

    Chapter and Conference Paper

    Domain Representation for Knowledge Graph Embedding

    Embedding entities and relations into a continuous multi-dimensional vector space has become the dominant method for knowledge graph embedding in representation learning. However, most existing models ignore...

    Cunxiang Wang, Feiliang Ren, Zhichao Lin in Natural Language Processing and Chinese Co… (2019)

  4. No Access

    Chapter and Conference Paper

    Embedding Syntactic Tree Structures into CNN Architecture for Relation Classification

    Relation classification is an important task in natural language processing (NLP) fields. State-of-the-art methods are mainly based on deep neural networks. This paper proposes a new convolutional neural netw...

    Feiliang Ren, Rongsheng Zhao, Xiao Hu in Knowledge Graph and Semantic Computing. La… (2017)