- Chapter and Conference Paper: Knowledgeable Salient Span Mask for Enhancing Language Models as Knowledge Base
  Pre-trained language models (PLMs) like BERT have made significant progress on various downstream NLP tasks. However, by asking models to complete cloze-style tests, recent work finds that PLMs fall short in acquirin...
- Chapter and Conference Paper: Exploring Generalization Ability of Pretrained Language Models on Arithmetic and Logical Reasoning
  To quantitatively and intuitively explore the generalization ability of pre-trained language models (PLMs), we have designed several tasks of arithmetic and logical reasoning. We both analyse how well PLMs gen...
- Chapter and Conference Paper: Domain Representation for Knowledge Graph Embedding
  Embedding entities and relations into a continuous multi-dimensional vector space has become the dominant method for knowledge graph embedding in representation learning. However, most existing models ignore...
- Chapter and Conference Paper: Embedding Syntactic Tree Structures into CNN Architecture for Relation Classification
  Relation classification is an important task in natural language processing (NLP). State-of-the-art methods are mainly based on deep neural networks. This paper proposes a new convolutional neural netw...