A comprehensive survey on automatic knowledge graph construction

L Zhong, J Wu, Q Li, H Peng, X Wu - ACM Computing Surveys, 2023 - dl.acm.org
Automatic knowledge graph construction aims at manufacturing structured human
knowledge. To this end, much effort has historically been spent extracting informative fact …
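
As a toy illustration of the structured output such construction targets, a knowledge graph can be held as (head, relation, tail) triples; the facts below are invented for the example.

```python
# Toy illustration of the target artifact: a knowledge graph as a set
# of (head, relation, tail) triples extracted from text. The example
# facts are invented for illustration.

triples = {
    ("Marie Curie", "field", "physics"),
    ("Marie Curie", "born_in", "Warsaw"),
}

def neighbors(graph, head):
    # All (relation, tail) pairs attached to a given head entity.
    return {(rel, tail) for h, rel, tail in graph if h == head}

print(neighbors(triples, "Marie Curie"))
```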

Pre-train, prompt, and predict: A systematic survey of prompting methods in natural language processing

P Liu, W Yuan, J Fu, Z Jiang, H Hayashi… - ACM Computing Surveys, 2023 - dl.acm.org
This article surveys and organizes research works in a new paradigm in natural language
processing, which we dub “prompt-based learning.” Unlike traditional supervised learning …
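
A minimal sketch of the cloze-style prompting the survey covers, assuming a hypothetical `fill_mask` scoring function; the template and label words are illustrative, not from the paper.

```python
# Minimal sketch of cloze-style prompt-based classification.
# `fill_mask` is a stand-in for any masked language model scoring
# function (e.g. a BERT-style model); it is assumed here, not part
# of any specific library.

TEMPLATE = "{text} Overall, it was a [MASK] movie."
VERBALIZER = {"great": "positive", "terrible": "negative"}

def classify(text, fill_mask):
    prompt = TEMPLATE.format(text=text)
    # Score each label word at the [MASK] position and take the best.
    scores = {word: fill_mask(prompt, word) for word in VERBALIZER}
    best_word = max(scores, key=scores.get)
    return VERBALIZER[best_word]
```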

Prompting GPT-3 to be reliable

C Si, Z Gan, Z Yang, S Wang, J Wang… - arXiv preprint arXiv …, 2022 - arxiv.org
Large language models (LLMs) show impressive abilities via few-shot prompting.
Commercialized APIs such as OpenAI GPT-3 further increase their use in real-world …
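
A minimal sketch of the few-shot prompting the snippet refers to: labeled demonstrations are prepended so the model infers the task in context. The `complete` call stands in for a completion API and is an assumption.

```python
# Sketch of few-shot prompting: prepend labeled demonstrations so the
# model infers the task from context. `complete` stands in for a text
# completion API call (e.g. a commercial LLM endpoint) and is assumed.

DEMONSTRATIONS = [
    ("The movie was a delight.", "positive"),
    ("I walked out halfway through.", "negative"),
]

def few_shot_prompt(query):
    lines = [f"Review: {text}\nSentiment: {label}\n"
             for text, label in DEMONSTRATIONS]
    lines.append(f"Review: {query}\nSentiment:")
    return "\n".join(lines)

def predict(query, complete):
    return complete(few_shot_prompt(query)).strip()
```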

Large language models are few-shot clinical information extractors

M Agrawal, S Hegselmann, H Lang, Y Kim… - arXiv preprint arXiv …, 2022 - arxiv.org
A long-running goal of the clinical NLP community is the extraction of important variables
trapped in clinical notes. However, roadblocks have included dataset shift from the general …
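
A hedged sketch of the same idea applied to clinical notes: prompt for structured JSON, then parse defensively. The prompt wording, note contents, and `complete` call are illustrative assumptions, not the paper's exact setup.

```python
import json

# Sketch of LLM-based clinical extraction: ask for structured JSON and
# parse it. The prompt wording and the `complete` API call are both
# illustrative assumptions, not the paper's exact setup.

PROMPT = (
    "Extract all medication names from the clinical note below.\n"
    "Answer with a JSON list of strings.\n\nNote: {note}\n\nAnswer:"
)

def extract_medications(note, complete):
    raw = complete(PROMPT.format(note=note))
    try:
        meds = json.loads(raw)
    except json.JSONDecodeError:
        meds = []  # fall back when the model's output is not valid JSON
    return [m for m in meds if isinstance(m, str)]
```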

Unified structure generation for universal information extraction

Y Lu, Q Liu, D Dai, X Xiao, H Lin, X Han, L Sun… - arXiv preprint arXiv …, 2022 - arxiv.org
Information extraction suffers from its varying targets, heterogeneous structures, and
demand-specific schemas. In this paper, we propose a unified text-to-structure generation …
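
A simplified flavor of such text-to-structure linearization: nested brackets encode entities and their relations as one generable string. This mirrors the spirit of UIE's structured extraction language, not its exact grammar.

```python
# Simplified flavor of text-to-structure linearization: entities and
# their relations are emitted as one nested, bracketed string that a
# single seq2seq model could be trained to generate. This mirrors the
# spirit of UIE's structured extraction language, not its exact grammar.

def linearize(entities):
    # entities: list of (type, span, [(relation, tail_span), ...]) tuples
    parts = []
    for etype, span, relations in entities:
        inner = "".join(f" ({rel}: {tail})" for rel, tail in relations)
        parts.append(f"({etype}: {span}{inner})")
    return " ".join(parts)

print(linearize([
    ("person", "Steve Jobs", [("work for", "Apple")]),
    ("organization", "Apple", []),
]))
# (person: Steve Jobs (work for: Apple)) (organization: Apple)
```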

Knowledge enhanced contextual word representations

ME Peters, M Neumann, RL Logan IV… - arXiv preprint arXiv …, 2019 - arxiv.org
Contextual word representations, typically trained on unstructured, unlabeled text, do not
contain any explicit grounding to real world entities and are often unable to remember facts …
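
A deliberately simplified illustration of grounding token vectors in a knowledge base: project a linked entity embedding and add it over the mention span. The paper's actual mechanism is attention-based; this additive fusion is assumed for illustration only.

```python
import numpy as np

# Generic illustration of grounding contextual vectors in a knowledge
# base: add a (projected) entity embedding onto the token vectors of
# its linked mention span. A deliberately simplified stand-in for the
# paper's attention-based fusion, assumed for illustration only.

def inject_entity(token_vecs, mention, entity_vec, proj):
    """token_vecs: (seq_len, d); mention: (start, end) token indices;
    entity_vec: (d_ent,); proj: (d_ent, d) projection matrix."""
    fused = token_vecs.copy()
    start, end = mention
    fused[start:end] += entity_vec @ proj  # ground the span in the KB
    return fused

rng = np.random.default_rng(0)
tokens = rng.standard_normal((6, 8))   # toy sequence, d = 8
entity = rng.standard_normal(4)        # toy KB embedding, d_ent = 4
W = rng.standard_normal((4, 8)) * 0.1
print(inject_entity(tokens, (1, 3), entity, W).shape)  # (6, 8)
```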

SuperGLUE: A stickier benchmark for general-purpose language understanding systems

A Wang, Y Pruksachatkun, N Nangia… - Advances in Neural Information Processing Systems, 2019 - proceedings.neurips.cc
In the last year, new models and methods for pretraining and transfer learning have driven
striking performance improvements across a range of language understanding tasks. The …

Learning span-level interactions for aspect sentiment triplet extraction

L Xu, YK Chia, L Bing - arXiv preprint arXiv:2107.12214, 2021 - arxiv.org
Aspect Sentiment Triplet Extraction (ASTE) is the most recent subtask of ABSA which outputs
triplets of an aspect target, its associated sentiment, and the corresponding opinion term …
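
A sketch of the triplet format and the span-pair view the paper takes: enumerate candidate aspect and opinion spans and classify each pair into a sentiment or "invalid". The candidate spans and `classify_pair` scorer below are placeholders.

```python
from dataclasses import dataclass
from itertools import product

# Sketch of the ASTE output format and the span-pairing view:
# enumerate candidate aspect spans and opinion spans, then classify
# each (aspect, opinion) pair into a sentiment or "invalid". The
# candidate spans and `classify_pair` scorer here are assumptions.

@dataclass
class Triplet:
    aspect: str      # aspect target span
    opinion: str     # opinion term span
    sentiment: str   # positive / negative / neutral

def extract_triplets(aspects, opinions, classify_pair):
    triplets = []
    for a, o in product(aspects, opinions):
        label = classify_pair(a, o)    # e.g. a learned span-pair scorer
        if label != "invalid":
            triplets.append(Triplet(a, o, label))
    return triplets

# "The battery life is great but the screen is dim."
demo = extract_triplets(
    ["battery life", "screen"], ["great", "dim"],
    lambda a, o: {"battery life great": "positive",
                  "screen dim": "negative"}.get(f"{a} {o}", "invalid"),
)
print(demo)
```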

SpanBERT: Improving pre-training by representing and predicting spans

M Joshi, D Chen, Y Liu, DS Weld… - Transactions of the Association for Computational Linguistics, 2020 - direct.mit.edu
We present SpanBERT, a pre-training method that is designed to better represent and
predict spans of text. Our approach extends BERT by (1) masking contiguous random spans …
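
A sketch of contiguous span masking in this spirit: span lengths follow a clipped geometric distribution (p = 0.2, max length 10) and whole spans are masked until roughly 15% of tokens are covered. The parameters follow the paper's reported setup; the rest is illustrative.

```python
import random

# Sketch of contiguous span masking in the spirit of SpanBERT: sample
# span lengths from a geometric distribution (p = 0.2, clipped at 10)
# and mask whole spans until roughly 15% of tokens are covered. The
# parameters follow the paper; the implementation details are assumed.

def mask_spans(tokens, budget=0.15, p=0.2, max_len=10, seed=0):
    rng = random.Random(seed)
    out = list(tokens)
    target = max(1, int(len(out) * budget))
    masked = set()
    while len(masked) < target:
        length = 1
        while length < max_len and rng.random() > p:
            length += 1                  # geometric span length
        start = rng.randrange(len(out))
        for i in range(start, min(start + length, len(out))):
            masked.add(i)
            out[i] = "[MASK]"
    return out

print(mask_spans("an apple a day keeps the doctor away".split(), seed=1))
```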

Span-based joint entity and relation extraction with transformer pre-training

M Eberts, A Ulges - ECAI 2020, 2020 - ebooks.iospress.nl
We introduce SpERT, an attention model for span-based joint entity and relation extraction.
Our key contribution is a light-weight reasoning on BERT embeddings, which features entity …
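
A skeleton of span-based joint extraction in this spirit: enumerate spans up to a maximum width, max-pool their token embeddings, classify spans as entities, then classify surviving pairs as relations. Both classifiers below are placeholders for learned layers.

```python
import numpy as np

# Skeleton of span-based joint entity and relation extraction in the
# spirit of SpERT: enumerate all spans up to a maximum width, represent
# each by max-pooling its token embeddings, classify spans as entities,
# then classify pairs of surviving spans as relations. The classifiers
# are placeholders standing in for learned layers.

def extract(token_embs, entity_clf, relation_clf, max_width=5):
    n = len(token_embs)
    spans = [(i, j) for i in range(n)
             for j in range(i + 1, min(i + max_width, n) + 1)]
    # Represent each candidate span by max-pooling its token embeddings.
    reps = {s: token_embs[s[0]:s[1]].max(axis=0) for s in spans}
    entities = {}
    for s in spans:
        label = entity_clf(reps[s])      # e.g. a softmax entity layer
        if label != "none":
            entities[s] = label
    relations = []
    for a in entities:
        for b in entities:
            if a == b:
                continue
            rel = relation_clf(reps[a], reps[b])
            if rel != "none":
                relations.append((a, b, rel))
    return entities, relations

embs = np.random.default_rng(0).standard_normal((4, 8))
ents, rels = extract(
    embs,
    entity_clf=lambda v: "entity" if v.mean() > 0.5 else "none",
    relation_clf=lambda a, b: "none",
)
print(ents, rels)
```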