Machine knowledge: Creation and curation of comprehensive knowledge bases

G Weikum, XL Dong, S Razniewski… - … and Trends® in …, 2021 - nowpublishers.com
Equipping machines with comprehensive knowledge of the world's entities and their
relationships has been a longstanding goal of AI. Over the last decade, large-scale …

A survey on neural open information extraction: Current status and future directions

S Zhou, B Yu, A Sun, C Long, J Li, H Yu, J Sun… - arXiv preprint arXiv …, 2022 - arxiv.org
Open Information Extraction (OpenIE) facilitates domain-independent discovery of relational
facts from large corpora. The technique well suits many open-world natural language …

Finetuned language models are zero-shot learners

J Wei, M Bosma, VY Zhao, K Guu, AW Yu… - arXiv preprint arXiv …, 2021 - arxiv.org
This paper explores a simple method for improving the zero-shot learning abilities of
language models. We show that instruction tuning--finetuning language models on a …

MetaICL: Learning to learn in context

S Min, M Lewis, L Zettlemoyer, H Hajishirzi - arXiv preprint arXiv …, 2021 - arxiv.org
We introduce MetaICL (Meta-training for In-Context Learning), a new meta-training
framework for few-shot learning where a pretrained language model is tuned to do in …

The natural language decathlon: Multitask learning as question answering

B McCann, NS Keskar, C Xiong, R Socher - arXiv preprint arXiv …, 2018 - arxiv.org
Deep learning has improved performance on many natural language processing (NLP)
tasks individually. However, general NLP models cannot emerge within a paradigm that …

AllenNLP: A deep semantic natural language processing platform

M Gardner, J Grus, M Neumann, O Tafjord… - arXiv preprint arXiv …, 2018 - arxiv.org
This paper describes AllenNLP, a platform for research on deep learning methods in natural
language understanding. AllenNLP is designed to support researchers who want to build …

Intermediate-task transfer learning with pretrained models for natural language understanding: When and why does it work?

Y Pruksachatkun, J Phang, H Liu, PM Htut… - arXiv preprint arXiv …, 2020 - arxiv.org
While pretrained models such as BERT have shown large gains across natural language
understanding tasks, their performance can be improved by further training the model on a …

DeepStruct: Pretraining of language models for structure prediction

C Wang, X Liu, Z Chen, H Hong, J Tang… - arXiv preprint arXiv …, 2022 - arxiv.org
We introduce a method for improving the structural understanding abilities of language
models. Unlike previous approaches that finetune the models with task-specific …

Zero-shot relation extraction via reading comprehension

O Levy, M Seo, E Choi, L Zettlemoyer - arXiv preprint arXiv:1706.04115, 2017 - arxiv.org
We show that relation extraction can be reduced to answering simple reading
comprehension questions, by associating one or more natural-language questions with …

CrossFit: A few-shot learning challenge for cross-task generalization in NLP

Q Ye, BY Lin, X Ren - arXiv preprint arXiv:2104.08835, 2021 - arxiv.org
Humans can learn a new language task efficiently with only few examples, by leveraging
their knowledge obtained when learning prior tasks. In this paper, we explore whether and …