QA-GNN: Reasoning with language models and knowledge graphs for question answering

M Yasunaga, H Ren, A Bosselut, P Liang… - arxiv preprint arxiv …, 2021 - arxiv.org
… "bumping into people annoys them" or "rain makes the road slippery", helps humans
navigate everyday situations seamlessly. Yet …

Connecting the dots: A knowledgeable path generator for commonsense question answering

P Wang, N Peng, F Ilievski, P Szekely… - arxiv preprint arxiv …, 2020 - arxiv.org
Commonsense question answering (QA) requires background knowledge which is not
explicitly stated in a given context. Prior works use commonsense knowledge graphs (KGs) …

GenericsKB: A knowledge base of generic statements

S Bhakthavatsalam, C Anastasiades… - arxiv preprint arxiv …, 2020 - arxiv.org
We present a new resource for the NLP community, namely a large (3.5M+ sentence)
knowledge base of *generic statements*, e.g., "Trees remove carbon dioxide from the …

A roadmap for big model

S Yuan, H Zhao, S Zhao, J Leng, Y Liang… - arxiv preprint arxiv …, 2022 - arxiv.org
With the rapid development of deep learning, training Big Models (BMs) for multiple
downstream tasks has become a popular paradigm. Researchers have achieved various …

MICO: A multi-alternative contrastive learning framework for commonsense knowledge representation

Y Su, Z Wang, T Fang, H Zhang, Y Song… - arxiv preprint arxiv …, 2022 - arxiv.org
Commonsense reasoning tasks such as commonsense knowledge graph completion and
commonsense question answering require powerful representation learning. In this paper …

What makes the story forward? Inferring commonsense explanations as prompts for future event generation

L Lin, Y Cao, L Huang, SA Li, X Hu, L Wen… - Proceedings of the 45th …, 2022 - dl.acm.org
Prediction over event sequences is critical for many real-world applications in Information
Retrieval and Natural Language Processing. Future Event Generation (FEG) is a …