Give us the facts: Enhancing large language models with knowledge graphs for fact-aware language modeling

L Yang, H Chen, Z Li, X Ding… - IEEE Transactions on …, 2024 - ieeexplore.ieee.org
Recently, ChatGPT, a representative large language model (LLM), has gained considerable
attention. Due to their powerful emergent abilities, recent LLMs are considered a possible …

Pre-trained language models for text generation: A survey

J Li, T Tang, WX Zhao, JY Nie, JR Wen - ACM Computing Surveys, 2024 - dl.acm.org
Text Generation aims to produce plausible and readable text in human language from input
data. The resurgence of deep learning has greatly advanced this field, in particular, with the …

A survey of large language models

WX Zhao, K Zhou, J Li, T Tang… - arXiv preprint arXiv:…, 2023 - paper-notes.zhjwpku.com
Ever since the Turing Test was proposed in the 1950s, humans have explored how machines
can master language intelligence. Language is essentially a complex, intricate system of …

Unifying large language models and knowledge graphs: A roadmap

S Pan, L Luo, Y Wang, C Chen… - IEEE Transactions on …, 2024 - ieeexplore.ieee.org
Large language models (LLMs), such as ChatGPT and GPT-4, are making new waves in the
fields of natural language processing and artificial intelligence, due to their emergent ability …

Graph neural networks for natural language processing: A survey

L Wu, Y Chen, K Shen, X Guo, H Gao… - Foundations and Trends® in …, 2023 - nowpublishers.com
Deep learning has become the dominant approach to addressing various tasks in Natural
Language Processing (NLP). Although text inputs are typically represented as a sequence …

Large language models and knowledge graphs: Opportunities and challenges

JZ Pan, S Razniewski, JC Kalo, S Singhania… - arXiv preprint arXiv:…, 2023 - arxiv.org
Large Language Models (LLMs) have taken Knowledge Representation, and the world, by
storm. This inflection point marks a shift from explicit knowledge representation to a renewed …

Pre-trained models: Past, present and future

X Han, Z Zhang, N Ding, Y Gu, X Liu, Y Huo, J Qiu… - AI Open, 2021 - Elsevier
Large-scale pre-trained models (PTMs) such as BERT and GPT have recently achieved
great success and become a milestone in the field of artificial intelligence (AI). Owing to …

NeuroLogic A*esque decoding: Constrained text generation with lookahead heuristics

X Lu, S Welleck, P West, L Jiang, J Kasai… - arXiv preprint arXiv:…, 2021 - arxiv.org
The dominant paradigm for neural text generation is left-to-right decoding from
autoregressive language models. Constrained or controllable generation under complex …

Knowledge graph based synthetic corpus generation for knowledge-enhanced language model pre-training

O Agarwal, H Ge, S Shakeri, R Al-Rfou - arXiv preprint arXiv:2010.12688, 2020 - arxiv.org
Prior work on Data-To-Text Generation, the task of converting knowledge graph (KG) triples
into natural text, has focused on domain-specific benchmark datasets. In this paper, however, we …

Language models are few-shot multilingual learners

GI Winata, A Madotto, Z Lin, R Liu, J Yosinski… - arXiv preprint arXiv:…, 2021 - arxiv.org
General-purpose language models have demonstrated impressive capabilities, performing
on par with state-of-the-art approaches on a range of downstream natural language …