A survey of knowledge enhanced pre-trained language models

L Hu, Z Liu, Z Zhao, L Hou, L Nie… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
Pre-trained Language Models (PLMs), which are trained on large text corpora via self-
supervised learning, have yielded promising performance on various tasks in …

ChatGPT is not enough: Enhancing large language models with knowledge graphs for fact-aware language modeling

L Yang, H Chen, Z Li, X Ding, X Wu - arXiv preprint arXiv:2306.11489, 2023 - arxiv.org
Recently, ChatGPT, a representative large language model (LLM), has gained considerable
attention due to its powerful emergent abilities. Some researchers suggest that LLMs could …

Unifying large language models and knowledge graphs: A roadmap

S Pan, L Luo, Y Wang, C Chen… - IEEE Transactions on …, 2024 - ieeexplore.ieee.org
Large language models (LLMs), such as ChatGPT and GPT4, are making new waves in the
field of natural language processing and artificial intelligence, due to their emergent ability …

Give us the facts: Enhancing large language models with knowledge graphs for fact-aware language modeling

L Yang, H Chen, Z Li, X Ding… - IEEE Transactions on …, 2024 - ieeexplore.ieee.org
Recently, ChatGPT, a representative large language model (LLM), has gained considerable
attention. Due to their powerful emergent abilities, recent LLMs are considered as a possible …

A divide and conquer framework for Knowledge Editing

X Han, R Li, X Li, JZ Pan - Knowledge-Based Systems, 2023 - Elsevier
As pre-trained language models (LMs) play an important role in various Natural Language
Processing (NLP) tasks, it is becoming increasingly important to make sure the knowledge …

Faithful AI in medicine: a systematic review with large language models and beyond

Q **e, EJ Schenck, HS Yang, Y Chen, Y Peng, F Wang - MedRxiv, 2023 - ncbi.nlm.nih.gov
Artificial intelligence (AI), especially the most recent large language models (LLMs), holds
great promise in healthcare and medicine, with applications spanning from biological …

HugNLP: A Unified and Comprehensive Library for Natural Language Processing

J Wang, N Chen, Q Sun, W Huang, C Wang… - Proceedings of the 32nd …, 2023 - dl.acm.org
In this paper, we introduce HugNLP, a unified and comprehensive library for natural
language processing (NLP) with the prevalent backend of Hugging Face Transformers …

Knowledge-Enhanced Language Models Are Not Bias-Proof: Situated Knowledge and Epistemic Injustice in AI

A Kraft, E Soulier - The 2024 ACM Conference on Fairness …, 2024 - dl.acm.org
The factual inaccuracies (" hallucinations") of large language models have recently inspired
more research on knowledge-enhanced language modeling approaches. These are often …

KG-prompt: Interpretable knowledge graph prompt for pre-trained language models

L Chen, J Liu, Y Duan, R Wang - Knowledge-Based Systems, 2025 - Elsevier
Knowledge graphs (KGs) can provide rich factual knowledge for language models,
enhancing reasoning ability and interpretability. However, existing knowledge injection …

OCEAN: Offline Chain-of-thought Evaluation and Alignment in Large Language Models

J Wu, X Li, R Wang, Y **a, Y **ong, J Wang… - arxiv preprint arxiv …, 2024 - arxiv.org
Offline evaluation of LLMs is crucial for understanding their capacities, though current
methods remain underexplored in existing research. In this work, we focus on the offline …