Large-scale multi-modal pre-trained models: A comprehensive survey

X Wang, G Chen, G Qian, P Gao, XY Wei… - Machine Intelligence …, 2023 - Springer
With the urgent demand for generalized deep models, many pre-trained big models have been
proposed, such as bidirectional encoder representations (BERT), vision transformer (ViT) …

A survey of knowledge enhanced pre-trained language models

L Hu, Z Liu, Z Zhao, L Hou, L Nie… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
Pre-trained Language Models (PLMs), which are trained on large text corpora via self-
supervised learning methods, have yielded promising performance on various tasks in …

A survey of large language models

WX Zhao, K Zhou, J Li, T Tang… - arXiv preprint arXiv …, 2023 - paper-notes.zhjwpku.com
Ever since the Turing Test was proposed in the 1950s, humans have explored how machines
can master language intelligence. Language is essentially a complex, intricate system of …

A comprehensive survey on automatic knowledge graph construction

L Zhong, J Wu, Q Li, H Peng, X Wu - ACM Computing Surveys, 2023 - dl.acm.org
Automatic knowledge graph construction aims at building structured representations of human
knowledge. To this end, much effort has historically been spent extracting informative fact …

Graph neural networks: foundation, frontiers and applications

L Wu, P Cui, J Pei, L Zhao, X Guo - … of the 28th ACM SIGKDD conference …, 2022 - dl.acm.org
The field of graph neural networks (GNNs) has seen rapid and remarkable strides in
recent years. Graph neural networks, also known as deep learning on graphs, graph …

Graph neural networks for natural language processing: A survey

L Wu, Y Chen, K Shen, X Guo, H Gao… - … and Trends® in …, 2023 - nowpublishers.com
Deep learning has become the dominant approach in addressing various tasks in Natural
Language Processing (NLP). Although text inputs are typically represented as a sequence …

SimKGC: Simple contrastive knowledge graph completion with pre-trained language models

L Wang, W Zhao, Z Wei, J Liu - arXiv preprint arXiv:2203.02167, 2022 - arxiv.org
Knowledge graph completion (KGC) aims to reason over known facts and infer the missing
links. Text-based methods such as KG-BERT (Yao et al., 2019) learn entity representations …

Neural Bellman-Ford networks: A general graph neural network framework for link prediction

Z Zhu, Z Zhang, LP Xhonneux… - Advances in neural …, 2021 - proceedings.neurips.cc
Link prediction is a fundamental task on graphs. Inspired by traditional path-based
methods, in this paper we propose a general and flexible representation learning framework …

LLMs4OL: Large language models for ontology learning

H Babaei Giglou, J D'Souza, S Auer - International Semantic Web …, 2023 - Springer
We propose the LLMs4OL approach, which utilizes Large Language Models (LLMs) for
Ontology Learning (OL). LLMs have shown significant advancements in natural language …

A survey of knowledge graph reasoning on graph types: Static, dynamic, and multi-modal

K Liang, L Meng, M Liu, Y Liu, W Tu… - … on Pattern Analysis …, 2024 - ieeexplore.ieee.org
Knowledge graph reasoning (KGR), aiming to deduce new facts from existing facts based on
mined logic rules underlying knowledge graphs (KGs), has become a fast-growing research …