Vision-language pre-training: Basics, recent advances, and future trends

Z Gan, L Li, C Li, L Wang, Z Liu… - Foundations and Trends …, 2022 - nowpublishers.com
This monograph surveys vision-language pre-training (VLP) methods for multimodal
intelligence that have been developed in the last few years. We group these approaches …

Large-scale multi-modal pre-trained models: A comprehensive survey

X Wang, G Chen, G Qian, P Gao, XY Wei… - Machine Intelligence …, 2023 - Springer
With the urgent demand for generalized deep models, many pre-trained big models have been
proposed, such as bidirectional encoder representations from transformers (BERT) and the vision transformer (ViT) …

Unifying large language models and knowledge graphs: A roadmap

S Pan, L Luo, Y Wang, C Chen… - IEEE Transactions on …, 2024 - ieeexplore.ieee.org
Large language models (LLMs), such as ChatGPT and GPT-4, are making new waves in the
fields of natural language processing and artificial intelligence, due to their emergent abilities …

Multimodal foundation models: From specialists to general-purpose assistants

C Li, Z Gan, Z Yang, J Yang, L Li… - … and Trends® in …, 2024 - nowpublishers.com

Can knowledge graphs reduce hallucinations in LLMs?: A survey

G Agrawal, T Kumarage, Z Alghamdi, H Liu - arXiv preprint arXiv …, 2023 - arxiv.org
Contemporary LLMs are prone to producing hallucinations, stemming mainly from
knowledge gaps within the models. To address this critical limitation, researchers employ …

A survey of knowledge enhanced pre-trained language models

L Hu, Z Liu, Z Zhao, L Hou, L Nie… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
Pre-trained Language Models (PLMs), which are trained on large text corpora via self-
supervised learning, have yielded promising performance on various tasks in …

K-LITE: Learning transferable visual models with external knowledge

S Shen, C Li, X Hu, Y Xie, J Yang… - Advances in …, 2022 - proceedings.neurips.cc
The new generation of state-of-the-art computer vision systems is trained from natural
language supervision, ranging from simple object category names to descriptive captions …

Evaluating large language models on graphs: Performance insights and comparative analysis

C Liu, B Wu - arXiv preprint arXiv:2308.11224, 2023 - arxiv.org
Large Language Models (LLMs) have garnered considerable interest in both academia
and industry. Yet, the application of LLMs to graph data remains under-explored. In this …

The life cycle of knowledge in big language models: A survey

B Cao, H Lin, X Han, L Sun - Machine Intelligence Research, 2024 - Springer
Knowledge plays a critical role in artificial intelligence. Recently, the extensive
success of pre-trained language models (PLMs) has drawn significant attention to how …

Comfact: A benchmark for linking contextual commonsense knowledge

S Gao, JD Hwang, S Kanno, H Wakaki… - arXiv preprint arXiv …, 2022 - arxiv.org
Understanding rich narratives, such as dialogues and stories, often requires natural
language processing systems to access relevant knowledge from commonsense knowledge …