Software testing with large language models: Survey, landscape, and vision

J Wang, Y Huang, C Chen, Z Liu… - IEEE Transactions on …, 2024 - ieeexplore.ieee.org
Pre-trained large language models (LLMs) have recently emerged as a breakthrough
technology in natural language processing and artificial intelligence, with the ability to …

Efficient utilization of pre-trained models: A review of sentiment analysis via prompt learning

K Bu, Y Liu, X Ju - Knowledge-Based Systems, 2024 - Elsevier
Sentiment analysis is one of the traditional, well-known tasks in Natural Language
Processing (NLP) research. In recent years, Pre-trained Models (PMs) have become one of …

GraphGPT: Graph instruction tuning for large language models

J Tang, Y Yang, W Wei, L Shi, L Su, S Cheng… - Proceedings of the 47th …, 2024 - dl.acm.org
Graph Neural Networks (GNNs) have evolved to understand graph structures through
recursive exchanges and aggregations among nodes. To enhance robustness, self …

All in one and one for all: A simple yet effective method towards cross-domain graph pretraining

H Zhao, A Chen, X Sun, H Cheng, J Li - Proceedings of the 30th ACM …, 2024 - dl.acm.org
Large Language Models (LLMs) have revolutionized the fields of computer vision (CV) and
natural language processing (NLP). One of the most notable advancements of LLMs is that a …

Towards graph foundation models: A survey and beyond

J Liu, C Yang, Z Lu, J Chen, Y Li, M Zhang… - arXiv preprint arXiv …, 2023 - arxiv.org
Emerging as fundamental building blocks for diverse artificial intelligence applications,
foundation models have achieved notable success across natural language processing and …

Universal prompt tuning for graph neural networks

T Fang, Y Zhang, Y Yang, C Wang… - Advances in Neural …, 2024 - proceedings.neurips.cc
In recent years, prompt tuning has sparked a research surge in adapting pre-trained models.
Unlike the unified pre-training strategy employed in the language field, the graph field …

ZeroG: Investigating cross-dataset zero-shot transferability in graphs

Y Li, P Wang, Z Li, JX Yu, J Li - Proceedings of the 30th ACM SIGKDD …, 2024 - dl.acm.org
With the development of foundation models such as large language models, zero-shot
transfer learning has become increasingly significant. This is highlighted by the generative …

Lovász principle for unsupervised graph representation learning

Z Sun, C Ding, J Fan - Advances in Neural Information …, 2024 - proceedings.neurips.cc
This paper focuses on graph-level representation learning that aims to represent graphs as
vectors that can be directly utilized in downstream tasks such as graph classification. We …

HGPrompt: Bridging homogeneous and heterogeneous graphs for few-shot prompt learning

X Yu, Y Fang, Z Liu, X Zhang - Proceedings of the AAAI Conference on …, 2024 - ojs.aaai.org
Graph neural networks (GNNs) and heterogeneous graph neural networks (HGNNs) are
prominent techniques for homogeneous and heterogeneous graph representation learning …

Graph prompt learning: A comprehensive survey and beyond

X Sun, J Zhang, X Wu, H Cheng, Y Xiong… - arXiv preprint arXiv …, 2023 - arxiv.org
Artificial General Intelligence (AGI) has revolutionized numerous fields, yet its integration
with graph data, a cornerstone in our interconnected world, remains nascent. This paper …