Dynamic graph enhanced contrastive learning for chest x-ray report generation

M Li, B Lin, Z Chen, H Lin, X Liang… - Proceedings of the …, 2023 - openaccess.thecvf.com
Automatic radiology reporting has great clinical potential to relieve radiologists from heavy
workloads and improve diagnosis interpretation. Recently, researchers have enhanced data …

Federated learning from pre-trained models: A contrastive learning approach

Y Tan, G Long, J Ma, L Liu, T Zhou… - Advances in neural …, 2022 - proceedings.neurips.cc
Federated Learning (FL) is a machine learning paradigm that allows decentralized clients to
learn collaboratively without sharing their private data. However, excessive computation and …
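As a rough illustration of the FL paradigm this snippet describes (clients train locally and share only model updates, never raw data), here is a minimal federated-averaging sketch on a toy linear model. It is not the paper's contrastive approach; the data, model, and hyperparameters are made up for illustration.

```python
# Minimal FedAvg-style sketch of the FL paradigm described above: each client
# trains on its private data locally, and the server only averages weights.
# The toy linear model and synthetic client data are hypothetical.
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One client's local training: plain least-squares gradient steps."""
    w = weights.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def fedavg_round(global_w, clients):
    """Average locally updated weights, weighted by each client's data size."""
    updates, sizes = [], []
    for X, y in clients:
        updates.append(local_update(global_w, X, y))
        sizes.append(len(y))
    return np.average(updates, axis=0, weights=np.array(sizes, dtype=float))

rng = np.random.default_rng(0)
clients = [(rng.normal(size=(20, 3)), rng.normal(size=20)) for _ in range(4)]
w = np.zeros(3)
for _ in range(10):          # communication rounds
    w = fedavg_round(w, clients)
print(w)                     # global model learned without pooling raw data
```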

Reasoning on graphs: Faithful and interpretable large language model reasoning

L Luo, YF Li, G Haffari, S Pan - arXiv preprint arXiv:2310.01061, 2023 - arxiv.org
Large language models (LLMs) have demonstrated impressive reasoning abilities in
complex tasks. However, they lack up-to-date knowledge and experience hallucinations …

Dink-net: Neural clustering on large graphs

Y Liu, K Liang, J Xia, S Zhou, X Yang… - International …, 2023 - proceedings.mlr.press
Deep graph clustering, which aims to group the nodes of a graph into disjoint clusters with
deep neural networks, has achieved promising progress in recent years. However, the …
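The opening sentence describes the generic deep graph clustering pipeline: embed the nodes, then partition the embeddings into disjoint clusters. The sketch below illustrates only that generic pipeline on a toy graph; the propagation-based "encoder" and the k-means step are placeholders, not Dink-Net's actual clustering method.

```python
# Generic deep-graph-clustering pipeline (encode nodes, then cluster the
# embeddings), sketched from the abstract's first sentence. Toy data only;
# this is NOT Dink-Net's formulation.
import numpy as np
from sklearn.cluster import KMeans

A = np.array([[0, 1, 1, 0, 0, 0],      # toy adjacency: two triangles
              [1, 0, 1, 0, 0, 0],      # joined by a single bridge edge
              [1, 1, 0, 1, 0, 0],
              [0, 0, 1, 0, 1, 1],
              [0, 0, 0, 1, 0, 1],
              [0, 0, 0, 1, 1, 0]], dtype=float)
X = np.eye(6)                          # one-hot node features

# "Encoder": two rounds of symmetric-normalized neighborhood averaging,
# a stand-in for a trained GNN embedding.
D = np.diag(1.0 / np.sqrt(A.sum(1) + 1))
A_hat = D @ (A + np.eye(6)) @ D
Z = A_hat @ (A_hat @ X)

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(Z)
print(labels)                          # each triangle ends up in its own cluster
```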

Graphmae2: A decoding-enhanced masked self-supervised graph learner

Z Hou, Y He, Y Cen, X Liu, Y Dong… - Proceedings of the …, 2023 - dl.acm.org
Graph self-supervised learning (SSL), including contrastive and generative approaches,
offers great potential to address the fundamental challenge of label scarcity in real-world …

Neural temporal walks: Motif-aware representation learning on continuous-time dynamic graphs

M Jin, YF Li, S Pan - Advances in Neural Information …, 2022 - proceedings.neurips.cc
Continuous-time dynamic graphs naturally abstract many real-world systems, such as social
and transactional networks. While the research on continuous-time dynamic graph …

Structure-free graph condensation: From large-scale graphs to condensed graph-free data

X Zheng, M Zhang, C Chen… - Advances in …, 2024 - proceedings.neurips.cc
Graph condensation, which reduces the size of a large-scale graph by synthesizing a small-
scale condensed graph as its substitution, has immediate benefits for various graph learning …

Beyond smoothing: Unsupervised graph representation learning with edge heterophily discriminating

Y Liu, Y Zheng, D Zhang, VCS Lee, S Pan - Proceedings of the AAAI …, 2023 - ojs.aaai.org
Unsupervised graph representation learning (UGRL) has drawn increasing research
attention and achieved promising results in several graph analytic tasks. Relying on the …

Finding the missing-half: Graph complementary learning for homophily-prone and heterophily-prone graphs

Y Zheng, H Zhang, V Lee, Y Zheng… - International …, 2023 - proceedings.mlr.press
Real-world graphs generally have only one kind of tendency in their connections. These
connections are either homophily-prone or heterophily-prone. While graphs with homophily …
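The homophily-prone vs. heterophily-prone distinction the abstract draws can be made concrete with the edge-homophily ratio: the fraction of edges whose endpoints share a class label. A small sketch on a made-up labeled graph:

```python
# Edge-homophily ratio: close to 1 means homophily-prone connections,
# close to 0 means heterophily-prone. The tiny labeled graph is illustrative.
import numpy as np

edges = np.array([(0, 1), (1, 2), (2, 3), (3, 4), (4, 5)])
labels = np.array([0, 0, 0, 1, 1, 1])   # node class labels

def edge_homophily(edges, labels):
    src, dst = edges[:, 0], edges[:, 1]
    return float(np.mean(labels[src] == labels[dst]))

print(edge_homophily(edges, labels))     # 0.8: mostly homophilic connections
```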

Good-d: On unsupervised graph out-of-distribution detection

Y Liu, K Ding, H Liu, S Pan - … Conference on Web Search and Data …, 2023 - dl.acm.org
Most existing deep learning models are trained based on the closed-world assumption,
where the test data is assumed to be drawn iid from the same distribution as the training …