Dynamic graph enhanced contrastive learning for chest X-ray report generation
Automatic radiology reporting has great clinical potential to relieve radiologists from heavy
workloads and improve diagnosis interpretation. Recently, researchers have enhanced data …
Federated learning from pre-trained models: A contrastive learning approach
Federated Learning (FL) is a machine learning paradigm that allows decentralized clients to
learn collaboratively without sharing their private data. However, excessive computation and …
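For context on the paradigm this abstract describes, here is a minimal sketch of a generic FedAvg-style federated loop in numpy: clients train locally on private data and the server only averages parameters. The helper names (local_update, fedavg_round) and toy data are illustrative assumptions, and this is not the paper's contrastive, pre-trained-model approach.

    import numpy as np

    def local_update(weights, X, y, lr=0.1, epochs=5):
        # One client's local training: logistic-regression SGD on its private data.
        w = weights.copy()
        for _ in range(epochs):
            p = 1.0 / (1.0 + np.exp(-X @ w))        # sigmoid predictions
            w -= lr * X.T @ (p - y) / len(y)        # gradient step on the local loss
        return w

    def fedavg_round(global_w, clients):
        # Server aggregates local models, weighted by each client's dataset size.
        sizes = [len(y) for _, y in clients]
        local_ws = [local_update(global_w, X, y) for X, y in clients]
        return np.average(local_ws, axis=0, weights=sizes)

    # Toy run: three clients whose raw data never leaves their side.
    rng = np.random.default_rng(0)
    clients = [(rng.normal(size=(50, 4)), rng.integers(0, 2, 50)) for _ in range(3)]
    w = np.zeros(4)
    for _ in range(10):
        w = fedavg_round(w, clients)
    print("global weights after 10 rounds:", w)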
Reasoning on graphs: Faithful and interpretable large language model reasoning
Large language models (LLMs) have demonstrated impressive reasoning abilities in
complex tasks. However, they lack up-to-date knowledge and experience hallucinations …
Dink-Net: Neural clustering on large graphs
Deep graph clustering, which aims to group the nodes of a graph into disjoint clusters with
deep neural networks, has achieved promising progress in recent years. However, the …
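To make the deep graph clustering setting concrete, here is a minimal "embed, then cluster" sketch: parameter-free GCN-style feature propagation followed by k-means. The function names and toy graph are illustrative assumptions; this is not Dink-Net's clustering mechanism.

    import numpy as np
    from sklearn.cluster import KMeans

    def normalize_adj(A):
        # Symmetric normalization D^-1/2 (A + I) D^-1/2, as in GCN-style encoders.
        A_hat = A + np.eye(len(A))
        d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
        return A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

    def embed_then_cluster(A, X, n_clusters, hops=2):
        # Propagate features over the graph, then group nodes into disjoint clusters.
        P = normalize_adj(A)
        Z = X
        for _ in range(hops):               # parameter-free propagation (SGC-style)
            Z = P @ Z
        return KMeans(n_clusters=n_clusters, n_init=10).fit_predict(Z)

    # Toy graph: two 4-node cliques joined by a single edge.
    A = np.zeros((8, 8))
    A[:4, :4] = 1.0
    A[4:, 4:] = 1.0
    np.fill_diagonal(A, 0.0)
    A[3, 4] = A[4, 3] = 1.0
    X = np.random.default_rng(0).normal(size=(8, 3))
    print(embed_then_cluster(A, X, n_clusters=2))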
GraphMAE2: A decoding-enhanced masked self-supervised graph learner
Graph self-supervised learning (SSL), including contrastive and generative approaches,
offers great potential to address the fundamental challenge of label scarcity in real-world …
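The generative (masked-reconstruction) branch of graph SSL mentioned here can be sketched roughly as follows, assuming PyTorch and a toy one-layer encoder/decoder. The name masked_feature_ssl and the zero-token masking are illustrative assumptions; GraphMAE2's actual decoding enhancements are not reproduced.

    import torch
    import torch.nn as nn

    def masked_feature_ssl(A_norm, X, mask_rate=0.5, epochs=200, lr=1e-2):
        # Generative graph SSL in miniature: hide some node features and train
        # the model to reconstruct them from the propagated remaining context.
        n, d = X.shape
        enc = nn.Linear(d, 16)               # toy one-layer encoder
        dec = nn.Linear(16, d)               # toy decoder back to raw feature space
        opt = torch.optim.Adam(list(enc.parameters()) + list(dec.parameters()), lr=lr)
        for _ in range(epochs):
            mask = torch.rand(n) < mask_rate             # nodes whose features get hidden
            if mask.sum() == 0:
                continue
            X_in = X.clone()
            X_in[mask] = 0.0                             # replace masked features with a zero token
            H = torch.relu(A_norm @ enc(X_in))           # mix in neighbour information
            loss = ((dec(H)[mask] - X[mask]) ** 2).mean()  # reconstruct masked nodes only
            opt.zero_grad()
            loss.backward()
            opt.step()
        return enc                            # keep the encoder for downstream tasks

    # Toy usage with a crude stand-in for a normalized adjacency matrix.
    A_norm = torch.full((6, 6), 1.0 / 6)
    X = torch.randn(6, 4)
    encoder = masked_feature_ssl(A_norm, X)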
Neural temporal walks: Motif-aware representation learning on continuous-time dynamic graphs
Continuous-time dynamic graphs naturally abstract many real-world systems, such as social
and transactional networks. While the research on continuous-time dynamic graph …
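Operationally, a continuous-time dynamic graph can be stored as a stream of timestamped interaction events, and walks over it must respect time ordering. The sketch below shows that representation and a time-respecting random walk; the toy events and the temporal_walk helper are illustrative assumptions and do not reflect the paper's motif-aware learning components.

    import random
    from collections import defaultdict

    # A continuous-time dynamic graph stored as timestamped interaction events (u, v, t).
    events = [(0, 1, 1.0), (1, 2, 2.5), (1, 3, 3.0), (2, 3, 4.2), (3, 0, 5.1)]

    adj = defaultdict(list)                  # node -> list of (neighbour, timestamp)
    for u, v, t in events:
        adj[u].append((v, t))
        adj[v].append((u, t))                # treat interactions as undirected

    def temporal_walk(start, start_time, length, rng=random):
        # Sample a walk whose edge timestamps strictly increase (time-respecting).
        walk, node, t = [(start, start_time)], start, start_time
        for _ in range(length):
            later = [(v, tv) for v, tv in adj[node] if tv > t]   # only future events
            if not later:
                break
            node, t = rng.choice(later)
            walk.append((node, t))
        return walk

    random.seed(0)
    print(temporal_walk(start=0, start_time=0.0, length=4))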
Structure-free graph condensation: From large-scale graphs to condensed graph-free data
Graph condensation, which reduces the size of a large-scale graph by synthesizing a small-
scale condensed graph as its substitution, has immediate benefits for various graph learning …
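To make the condensation setting concrete, the sketch below learns a small synthetic, graph-free node set by matching class-wise embedding means under a fixed random encoder, i.e. a simple distribution-matching stand-in. The condense_features helper is hypothetical, and this is not the paper's structure-free trajectory-matching method.

    import torch

    def condense_features(X, y, n_per_class, steps=300, lr=0.05, dim=32, seed=0):
        # Learn a small synthetic, graph-free node set whose class-wise embedding
        # means match those of the original nodes under a fixed random encoder.
        torch.manual_seed(seed)
        classes = y.unique()
        W = torch.randn(X.shape[1], dim)                          # fixed random encoder
        X_syn = torch.randn(len(classes) * n_per_class, X.shape[1], requires_grad=True)
        y_syn = classes.repeat_interleave(n_per_class)
        opt = torch.optim.Adam([X_syn], lr=lr)
        for _ in range(steps):
            loss = torch.zeros(())
            for c in classes:
                real_mean = (X[y == c] @ W).mean(dim=0)
                syn_mean = (X_syn[y_syn == c] @ W).mean(dim=0)
                loss = loss + ((real_mean - syn_mean) ** 2).sum()
            opt.zero_grad()
            loss.backward()
            opt.step()
        return X_syn.detach(), y_syn

    # Toy usage: condense 1,000 labelled nodes down to 10 synthetic nodes per class.
    X, y = torch.randn(1000, 16), torch.randint(0, 3, (1000,))
    X_small, y_small = condense_features(X, y, n_per_class=10)
    print(X_small.shape, y_small.shape)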
Beyond smoothing: Unsupervised graph representation learning with edge heterophily discriminating
Unsupervised graph representation learning (UGRL) has drawn increasing research
attention and achieved promising results in several graph analytic tasks. Relying on the …
Finding the missing-half: Graph complementary learning for homophily-prone and heterophily-prone graphs
Real-world graphs generally have only one kind of tendency in their connections. These
connections are either homophily-prone or heterophily-prone. While graphs with homophily …
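A common way to quantify whether a graph's connections are homophily-prone or heterophily-prone is the edge homophily ratio (the fraction of edges joining same-label nodes), sketched below on toy data. This is a standard illustrative metric, not the paper's complementary-learning method.

    import numpy as np

    def edge_homophily(edges, labels):
        # Fraction of edges whose endpoints share a label: values near 1 indicate
        # homophily-prone connections, values near 0 indicate heterophily-prone ones.
        same = sum(labels[u] == labels[v] for u, v in edges)
        return same / len(edges)

    labels = np.array([0, 0, 0, 1, 1, 1])
    homophilic_edges = [(0, 1), (1, 2), (3, 4), (4, 5)]     # within-class links
    heterophilic_edges = [(0, 3), (1, 4), (2, 5), (0, 4)]   # cross-class links
    print(edge_homophily(homophilic_edges, labels))     # 1.0
    print(edge_homophily(heterophilic_edges, labels))   # 0.0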
GOOD-D: On unsupervised graph out-of-distribution detection
Most existing deep learning models are trained based on the closed-world assumption,
where the test data is assumed to be drawn i.i.d. from the same distribution as the training …
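The closed-world i.i.d. assumption mentioned here can be contrasted with a simple out-of-distribution score, for example a kNN distance in embedding space, sketched below on synthetic embeddings. The knn_ood_score helper is an illustrative assumption; GOOD-D's contrastive, unsupervised graph-level pipeline is not reproduced.

    import numpy as np

    def knn_ood_score(train_emb, test_emb, k=5):
        # Distance to the k-th nearest training embedding; larger values suggest a
        # sample violates the i.i.d. assumption and lies outside the training distribution.
        d = np.linalg.norm(test_emb[:, None, :] - train_emb[None, :, :], axis=-1)
        return np.sort(d, axis=1)[:, k - 1]

    rng = np.random.default_rng(0)
    train_emb = rng.normal(0.0, 1.0, size=(200, 8))      # in-distribution embeddings
    in_dist = rng.normal(0.0, 1.0, size=(5, 8))
    out_dist = rng.normal(4.0, 1.0, size=(5, 8))         # mean-shifted, out-of-distribution
    print(knn_ood_score(train_emb, in_dist).mean())      # small
    print(knn_ood_score(train_emb, out_dist).mean())     # noticeably larger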