Denoising self-attentive sequential recommendation
Transformer-based sequential recommenders are very powerful for capturing both short-
term and long-term sequential item dependencies. This is mainly attributed to their unique …
Toward a foundation model for time series data
A foundation model is a machine learning model trained on a large and diverse set of data,
typically using self-supervised learning-based pre-training techniques, that can be adapted …
Sharpness-aware graph collaborative filtering
Graph Neural Networks (GNNs) have achieved impressive performance in collaborative
filtering. However, recent studies show that GNNs tend to yield inferior performance when …
TinyKG: Memory-efficient training framework for knowledge graph neural recommender systems
There has been an explosion of interest in designing various Knowledge Graph Neural
Networks (KGNNs), which achieve state-of-the-art performance and provide great …
RPMixer: Shaking up time series forecasting with random projections for large spatial-temporal data
Spatial-temporal forecasting systems play a crucial role in addressing numerous real-world
challenges. In this paper, we investigate the potential of addressing spatial-temporal …
Towards mitigating dimensional collapse of representations in collaborative filtering
Contrastive Learning (CL) has shown promising performance in collaborative filtering. The
key idea is to use contrastive loss to generate augmentation-invariant embeddings by …
Masked graph transformer for large-scale recommendation
Graph Transformers have garnered significant attention for learning graph-structured data,
thanks to their superb ability to capture long-range dependencies among nodes. However …
Enhancing Transformers without Self-supervised Learning: A Loss Landscape Perspective in Sequential Recommendation
Transformer and its variants are a powerful class of architectures for sequential
recommendation, owing to their ability of capturing a user's dynamic interests from their past …
Learning to hash for trajectory similarity computation and search
Searching for similar trajectories from a database is an important way for extracting human-
understandable knowledge. However, due to the huge volume of trajectories and high …
An efficient content-based time series retrieval system
A Content-based Time Series Retrieval (CTSR) system is an information retrieval system for
users to interact with time series that emerge from multiple domains, such as finance …