Geolocation representation from large language models are generic enhancers for spatio-temporal learning

J He, T Nie, W Ma - arXiv preprint arXiv:2408.12116, 2024 - arxiv.org
In the geospatial domain, universal representation models are significantly less prevalent
than their extensive use in natural language processing and computer vision. This …

A Comprehensive Survey of Time Series Forecasting: Architectural Diversity and Open Challenges

J Kim, H Kim, HG Kim, D Lee, S Yoon - arXiv preprint arXiv:2411.05793, 2024 - arxiv.org
Time series forecasting is a critical task that provides key information for decision-making
across various fields. Recently, various fundamental deep learning architectures such as …

Contextualizing MLP-mixers spatiotemporally for urban traffic data forecast at scale

T Nie, G Qin, L Sun, W Ma, Y Mei… - IEEE Transactions on …, 2024 - ieeexplore.ieee.org
Spatiotemporal traffic data (STTD) displays complex correlational structures. Extensive
advanced techniques have been designed to capture these structures for effective …

Low-Rank Adaptation for Foundation Models: A Comprehensive Review

M Yang, J Chen, Y Zhang, J Liu, J Zhang, Q Ma… - arXiv preprint arXiv …, 2024 - arxiv.org
The rapid advancement of foundation models (large-scale neural networks trained on
diverse, extensive datasets) has revolutionized artificial intelligence, enabling unprecedented …

Collaborative imputation of urban time series through cross-city meta-learning

T Nie, W Ma, J Sun, Y Yang, J Cao - arXiv preprint arXiv:2501.11306, 2025 - arxiv.org
Urban time series, such as mobility flows, energy consumption, and pollution records,
encapsulate complex urban dynamics and structures. However, data collection in each city …

Mask autoencoder for enhanced image reconstruction with position coding offset and combined masking

Y Wang, H Wang, F Zhang - The Visual Computer, 2025 - Springer
Existing masked image modeling (MIM) methods mainly reconstruct and enhance images by
modeling and filling masked areas. These techniques usually borrow principles from …