Biomaterials and bioelectronics for self-powered neurostimulation

J Li, Z Che, X Wan, F Manshaii, J Xu, J Chen - Biomaterials, 2024 - Elsevier
Self-powered neurostimulation via biomaterials and bioelectronics innovation has emerged
as a compelling approach to explore, repair, and modulate neural systems. This review …

SimGRACE: A simple framework for graph contrastive learning without data augmentation

J Xia, L Wu, J Chen, B Hu, SZ Li - … of the ACM Web Conference 2022, 2022 - dl.acm.org
Graph contrastive learning (GCL) has emerged as a dominant technique for graph
representation learning which maximizes the mutual information between paired graph …

CVT-SLR: Contrastive visual-textual transformation for sign language recognition with variational alignment

J Zheng, Y Wang, C Tan, S Li… - Proceedings of the …, 2023 - openaccess.thecvf.com
Sign language recognition (SLR) is a weakly supervised task that annotates sign videos as
textual glosses. Recent studies show that insufficient training caused by the lack of large …

Temporal attention unit: Towards efficient spatiotemporal predictive learning

C Tan, Z Gao, L Wu, Y Xu, J Xia… - Proceedings of the …, 2023 - openaccess.thecvf.com
Spatiotemporal predictive learning aims to generate future frames by learning from historical
frames. In this paper, we investigate existing methods and present a general framework of …

Mole-BERT: Rethinking pre-training graph neural networks for molecules

J Xia, C Zhao, B Hu, Z Gao, C Tan, Y Liu, S Li, SZ Li - 2023 - chemrxiv.org
Recent years have witnessed the prosperity of pre-training graph neural networks (GNNs)
for molecules. Typically, atom types as node attributes are randomly masked and GNNs are …

A survey on multilingual large language models: Corpora, alignment, and bias

Y Xu, L Hu, J Zhao, Z Qiu, K Xu, Y Ye, H Gu - arXiv preprint arXiv …, 2024 - arxiv.org
Based on the foundation of Large Language Models (LLMs), Multilingual LLMs (MLLMs)
have been developed to address the challenges faced in multilingual natural language …

Extracting sentence embeddings from pretrained transformer models

L Stankevičius, M Lukoševičius - Applied Sciences, 2024 - mdpi.com
Pre-trained transformer models shine in many natural language processing tasks and
therefore are expected to bear the representation of the input sentence or text meaning …

A systematic survey of chemical pre-trained models

J Xia, Y Zhu, Y Du, SZ Li - arXiv preprint arXiv:2210.16484, 2022 - arxiv.org
Deep learning has achieved remarkable success in learning representations for molecules,
which is crucial for various biochemical applications, ranging from property prediction to …

MetaEnzyme: Meta pan-enzyme learning for task-adaptive redesign

J Zheng, H Zhang, Q Xu, AP Zeng, SZ Li - Proceedings of the 32nd ACM …, 2024 - dl.acm.org
Enzyme design plays a crucial role in both industrial production and biology. However, this
field faces challenges due to the lack of comprehensive benchmarks and the complexity of …

A fistful of vectors: a tool for intrinsic evaluation of word embeddings

R Ascari, A Giabelli, L Malandri, F Mercorio… - Cognitive …, 2024 - Springer
The utilization of word embeddings—powerful models computed through Neural Network
architectures that encode words as vectors—has witnessed rapid growth across various …