Biomaterials and bioelectronics for self-powered neurostimulation
Self-powered neurostimulation via biomaterials and bioelectronics innovation has emerged
as a compelling approach to explore, repair, and modulate neural systems. This review …
SimGRACE: A simple framework for graph contrastive learning without data augmentation
Graph contrastive learning (GCL) has emerged as a dominant technique for graph
representation learning, which maximizes the mutual information between paired graph …
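As an illustration of the objective named in this abstract, the following is a minimal InfoNCE-style contrastive loss between paired graph embeddings; the batch size, embedding dimension, and temperature are illustrative assumptions rather than details of SimGRACE itself.

```python
import torch
import torch.nn.functional as F

def graph_contrastive_loss(z1: torch.Tensor, z2: torch.Tensor, tau: float = 0.5) -> torch.Tensor:
    """NT-Xent / InfoNCE loss between two batches of graph embeddings.

    z1[i] and z2[i] are embeddings of two views of the same graph
    (e.g. from a perturbed encoder or from augmentations); all other
    pairs in the batch act as negatives.
    """
    z1 = F.normalize(z1, dim=1)          # cosine similarity via dot product
    z2 = F.normalize(z2, dim=1)
    sim = z1 @ z2.t() / tau              # (B, B) similarity matrix
    targets = torch.arange(z1.size(0))   # positives lie on the diagonal
    return F.cross_entropy(sim, targets)

# Toy usage: 32 graphs embedded into 128-d by two encoder views.
z_a, z_b = torch.randn(32, 128), torch.randn(32, 128)
loss = graph_contrastive_loss(z_a, z_b)
```

The diagonal entries act as positives and the off-diagonal pairs as in-batch negatives, which is what ties this loss to a lower bound on the mutual information between the two views.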
CVT-SLR: Contrastive visual-textual transformation for sign language recognition with variational alignment
Sign language recognition (SLR) is a weakly supervised task that annotates sign videos as
textual glosses. Recent studies show that insufficient training caused by the lack of large …
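The weak supervision referred to here means only sentence-level gloss sequences are available, with no frame-level alignment. A common training objective for that setting is CTC over per-frame gloss logits; the sketch below is a generic illustration with made-up tensor shapes, not the CVT-SLR model.

```python
import torch
import torch.nn as nn

# Hypothetical sizes: T video frames, B videos, V gloss vocabulary (+1 blank).
T, B, V = 120, 4, 1000
ctc = nn.CTCLoss(blank=0, zero_infinity=True)

log_probs = torch.randn(T, B, V + 1).log_softmax(dim=-1)   # per-frame gloss logits
glosses = torch.randint(1, V + 1, (B, 12))                 # sentence-level gloss labels
input_lengths = torch.full((B,), T, dtype=torch.long)
target_lengths = torch.full((B,), 12, dtype=torch.long)

# CTC marginalizes over all frame-to-gloss alignments, so no frame labels are needed.
loss = ctc(log_probs, glosses, input_lengths, target_lengths)
```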
Temporal attention unit: Towards efficient spatiotemporal predictive learning
Spatiotemporal predictive learning aims to generate future frames by learning from historical
frames. In this paper, we investigate existing methods and present a general framework of …
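At its simplest, the task amounts to regressing future frames from a stack of historical frames and training with a reconstruction loss; the sketch below shows that setup. The convolutional model and tensor shapes are illustrative assumptions, not the framework presented in the paper.

```python
import torch
import torch.nn as nn

# Predict the next frame from a stack of past frames: (B, C_in, H, W) -> (B, 1, H, W).
class NextFramePredictor(nn.Module):
    def __init__(self, in_frames: int = 4, hidden: int = 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_frames, hidden, 3, padding=1), nn.ReLU(),
            nn.Conv2d(hidden, hidden, 3, padding=1), nn.ReLU(),
            nn.Conv2d(hidden, 1, 3, padding=1),
        )

    def forward(self, past_frames: torch.Tensor) -> torch.Tensor:
        return self.net(past_frames)

model = NextFramePredictor()
past = torch.rand(8, 4, 64, 64)        # 8 clips, 4 historical frames each
target = torch.rand(8, 1, 64, 64)      # the frame to be predicted
loss = nn.functional.mse_loss(model(past), target)
```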
Mole-BERT: Rethinking pre-training graph neural networks for molecules
Recent years have witnessed the prosperity of pre-training graph neural networks (GNNs)
for molecules. Typically, atom types as node attributes are randomly masked and GNNs are …
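The conventional pre-training recipe summarized here (mask random atom types, train the GNN to recover them) can be sketched as follows; the vocabulary size, mask ratio, and the linear stand-in for the GNN encoder are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

NUM_ATOM_TYPES = 119          # illustrative vocabulary size; one extra id is the mask token
MASK_TOKEN = NUM_ATOM_TYPES
HIDDEN = 64

embed = nn.Embedding(NUM_ATOM_TYPES + 1, HIDDEN)
gnn = nn.Linear(HIDDEN, HIDDEN)           # stand-in for a message-passing GNN encoder
head = nn.Linear(HIDDEN, NUM_ATOM_TYPES)  # predicts the original atom type

atom_types = torch.randint(0, NUM_ATOM_TYPES, (50,))   # 50 atoms in a toy molecule
mask = torch.rand(50) < 0.15                            # randomly mask ~15% of atoms
corrupted = atom_types.clone()
corrupted[mask] = MASK_TOKEN

node_repr = gnn(embed(corrupted))                       # encode the corrupted graph
logits = head(node_repr[mask])                          # predict only masked positions
loss = F.cross_entropy(logits, atom_types[mask])
```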
A survey on multilingual large language models: Corpora, alignment, and bias
Y Xu, L Hu, J Zhao, Z Qiu, K Xu, Y Ye, H Gu - arXiv preprint arXiv …, 2024 - arxiv.org
Based on the foundation of Large Language Models (LLMs), Multilingual LLMs (MLLMs)
have been developed to address the challenges faced in multilingual natural language …
Extracting sentence embeddings from pretrained transformer models
Pre-trained transformer models shine in many natural language processing tasks and
therefore are expected to bear the representation of the input sentence or text meaning …
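One widely used way to extract a sentence vector from such a model is to mean-pool the final-layer token states under the attention mask; the sketch below assumes the bert-base-uncased checkpoint and mean pooling purely for illustration, and mean pooling is only one of several possible extraction strategies.

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Illustrative model choice; any encoder-style checkpoint works the same way.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def sentence_embedding(text: str) -> torch.Tensor:
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state       # (1, T, H) token states
    mask = inputs["attention_mask"].unsqueeze(-1)        # (1, T, 1) keeps real tokens
    return (hidden * mask).sum(1) / mask.sum(1)          # mean over non-padding tokens

emb = sentence_embedding("Pre-trained transformers encode sentence meaning.")
print(emb.shape)   # torch.Size([1, 768])
```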
A systematic survey of chemical pre-trained models
Deep learning has achieved remarkable success in learning representations for molecules,
which is crucial for various biochemical applications, ranging from property prediction to …
MetaEnzyme: Meta pan-enzyme learning for task-adaptive redesign
Enzyme design plays a crucial role in both industrial production and biology. However, this
field faces challenges due to the lack of comprehensive benchmarks and the complexity of …
A fistful of vectors: a tool for intrinsic evaluation of word embeddings
The utilization of word embeddings—powerful models computed through Neural Network
architectures that encode words as vectors—has witnessed rapid growth across various …
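Intrinsic evaluation of word embeddings commonly correlates cosine similarities between word vectors with human similarity judgments; the sketch below uses fabricated placeholder vectors and ratings purely to show the computation, not real benchmark data.

```python
import numpy as np
from scipy.stats import spearmanr

def cosine(u: np.ndarray, v: np.ndarray) -> float:
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Toy embeddings and human similarity judgments (illustrative placeholders only).
rng = np.random.default_rng(0)
vectors = {w: rng.normal(size=50) for w in ["cat", "dog", "car", "truck", "apple", "fruit"]}
word_pairs = [("cat", "dog"), ("car", "truck"), ("apple", "fruit"), ("cat", "car")]
human_scores = [8.5, 8.0, 7.5, 2.0]     # placeholder gold ratings

model_scores = [cosine(vectors[a], vectors[b]) for a, b in word_pairs]
rho, _ = spearmanr(model_scores, human_scores)   # rank correlation is the intrinsic score
print(f"Spearman rho: {rho:.3f}")
```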