Knowledge-graph-enabled biomedical entity linking: a survey

J Shi, Z Yuan, W Guo, C Ma, J Chen, M Zhang - World Wide Web, 2023 - Springer
Abstract The Biomedical Entity Linking (BM-EL) task, which aims to match biomedical mentions
in articles to entities in a certain knowledge base (e.g., the Unified Medical Language …

Configurable foundation models: Building LLMs from a modular perspective

C Xiao, Z Zhang, C Song, D Jiang, F Yao, X Han… - arXiv preprint arXiv …, 2024 - arxiv.org
Advancements in LLMs have recently unveiled challenges tied to computational efficiency
and continual scalability due to their requirements of huge parameters, making the …

Plug-and-play knowledge injection for pre-trained language models

Z Zhang, Z Zeng, Y Lin, H Wang, D Ye, C Xiao… - arXiv preprint arXiv …, 2023 - arxiv.org
Injecting external knowledge can improve the performance of pre-trained language models
(PLMs) on various downstream NLP tasks. However, massive retraining is required to …

Revisiting the knowledge injection frameworks

P Fu, Y Zhang, H Wang, W Qiu, J Zhao - arXiv preprint arXiv:2311.01150, 2023 - arxiv.org
In recent years, large language models (LLMs), such as GPTs, have had a great impact
worldwide. However, how to adapt these LLMs to better suit the vertical domain-specific …

Evaluating Open-QA Evaluation

C Wang, S Cheng, Q Guo, Y Yue… - Advances in …, 2024 - proceedings.neurips.cc
This study focuses on the evaluation of the Open Question Answering (Open-QA) task,
which can directly estimate the factuality of large language models (LLMs). Current …

UniTabE: Pretraining a unified tabular encoder for heterogeneous tabular data

Y Yang, Y Wang, G Liu, L Wu, Q Liu - arXiv preprint arXiv:2307.09249, 2023 - arxiv.org
Recent advancements in Natural Language Processing (NLP) have witnessed the
groundbreaking impact of pretrained models, yielding impressive outcomes across various …

LambdaKG: A library for pre-trained language model-based knowledge graph embeddings

X Xie, Z Li, X Wang, Z Xi, N Zhang - arXiv preprint arXiv:2210.00305, 2022 - arxiv.org
Knowledge Graphs (KGs) often have two characteristics: heterogeneous graph structure and
text-rich entity/relation information. Text-based KG embeddings can represent entities by …

Keep Skills in Mind: Understanding and Implementing Skills in Commonsense Question Answering

M Bao, Q Liu, K Zhang, Y Liu, L Yue, L Li, J Zhou - IJCAI, 2023 - staff.ustc.edu.cn
Abstract Commonsense Question Answering (CQA) aims to answer questions that require
human commonsense. Closed-book CQA, as one of the subtasks, requires the model to …

Bridge the gap between language models and tabular understanding

N Chen, L Shou, M Gong, J Pei, C You, J Chang… - arXiv preprint arXiv …, 2023 - arxiv.org
The table pretrain-then-finetune paradigm has been proposed and adopted at a rapid pace
after the success of pre-training in the natural language domain. Despite the promising …

Refining Entity Descriptions with Relation Embeddings for Scientific Relation Classification

C Li, X Liu, J Li, J Wang, Z Feng… - 2024 International Joint …, 2024 - ieeexplore.ieee.org
In recent years, the task of Relation Classification (RC) in scientific domains has received
widespread attention. During the fine-tuning phase of Pre-trained Language Models (PLMs) …