Large language models for generative information extraction: A survey

D Xu, W Chen, W Peng, C Zhang, T Xu, X Zhao… - Frontiers of Computer …, 2024 - Springer
Information Extraction (IE) aims to extract structural knowledge from plain natural
language texts. Recently, generative Large Language Models (LLMs) have demonstrated …

Looking right is sometimes right: Investigating the capabilities of decoder-only LLMs for sequence labeling

D Dukić, J Šnajder - Findings of the Association for Computational …, 2024 - aclanthology.org
Pre-trained language models based on masked language modeling (MLM) excel in natural
language understanding (NLU) tasks. While fine-tuned MLM-based encoders consistently …

MedDec: A Dataset for Extracting Medical Decisions from Discharge Summaries

M Elgaar, J Cheng, N Vakil, H Amiri, LA Celi - arXiv preprint arXiv …, 2024 - arxiv.org
Medical decisions directly impact individuals' health and well-being. Extracting decision
spans from clinical notes plays a crucial role in understanding medical decision-making …

Joint Multi-Facts Reasoning Network for Complex Temporal Question Answering Over Knowledge Graph

R Huang, W Wei, X Qu, W Xie, X Mao… - ICASSP 2024-2024 …, 2024 - ieeexplore.ieee.org
A Temporal Knowledge Graph (TKG) extends a regular knowledge graph by attaching a time
scope to each fact. Existing temporal knowledge graph question answering (TKGQA) models …

Reliable data generation and selection for low-resource relation extraction

J Yu, X Wang, W Chen - Proceedings of the AAAI Conference on …, 2024 - ojs.aaai.org
Automated construction of annotated data holds significant importance in Relation Extraction
(RE) tasks due to the difficulty and cost of human annotation. In this work, we propose Self …

Improving Low-resource Prompt-based Relation Representation with Multi-view Decoupling Learning

C Fan, W Wei, X Qu, Z Lu, W Xie, Y Cheng… - arXiv preprint arXiv …, 2023 - arxiv.org
Recently, prompt-tuning with pre-trained language models (PLMs) has demonstrated a
significant ability to enhance relation extraction (RE) tasks. However, in low-resource …

Enhancing Low-Resource Relation Representations through Multi-View Decoupling

C Fan, W Wei, X Qu, Z Lu, W Xie, Y Cheng… - Proceedings of the AAAI …, 2024 - ojs.aaai.org
Recently, prompt-tuning with pre-trained language models (PLMs) has demonstrated a
significant ability to enhance relation extraction (RE) tasks. However, in low-resource …

TRUE-UIE: Two Universal Relations Unify Information Extraction Tasks

Y Wang, B Yu, Y Liu, S Lu - … of the 2024 Conference of the North …, 2024 - aclanthology.org
Information extraction (IE) encounters challenges due to the variety of schemas and
objectives that differ across tasks. Recent advancements hint at the potential for universal …

Do Not (Always) Look Right: Investigating the Capabilities of Decoder-Based Large Language Models for Sequence Labeling

D Dukić, J Šnajder - arXiv preprint arXiv:2401.14556, 2024 - arxiv.org
Pre-trained language models based on the masked language modeling (MLM) objective excel
in natural language understanding (NLU) tasks. While fine-tuned MLM-based encoders …

CogMG: Collaborative Augmentation Between Large Language Model and Knowledge Graph

T Zhou, Y Chen, K Liu, J Zhao - arXiv preprint arXiv:2406.17231, 2024 - arxiv.org
Large language models have become integral to question-answering applications despite
their propensity for generating hallucinations and factually inaccurate content. Querying …