Large language models for generative information extraction: A survey
Information Extraction (IE) aims to extract structural knowledge from plain natural
language texts. Recently, generative Large Language Models (LLMs) have demonstrated …
Looking right is sometimes right: Investigating the capabilities of decoder-only LLMs for sequence labeling
Pre-trained language models based on masked language modeling (MLM) excel in natural
language understanding (NLU) tasks. While fine-tuned MLM-based encoders consistently …
MedDec: A Dataset for Extracting Medical Decisions from Discharge Summaries
Medical decisions directly impact individuals' health and well-being. Extracting decision
spans from clinical notes plays a crucial role in understanding medical decision-making …
Joint Multi-Facts Reasoning Network for Complex Temporal Question Answering Over Knowledge Graph
A Temporal Knowledge Graph (TKG) extends a regular knowledge graph by attaching
time scopes to facts. Existing temporal knowledge graph question answering (TKGQA) models …
Reliable data generation and selection for low-resource relation extraction
Automated construction of annotated data holds significant importance in Relation Extraction
(RE) tasks due to the difficulty and cost of human annotation. In this work, we propose Self …
Improving Low-resource Prompt-based Relation Representation with Multi-view Decoupling Learning
Recently, prompt-tuning with pre-trained language models (PLMs) has demonstrated its
ability to significantly enhance relation extraction (RE) tasks. However, in low-resource …
Enhancing Low-Resource Relation Representations through Multi-View Decoupling
Recently, prompt-tuning with pre-trained language models (PLMs) has demonstrated its
ability to significantly enhance relation extraction (RE) tasks. However, in low-resource …
TRUE-UIE: Two Universal Relations Unify Information Extraction Tasks
Information extraction (IE) encounters challenges due to the variety of schemas and
objectives that differ across tasks. Recent advancements hint at the potential for universal …
Do Not (Always) Look Right: Investigating the Capabilities of Decoder-Based Large Language Models for Sequence Labeling
Pre-trained language models based on the masked language modeling (MLM) objective excel
in natural language understanding (NLU) tasks. While fine-tuned MLM-based encoders …
CogMG: Collaborative Augmentation Between Large Language Model and Knowledge Graph
T Zhou, Y Chen, K Liu, J Zhao - arXiv preprint arXiv:2406.17231, 2024 - arxiv.org
Large language models have become integral to question-answering applications despite
their propensity for generating hallucinations and factually inaccurate content. Querying …