Prompting large language models with the Socratic method
EY Chang - 2023 IEEE 13th Annual Computing and …, 2023 - ieeexplore.ieee.org
This paper presents a systematic approach to using the Socratic method in developing
prompt templates that effectively interact with large language models, including GPT-3 …
Aspect-based Sentiment Analysis with Opinion Tree Generation
Existing studies usually extract the sentiment elements by decomposing the complex
structure prediction task into multiple subtasks. Despite their effectiveness, these methods …
" You Are An Expert Linguistic Annotator": Limits of LLMs as Analyzers of Abstract Meaning Representation
Large language models (LLMs) show amazing proficiency and fluency in the use of
language. Does this mean that they have also acquired insightful linguistic knowledge about …
KnowGL: Knowledge generation and linking from text
We propose KnowGL, a tool that allows converting text into structured relational data
represented as a set of ABox assertions compliant with the TBox of a given Knowledge …
Maximum Bayes Smatch ensemble distillation for AMR parsing
AMR parsing has experienced an unprecedented increase in performance in the last three
years, due to a mixture of effects including architecture improvements and transfer learning …
Inducing and using alignments for transition-based AMR parsing
Transition-based parsers for Abstract Meaning Representation (AMR) rely on node-to-word
alignments. These alignments are learned separately from parser training and require a …
Transparent semantic parsing with Universal Dependencies using graph transformations
Even though many recent semantic parsers are based on deep learning methods, we
should not forget that rule-based alternatives might offer advantages over neural …
Understanding and answering incomplete questions
Voice assistants interrupt people when they pause mid-question, a frustrating interaction that
requires repeating the entire question. This impacts all users, but particularly …
Hierarchical curriculum learning for AMR parsing
Abstract Meaning Representation (AMR) parsing aims to translate sentences to semantic
representation with a hierarchical structure, and is recently empowered by pretrained …
AMR parsing with instruction fine-tuned pre-trained language models
Instruction fine-tuned language models, trained on a collection of instruction-annotated datasets
(FLAN), have proven highly effective at improving model performance and generalization to …