InfoPrompt: Information-theoretic soft prompt tuning for natural language understanding
Soft prompt tuning achieves superior performance across a wide range of few-shot tasks.
However, the performance of prompt tuning can be highly sensitive to the initialization of …
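The snippet describes the mechanism this line of work builds on: a small block of continuous prompt vectors is prepended to the frozen model's input embeddings and trained while the backbone stays fixed. The following is a minimal PyTorch sketch of that generic mechanism, not of InfoPrompt's information-theoretic objective; SoftPromptModel, frozen_lm_embeddings, and frozen_encoder are placeholder names, and the encoder is assumed to accept embeddings directly.

import torch
import torch.nn as nn

class SoftPromptModel(nn.Module):
    # Minimal sketch: learnable prompt vectors are prepended to the frozen
    # model's input embeddings; only the prompt receives gradient updates.
    def __init__(self, frozen_lm_embeddings: nn.Embedding, frozen_encoder: nn.Module,
                 prompt_length: int = 20):
        super().__init__()
        self.embeddings = frozen_lm_embeddings
        self.encoder = frozen_encoder
        for p in list(self.embeddings.parameters()) + list(self.encoder.parameters()):
            p.requires_grad = False  # backbone stays frozen
        hidden = frozen_lm_embeddings.embedding_dim
        # The initialization of these vectors is exactly what the snippet says
        # prompt tuning is sensitive to.
        self.prompt = nn.Parameter(torch.randn(prompt_length, hidden) * 0.02)

    def forward(self, input_ids: torch.Tensor) -> torch.Tensor:
        token_embeds = self.embeddings(input_ids)                      # (B, T, H)
        prompt = self.prompt.unsqueeze(0).expand(input_ids.size(0), -1, -1)
        return self.encoder(torch.cat([prompt, token_embeds], dim=1))  # (B, P+T, H)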
SpeechPrompt v2: Prompt tuning for speech classification tasks
Prompt tuning is a technique that tunes a small set of parameters to steer a pre-trained
language model (LM) to directly generate the output for downstream tasks. Recently, prompt …
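Building on the sketch above, the "small set of parameters" that prompt tuning updates can be made explicit on the training side: only tensors that still require gradients enter the optimizer. This is a generic illustration assuming the encoder returns a (batch, seq, hidden) tensor; build_optimizer, train_step, and the linear head are hypothetical helpers, not part of SpeechPrompt v2.

import torch
import torch.nn.functional as F

def build_optimizer(model: torch.nn.Module, head: torch.nn.Module, lr: float = 1e-3):
    # Only parameters that still require gradients are tuned: the soft prompt
    # and the small task head, never the frozen backbone.
    tunable = [p for p in list(model.parameters()) + list(head.parameters())
               if p.requires_grad]
    return torch.optim.AdamW(tunable, lr=lr)

def train_step(model, head, input_ids, labels, optimizer):
    logits = head(model(input_ids)[:, 0, :])   # pool the first (prompt) position
    loss = F.cross_entropy(logits, labels)
    loss.backward()
    optimizer.step()
    optimizer.zero_grad(set_to_none=True)
    return loss.item()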
Towards AI-Driven Healthcare: Systematic Optimization, Linguistic Analysis, and Clinicians' Evaluation of Large Language Models for Smoking Cessation …
Creating intervention messages for smoking cessation is a labor-intensive process.
Advances in Large Language Models (LLMs) offer a promising alternative for automated …
Compositional Kronecker Context Optimization for Vision-Language Models
K Ding, X Li, Q Yu, Y Wang, H Zhang… - arXiv preprint arXiv:…, 2024 - arxiv.org
Context Optimization (CoOp) has emerged as a simple yet effective technique for adapting
CLIP-like vision-language models to downstream image recognition tasks. Nevertheless …
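As a rough sketch of the CoOp mechanism the snippet refers to: a shared set of learnable context vectors is prepended to each class name's token embeddings, the result is passed through a frozen CLIP-style text encoder, and images are scored by cosine similarity to the resulting class embeddings. CoOpHead, text_encoder, and class_token_embeds are placeholders rather than a specific CLIP API, and the Kronecker-factorized composition this paper proposes is not shown.

import torch
import torch.nn as nn
import torch.nn.functional as F

class CoOpHead(nn.Module):
    # Sketch of vanilla CoOp: shared learnable context vectors are prepended to
    # each class name's token embeddings and passed through a frozen text
    # encoder; images are classified by cosine similarity to the class features.
    def __init__(self, text_encoder: nn.Module, class_token_embeds: torch.Tensor,
                 n_ctx: int = 16):
        super().__init__()
        self.text_encoder = text_encoder
        for p in self.text_encoder.parameters():
            p.requires_grad = False                      # CLIP-style encoder stays frozen
        self.register_buffer("class_token_embeds", class_token_embeds)  # (C, L, D)
        dim = class_token_embeds.size(-1)
        self.ctx = nn.Parameter(torch.randn(n_ctx, dim) * 0.02)         # learnable context

    def forward(self, image_features: torch.Tensor) -> torch.Tensor:
        num_classes = self.class_token_embeds.size(0)
        ctx = self.ctx.unsqueeze(0).expand(num_classes, -1, -1)         # (C, n_ctx, D)
        prompts = torch.cat([ctx, self.class_token_embeds], dim=1)      # (C, n_ctx+L, D)
        text_features = self.text_encoder(prompts)                      # assumed (C, D)
        img = F.normalize(image_features, dim=-1)
        txt = F.normalize(text_features, dim=-1)
        return img @ txt.t()                                            # (B, C) logits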
An Automatic Prompt Generation System for Tabular Data Tasks
Efficient processing of tabular data is important in various industries, especially when
working with datasets containing a large number of columns. Large language models …
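The snippet is cut off before any method detail, so purely as an illustration of the kind of step such a system automates: serializing a (possibly very wide) table row into a prompt, with an explicit column subset so the prompt stays compact. serialize_row and its arguments are hypothetical and not taken from the paper.

from typing import Mapping, Sequence

def serialize_row(row: Mapping[str, object], columns: Sequence[str],
                  task_instruction: str) -> str:
    # Hypothetical helper: list only the selected columns so the prompt stays
    # short even when the underlying table has hundreds of columns.
    fields = "\n".join(f"- {col}: {row[col]}" for col in columns)
    return f"{task_instruction}\n{fields}\nAnswer:"

prompt = serialize_row(
    {"age": 42, "occupation": "engineer", "hours_per_week": 50},
    columns=["age", "occupation"],
    task_instruction="Predict whether annual income exceeds 50K.",
)
print(prompt)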