InfoPrompt: Information-theoretic soft prompt tuning for natural language understanding

J Wu, T Yu, R Wang, Z Song, R Zhang… - Advances in …, 2024 - proceedings.neurips.cc
Soft prompt tuning achieves superior performance across a wide range of few-shot tasks.
However, the performance of prompt tuning can be highly sensitive to the initialization of …

SpeechPrompt v2: Prompt tuning for speech classification tasks

KW Chang, YK Wang, H Shen, I Kang… - arXiv preprint arXiv …, 2023 - arxiv.org
Prompt tuning is a technique that tunes a small set of parameters to steer a pre-trained
language model (LM) to directly generate the output for downstream tasks. Recently, prompt …
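
To make the mechanism these two abstracts describe concrete, the sketch below is a minimal soft prompt tuning setup in PyTorch, assuming a Hugging Face GPT-2 as the frozen backbone; the prompt length, initialization scale, and training setup are illustrative choices, not taken from either paper.

    import torch
    from transformers import GPT2LMHeadModel, GPT2Tokenizer

    tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2")
    for p in model.parameters():
        p.requires_grad = False          # the backbone LM stays frozen

    n_prompt = 20                        # number of soft prompt vectors (the "small set of parameters")
    embed = model.get_input_embeddings()
    soft_prompt = torch.nn.Parameter(torch.randn(n_prompt, embed.embedding_dim) * 0.02)

    def forward(input_ids, labels):
        tok_emb = embed(input_ids)                           # (B, T, D)
        prompt = soft_prompt.unsqueeze(0).expand(input_ids.size(0), -1, -1)
        inputs_embeds = torch.cat([prompt, tok_emb], dim=1)  # prepend learned vectors
        # pad labels with -100 so positions under the prompt are ignored by the loss
        pad = torch.full((input_ids.size(0), n_prompt), -100, dtype=labels.dtype)
        return model(inputs_embeds=inputs_embeds, labels=torch.cat([pad, labels], dim=1))

    optimizer = torch.optim.AdamW([soft_prompt], lr=1e-3)    # only the prompt is trained
    enc = tokenizer("great movie. sentiment: positive", return_tensors="pt")
    loss = forward(enc.input_ids, enc.input_ids.clone()).loss
    loss.backward()                      # gradients reach only soft_prompt

Because the backbone is frozen, the random initialization of soft_prompt is the only starting point the optimizer has, which is the sensitivity the first abstract points to.
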

Towards AI-Driven Healthcare: Systematic Optimization, Linguistic Analysis, and Clinicians' Evaluation of Large Language Models for Smoking Cessation …

P Calle, R Shao, Y Liu, ET Hébert, D Kendzor… - Proceedings of the CHI …, 2024 - dl.acm.org
Creating intervention messages for smoking cessation is a labor-intensive process.
Advances in Large Language Models (LLMs) offer a promising alternative for automated …

Compositional Kronecker Context Optimization for Vision-Language Models

K Ding, X Li, Q Yu, Y Wang, H Zhang… - arXiv preprint arXiv …, 2024 - arxiv.org
Context Optimization (CoOp) has emerged as a simple yet effective technique for adapting
CLIP-like vision-language models to downstream image recognition tasks. Nevertheless …
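
For illustration, the sketch below shows a CoOp-style learnable context in PyTorch, plus a Kronecker-factored variant as one plausible reading of the "Kronecker context" in the title; the factorization, shapes, and names are assumptions rather than the paper's verified formulation, and the encoder is a stand-in for CLIP's frozen text encoder.

    import torch
    import torch.nn as nn

    M, D = 16, 512                       # context length and CLIP text embedding width
    # Plain CoOp: one free vector per context position (M * D parameters).
    ctx_full = nn.Parameter(torch.randn(M, D) * 0.02)

    # Kronecker factorization (assumed): build the (M, D) context from two small
    # factors, dropping the learnable parameters from M*D to m1*d1 + m2*d2.
    m1, d1, m2, d2 = 4, 32, 4, 16        # m1*m2 == M, d1*d2 == D
    A = nn.Parameter(torch.randn(m1, d1) * 0.02)
    B = nn.Parameter(torch.randn(m2, d2) * 0.02)
    ctx_kron = torch.kron(A, B)          # shape (m1*m2, d1*d2) == (M, D)

    def class_prompt(ctx, class_token_embs):
        """Prepend the learned context to a class name's token embeddings."""
        return torch.cat([ctx, class_token_embs], dim=0)

    # Stand-in for the frozen CLIP text encoder: pools the prompt to one vector.
    text_encoder = lambda prompt: prompt.mean(dim=0)

    class_embs = torch.randn(5, D)                   # fake embeddings for 5 class names
    text_feats = torch.stack([text_encoder(class_prompt(ctx_kron, e.unsqueeze(0)))
                              for e in class_embs])  # (5, D)
    image_feat = torch.randn(D)                      # fake CLIP image feature
    logits = torch.nn.functional.cosine_similarity(image_feat.unsqueeze(0), text_feats)

Only the context factors are trained; classification scores come from cosine similarity between the image feature and the per-class text features, as in CoOp.
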

An Automatic Prompt Generation System for Tabular Data Tasks

A Akella, A Manatkar, B Chavda, H Patel - arXiv preprint arXiv:2405.05618, 2024 - arxiv.org
Efficient processing of tabular data is important in various industries, especially when
working with datasets containing a large number of columns. Large language models …
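
As a generic illustration of prompt generation over tabular data, and not the system described in this paper, the sketch below serializes one table row into a question-style prompt for an LLM; the template, function name, and example data are all hypothetical.

    from typing import Any

    def row_to_prompt(task: str, header: list[str], row: list[Any],
                      target_column: str) -> str:
        """Serialize one table row into a natural-language prompt for an LLM."""
        facts = "; ".join(f"{col} is {val}" for col, val in zip(header, row)
                          if col != target_column)
        return (f"Task: {task}\n"
                f"Record: {facts}.\n"
                f"Question: what is the value of '{target_column}' for this record?\n"
                f"Answer:")

    header = ["age", "occupation", "hours_per_week", "income"]
    row = [39, "engineer", 40, None]
    print(row_to_prompt("Predict the income bracket of a person.",
                        header, row, target_column="income"))

With many columns, the choice of which columns to serialize and in what order becomes the core design problem an automatic prompt generation system has to solve.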