Language models in the loop: Incorporating prompting into weak supervision

R Smith, JA Fries, B Hancock, SH Bach - ACM/IMS Journal of Data …, 2024 - dl.acm.org
We propose a new strategy for applying large pre-trained language models to novel tasks
when labeled training data is limited. Rather than apply the model in a typical zero-shot or …
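
A minimal sketch of the general idea behind using prompts as weak supervision sources, assuming a hypothetical query_llm helper for the model call; this illustrates the setting, not the paper's pipeline.

    ABSTAIN, NEGATIVE, POSITIVE = -1, 0, 1

    def query_llm(prompt: str) -> str:
        """Placeholder for a call to a large pre-trained language model (hypothetical)."""
        raise NotImplementedError

    def lf_positive_sentiment(example: str) -> int:
        """Labeling function: map a yes/no prompt answer to a vote, or abstain."""
        answer = query_llm(
            "Does the following review express a positive sentiment? "
            "Answer yes or no.\n\nReview: " + example + "\nAnswer:"
        ).strip().lower()
        if answer.startswith("yes"):
            return POSITIVE
        if answer.startswith("no"):
            return NEGATIVE
        return ABSTAIN  # unclear answer: abstain rather than guess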

Trusted source alignment in large language models

V Bashlovkina, Z Kuang, R Matthews, E Clifford… - arXiv preprint arXiv …, 2023 - arxiv.org
Large language models (LLMs) are trained on web-scale corpora that inevitably include
contradictory factual information from sources of varying reliability. In this paper, we propose …

Whisper Turns Stronger: Augmenting Wav2Vec 2.0 for Superior ASR in Low-Resource Languages

OH Anidjar, R Marbel, R Yozevitch - arXiv preprint arXiv:2501.00425, 2024 - arxiv.org
Approaching Speech-to-Text and Automatic Speech Recognition problems in low-resource
languages is notoriously challenging due to the scarcity of validated datasets and the …

Leveraging large language models for structure learning in prompted weak supervision

J Su, P Yu, J Zhang, SH Bach - 2023 IEEE International …, 2023 - ieeexplore.ieee.org
Prompted weak supervision (PromptedWS) applies pre-trained large language models
(LLMs) as the basis for labeling functions (LFs) in a weak supervision framework to obtain …
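
The structure-learning angle can be pictured with a toy agreement matrix over labeling-function votes: strongly correlated LFs are candidates for the dependencies a label model should account for. A sketch assuming a NumPy vote matrix with -1 for abstentions; not the paper's algorithm.

    import numpy as np

    ABSTAIN = -1

    def lf_agreement(L):
        """L: (n_examples, n_lfs) matrix of LF votes, with ABSTAIN = -1.
        Returns pairwise agreement rates on the examples where both LFs vote."""
        n_lfs = L.shape[1]
        A = np.eye(n_lfs)
        for i in range(n_lfs):
            for j in range(i + 1, n_lfs):
                both = (L[:, i] != ABSTAIN) & (L[:, j] != ABSTAIN)
                if both.any():
                    A[i, j] = A[j, i] = (L[both, i] == L[both, j]).mean()
        return A

    L = np.array([[1, 1, -1],
                  [0, 0, 0],
                  [1, -1, 1],
                  [0, 1, 0]])
    print(lf_agreement(L))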

Empirical Analysis for Unsupervised Universal Dependency Parse Tree Aggregation

A Kulkarni, O Eulenstein, Q Li - arXiv preprint arXiv:2403.19183, 2024 - arxiv.org
Dependency parsing is an essential task in NLP, and the quality of dependency parsers is
crucial for many downstream tasks. Parsers' quality often varies depending on the domain …
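
One baseline such aggregation methods are compared against is a per-token majority vote over the head indices predicted by several parsers; the toy sketch below shows only that baseline and ignores the requirement that the result form a valid tree.

    from collections import Counter

    def majority_heads(parses):
        """parses[k][i] is the head index parser k assigns to token i.
        Ties are broken arbitrarily; the output may not be a well-formed tree."""
        n_tokens = len(parses[0])
        return [Counter(p[i] for p in parses).most_common(1)[0][0]
                for i in range(n_tokens)]

    # Three parsers, four tokens (head index 0 denotes the root).
    parses = [[2, 0, 2, 3],
              [2, 0, 2, 2],
              [3, 0, 2, 3]]
    print(majority_heads(parses))  # [2, 0, 2, 3]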

Cross-task Knowledge Transfer for Extremely Weakly Supervised Text Classification

S Park, K Kim, J Lee - Findings of the Association for …, 2023 - aclanthology.org
Text classification with extremely weak supervision (EWS) imposes stricter supervision
constraints compared to regular weakly supervised classification. Absolutely no labeled …
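
A toy illustration of the EWS setting, in which the only supervision is the class names themselves, used here as literal keywords; actual EWS methods, including the cross-task transfer studied in this paper, go well beyond string matching.

    from typing import Optional

    CLASS_NAMES = ["sports", "politics", "technology"]

    def ews_keyword_label(text: str) -> Optional[str]:
        """Assign a pseudo-label if exactly one class name appears in the text, else abstain."""
        lower = text.lower()
        hits = [c for c in CLASS_NAMES if c in lower]
        return hits[0] if len(hits) == 1 else None

    print(ews_keyword_label("The politics of the new tax bill dominated the debate."))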

RACH-Space: Reconstructing Adaptive Convex Hull Space with Applications in Weak Supervision

W Na, A Tasissa - arXiv preprint arXiv:2307.04870, 2023 - arxiv.org
We introduce RACH-Space, an algorithm for labelling unlabelled data in weakly supervised
learning, given incomplete, noisy information about the labels. RACH-Space offers simplicity …

Learning from weak labelers as constraints

V Agrawal, R Pukdee, MF Balcan… - … Conference on Learning … - openreview.net
We study programmatic weak supervision, where, in contrast to labeled data, we have
access to weak labelers, each of which either abstains or provides noisy labels …
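
The setting can be pictured as a vote matrix with abstentions plus per-labeler error bounds that any candidate labeling must respect; the sketch below only checks such constraints against a soft labeling and is not the paper's learning algorithm.

    import numpy as np

    ABSTAIN = -1

    def satisfies_error_bounds(votes, y_soft, bounds):
        """votes: (n, m) matrix over {0, 1, ABSTAIN}; y_soft: (n,) estimates of P(y=1);
        bounds[j]: allowed expected error of labeler j on the examples it covers."""
        m = votes.shape[1]
        for j in range(m):
            covered = votes[:, j] != ABSTAIN
            if not covered.any():
                continue
            v = votes[covered, j]
            p = y_soft[covered]
            # expected disagreement between labeler j and the soft labels
            err = np.mean(v * (1 - p) + (1 - v) * p)
            if err > bounds[j]:
                return False
        return True

    votes = np.array([[1, ABSTAIN], [0, 0], [1, 1]])
    y_soft = np.array([0.9, 0.2, 0.8])
    print(satisfies_error_bounds(votes, y_soft, np.array([0.3, 0.3])))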

Learning with Constraint-Based Weak Supervision

CG Arachie - 2022 - vtechworks.lib.vt.edu
The recent adoption of machine learning models in many businesses has underscored the
need for quality training data. Typically, training supervised machine learning systems …

Rach-Space: Novel Ensemble Learning Method With Applications in Weakly Supervised Learning

W Na - 2024 - search.proquest.com
In recent years, machine learning, and deep learning models in particular, have seen significant
growth and made an impact in various real-world applications. These models bypass the need …