Characterizing the impacts of semi-supervised learning for weak supervision

J Li, J Zhang, L Schmidt… - Advances in Neural …, 2023 - proceedings.neurips.cc
Labeling training data is a critical and expensive step in producing high accuracy ML
models, whether training from scratch or fine-tuning. To make labeling more efficient, two …

Shoring up the foundations: Fusing model embeddings and weak supervision

MF Chen, DY Fu, D Adila, M Zhang… - Uncertainty in …, 2022 - proceedings.mlr.press
Foundation models offer an exciting new paradigm for constructing models with out-of-the-box embeddings and a few labeled examples. However, it is not clear how to best apply …

Classifying unstructured clinical notes via automatic weak supervision

C Gao, M Goswami, J Chen… - Machine Learning for …, 2022 - proceedings.mlr.press
Healthcare providers usually record detailed notes of the clinical care delivered to each
patient for clinical, research, and billing purposes. Due to the unstructured nature of these …

Train and you'll miss it: Interactive model iteration with weak supervision and pre-trained embeddings

MF Chen, DY Fu, F Sala, S Wu, RT Mullapudi… - arXiv preprint arXiv …, 2020 - arxiv.org
Our goal is to enable machine learning systems to be trained interactively. This requires
models that perform well and train quickly, without large amounts of hand-labeled data. We …

Learning with Diverse Forms of Imperfect and Indirect Supervision

B Boecking - 2023 - kilthub.cmu.edu
Powerful Machine Learning (ML) models trained on large, annotated datasets have
driven impressive advances in fields including natural language processing and computer …

Shoring Up the Foundations: Fusing Model Embeddings and Weak Supervision (Supplementary material)

MF Chen, DY Fu, D Adila, M Zhang, F Sala… - proceedings.mlr.press
Weak supervision is a broad set of techniques using weak sources of signal to supervise
models, such as distant supervision [Takamatsu et al., 2012], co-training methods [Blum and …