Language modeling via stochastic processes
Modern language models can generate high-quality short texts. However, they often
meander or are incoherent when generating longer texts. These issues arise from the next …
Contrastive estimation reveals topic posterior information to linear models
Contrastive learning is an approach to representation learning that utilizes naturally
occurring similar and dissimilar pairs of data points to find useful embeddings of data. In the …
Leveraging time irreversibility with order-contrastive pre-training
Label-scarce, high-dimensional domains such as healthcare present a challenge for
modern machine learning techniques. To overcome the difficulties posed by a lack of …
Towards scalable structured data from clinical text
M Agrawal - 2023 - dspace.mit.edu
The adoption of electronic health records (EHRs) presents an incredible opportunity to
improve medicine both at the point-of-care and through retrospective research …
On the Sequence Evaluation based on Stochastic Processes
Modeling and analyzing long sequences of text is an essential task for Natural Language
Processing. Success in capturing long text dynamics using neural language models will …
TRAPL: Transformer-Based Patch Learning for Enhancing Semantic Representations Using Aggregated Features to Estimate Patch-Class Distribution
We introduce TRAPL, a Transformer-based Patch Learning technique that enhances
semantic representations in segmentation models. TRAPL leverages aggregated features …