Sequence-to-sequence learning as beam-search optimization

S Wiseman, AM Rush - arXiv preprint arXiv:1606.02960, 2016 - arxiv.org
Sequence-to-Sequence (seq2seq) modeling has rapidly become an important general-
purpose NLP tool that has proven effective for many text-generation and sequence-labeling …

Zero-shot 3D drug design by sketching and generating

S Long, Y Zhou, X Dai, H Zhou - Advances in Neural …, 2022 - proceedings.neurips.cc
Drug design is a crucial step in the drug discovery cycle. Recently, various deep learning-
based methods design drugs by generating novel molecules from scratch, avoiding …

A graph-based framework for structured prediction tasks in Sanskrit

A Krishna, B Santra, A Gupta, P Satuluri… - Computational …, 2021 - direct.mit.edu
We propose a framework using energy-based models for multiple structured prediction tasks
in Sanskrit. Ours is an arc-factored model, similar to the graph-based parsing approaches …

Word ordering without syntax

A Schmaltz, AM Rush, SM Shieber - arXiv preprint arXiv:1604.08633, 2016 - arxiv.org
Recent work on word ordering has argued that syntactic structure is important, or even
required, for effectively recovering the order of a sentence. We find that, in fact, an n-gram …

The state of the art text summarization techniques

MM Saiyyad, NN Patil - … Conference on Computing in Engineering & …, 2022 - Springer
With the advent of communication technology, a tremendous amount of data is generated.
The availability of a vast amount of data provides information and presents the challenge of …

Studying word order through iterative shuffling

N Malkin, S Lanka, P Goel, N Jojic - arXiv preprint arXiv:2109.04867, 2021 - arxiv.org
As neural language models approach human performance on NLP benchmark tasks, their
advances are widely seen as evidence of an increasingly complex understanding of syntax …

Learning to organize a bag of words into sentences with neural networks: An empirical study

C Tao, S Gao, J Li, Y Feng, D Zhao… - Proceedings of the 2021 …, 2021 - aclanthology.org
Sequential information, i.e., word order, is assumed to be essential for processing a sequence
with recurrent neural network or convolutional neural network based encoders. However, is …

A comparison of neural models for word ordering

E Hasler, F Stahlberg, M Tomalin, A de Gispert… - arXiv preprint arXiv …, 2017 - arxiv.org
We compare several language models for the word-ordering task and propose a new bag-to-
sequence neural model based on attention-based sequence-to-sequence models. We …

Improved dependency parsing using implicit word connections learned from unlabeled data

W Wang, B Chang, M Mansur - Proceedings of the 2018 …, 2018 - aclanthology.org
Pre-trained word embeddings and language models have been shown to be useful in many tasks.
However, neither can directly capture word connections in a sentence, which is …

Abstractive multi-document summarization by partial tree extraction, recombination and linearization

LJ Kurisinkel, Y Zhang, V Varma - Proceedings of the Eighth …, 2017 - aclanthology.org
Existing work on abstractive multi-document summarization utilises phrase structures
directly extracted from input documents to generate summary sentences. These methods …