Sequence-to-sequence learning as beam-search optimization
Sequence-to-Sequence (seq2seq) modeling has rapidly become an important general-
purpose NLP tool that has proven effective for many text-generation and sequence-labeling …
Zero-shot 3D drug design by sketching and generating
Drug design is a crucial step in the drug discovery cycle. Recently, various deep learning-
based methods design drugs by generating novel molecules from scratch, avoiding …
A graph-based framework for structured prediction tasks in Sanskrit
We propose a framework using energy-based models for multiple structured prediction tasks
in Sanskrit. Ours is an arc-factored model, similar to the graph-based parsing approaches …
Word ordering without syntax
Recent work on word ordering has argued that syntactic structure is important, or even
required, for effectively recovering the order of a sentence. We find that, in fact, an n-gram …
The state of the art text summarization techniques
MM Saiyyad, NN Patil - … Conference on Computing in Engineering & …, 2022 - Springer
With the advent of communication technology, a tremendous amount of data is generated.
The availability of a vast amount of data provides information and presents the challenge of …
Studying word order through iterative shuffling
As neural language models approach human performance on NLP benchmark tasks, their
advances are widely seen as evidence of an increasingly complex understanding of syntax …
Learning to organize a bag of words into sentences with neural networks: An empirical study
Sequential information, i.e., word order, is assumed to be essential for processing a sequence
with recurrent neural network or convolutional neural network based encoders. However, is …
A comparison of neural models for word ordering
We compare several language models for the word-ordering task and propose a new bag-to-
sequence neural model based on attention-based sequence-to-sequence models. We …
Improved dependency parsing using implicit word connections learned from unlabeled data
Pre-trained word embeddings and language models have been shown to be useful in many tasks.
However, neither of them can directly capture word connections in a sentence, which is …
Abstractive multi-document summarization by partial tree extraction, recombination and linearization
Existing work on abstractive multi-document summarization utilises phrase structures
directly extracted from input documents to generate summary sentences. These methods …