Interpreting deep learning models in natural language processing: A review

X Sun, D Yang, X Li, T Zhang, Y Meng, H Qiu… - arXiv preprint arXiv…, 2021 - arxiv.org
Neural network models have achieved state-of-the-art performance in a wide range of
natural language processing (NLP) tasks. However, a long-standing criticism against neural …

Grammar prompting for domain-specific language generation with large language models

B Wang, Z Wang, X Wang, Y Cao… - Advances in Neural …, 2023 - proceedings.neurips.cc
Large language models (LLMs) can learn to perform a wide range of natural language tasks
from just a handful of in-context examples. However, for generating strings from highly …

Compositionality decomposed: How do neural networks generalise?

D Hupkes, V Dankers, M Mul, E Bruni - Journal of Artificial Intelligence …, 2020 - jair.org
Despite a multitude of empirical studies, little consensus exists on whether neural networks
are able to generalise compositionally, a controversy that, in part, stems from a lack of …

Discrete opinion tree induction for aspect-based sentiment analysis

C Chen, Z Teng, Z Wang, Y Zhang - … of the 60th Annual Meeting of …, 2022 - aclanthology.org
Dependency trees have been intensively used with graph neural networks for aspect-based
sentiment classification. Though effective, such methods rely on external dependency …

Tree transformer: Integrating tree structures into self-attention

YS Wang, HY Lee, YN Chen - arXiv preprint arXiv:1909.06639, 2019 - arxiv.org
Pre-training Transformers on large-scale raw text and fine-tuning on the desired task have
achieved state-of-the-art results on diverse NLP tasks. However, it is unclear what the …

Compound probabilistic context-free grammars for grammar induction

Y Kim, C Dyer, AM Rush - arXiv preprint arXiv:1906.10225, 2019 - arxiv.org
We study a formalization of the grammar induction problem that models sentences as being
generated by a compound probabilistic context-free grammar. In contrast to traditional …

Uncertainty in natural language generation: From theory to applications

J Baan, N Daheim, E Ilia, D Ulmer, HS Li… - arXiv preprint arXiv…, 2023 - arxiv.org
Recent advances in powerful language models have allowed Natural Language
Generation (NLG) to emerge as an important technology that can not only perform traditional …

Zero-shot 3D drug design by sketching and generating

S Long, Y Zhou, X Dai, H Zhou - Advances in Neural …, 2022 - proceedings.neurips.cc
Drug design is a crucial step in the drug discovery cycle. Recently, various deep learning-
based methods have designed drugs by generating novel molecules from scratch, avoiding …

Are pre-trained language models aware of phrases? Simple but strong baselines for grammar induction

T Kim, J Choi, D Edmiston, S Lee - arXiv preprint arXiv:2002.00737, 2020 - arxiv.org
With the recent success and popularity of pre-trained language models (LMs) in natural
language processing, there has been a rise in efforts to understand their inner workings. In …

Assessing phrasal representation and composition in transformers

L Yu, A Ettinger - arXiv preprint arXiv:2010.03763, 2020 - arxiv.org
Deep transformer models have pushed performance on NLP tasks to new limits, suggesting
sophisticated treatment of complex linguistic inputs, such as phrases. However, we have …