Recent advances in natural language processing via large pre-trained language models: A survey

B Min, H Ross, E Sulem, APB Veyseh… - ACM Computing …, 2023 - dl.acm.org
Large, pre-trained language models (PLMs) such as BERT and GPT have drastically
changed the Natural Language Processing (NLP) field. For numerous NLP tasks …

Semantic probabilistic layers for neuro-symbolic learning

K Ahmed, S Teso, KW Chang… - Advances in …, 2022 - proceedings.neurips.cc
We design a predictive layer for structured-output prediction (SOP) that can be plugged into
any neural network, guaranteeing its predictions are consistent with a set of predefined …

Grammar-constrained decoding for structured NLP tasks without finetuning

S Geng, M Josifoski, M Peyrard, R West - arXiv preprint arXiv:2305.13971, 2023 - arxiv.org
Despite their impressive performance, large language models (LMs) still struggle with
reliably generating complex output structures when not finetuned to follow the required …

Self-attention networks can process bounded hierarchical languages

S Yao, B Peng, C Papadimitriou… - arXiv preprint arXiv …, 2021 - arxiv.org
Despite their impressive performance in NLP, self-attention networks were recently shown
to be limited for processing formal languages with hierarchical structure, such as $\mathsf …

BiSyn-GAT+: Bi-syntax aware graph attention network for aspect-based sentiment analysis

S Liang, W Wei, XL Mao, F Wang, Z He - arXiv preprint arXiv:2204.03117, 2022 - arxiv.org
Aspect-based sentiment analysis (ABSA) is a fine-grained sentiment analysis task that aims
to align aspects and corresponding sentiments for aspect-specific sentiment polarity …

Pushing on text readability assessment: A transformer meets handcrafted linguistic features

BW Lee, YS Jang, JHJ Lee - arXiv preprint arXiv:2109.12258, 2021 - arxiv.org
We report two essential improvements in readability assessment: 1. three novel features in
advanced semantics and 2. the timely evidence that traditional ML models (e.g., Random …

Planarized sentence representation for nested named entity recognition

R Geng, Y Chen, R Huang, Y Qin, Q Zheng - Information processing & …, 2023 - Elsevier
One strategy to recognize nested entities is to enumerate overlapped entity spans for
classification. However, current models independently verify every entity span, which …

Fairness-aware structured pruning in transformers

A Zayed, G Mordido, S Shabanian, I Baldini… - Proceedings of the …, 2024 - ojs.aaai.org
The increasing size of large language models (LLMs) has introduced challenges in their
training and inference. Removing model components is perceived as a solution to tackle the …

Nested named entity recognition as latent lexicalized constituency parsing

C Lou, S Yang, K Tu - arXiv preprint arXiv:2203.04665, 2022 - arxiv.org
Nested named entity recognition (NER) has been receiving increasing attention.
Recently, Fu et al. (2021) adapt a span-based constituency parser to tackle nested NER …

Bottom-up constituency parsing and nested named entity recognition with pointer networks

S Yang, K Tu - arXiv preprint arXiv:2110.05419, 2021 - arxiv.org
Constituency parsing and nested named entity recognition (NER) are similar tasks since
they both aim to predict a collection of nested and non-crossing spans. In this work, we cast …